Fix https://github.com/huggingface/peft/issues/1974#issue-2437471248
#74 opened by Finger-ebic
The issue is also reported at https://github.com/THUDM/GLM-4/issues/493. THUDM/glm-4-9b-chat uses custom modeling code that is not compatible with PromptEncoder. As the traceback indicates, this line fails:
```python
batch_size, seq_length = input_ids.shape
```

raising:

```
AttributeError: 'NoneType' object has no attribute 'shape'
```
This is because the prompt encoder does not pass `input_ids`; it passes `inputs_embeds` directly. I could patch the issue by falling back to `inputs_embeds` when `input_ids` is `None`, editing the line as follows:
```python
if input_ids is not None:
    batch_size, seq_length = input_ids.shape
else:
    batch_size, seq_length, _ = inputs_embeds.shape
```
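To illustrate the fallback logic in isolation, here is a minimal, hedged sketch (not the actual GLM-4 modeling code; `get_batch_and_seq` is a hypothetical helper, and NumPy arrays stand in for tensors, which expose `.shape` the same way):

```python
import numpy as np

def get_batch_and_seq(input_ids=None, inputs_embeds=None):
    """Derive (batch_size, seq_length) from whichever input is provided.

    Mirrors the proposed patch: prompt-tuning methods such as PromptEncoder
    pass inputs_embeds (batch, seq, hidden) instead of input_ids (batch, seq),
    so the shape must be read from inputs_embeds when input_ids is None.
    """
    if input_ids is not None:
        batch_size, seq_length = input_ids.shape
    elif inputs_embeds is not None:
        batch_size, seq_length, _ = inputs_embeds.shape
    else:
        raise ValueError("Either input_ids or inputs_embeds must be provided")
    return batch_size, seq_length

# Both call paths yield the same (batch_size, seq_length):
ids = np.zeros((2, 16), dtype=np.int64)          # (batch, seq)
embeds = np.zeros((2, 16, 4096), dtype=np.float32)  # (batch, seq, hidden)
print(get_batch_and_seq(input_ids=ids))          # → (2, 16)
print(get_batch_and_seq(inputs_embeds=embeds))   # → (2, 16)
```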
To add context on the scenario: this problem occurs when using PEFT for p-tuning fine-tuning.