Update modeling_kangaroo.py
#6, opened by Jiqing
The transformers library has deprecated get_max_length; we should use get_seq_length instead to avoid the following error:
File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1397, in chat
    outputs = self.generate(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1294, in generate
    return super().generate(inputs_embeds=encoder_input, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 2223, in generate
    result = self._sample(
File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 3204, in _sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1312, in prepare_inputs_for_generation
    if past_key_values.get_max_length() is not None
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1928, in __getattr__
    raise AttributeError(
AttributeError: 'DynamicCache' object has no attribute 'get_max_length'. Did you mean: 'get_seq_length'?
Hi @FrankJJHu @WEBing, could you please review this change? Thanks!