Issue Running BottleneckT5LMWithPerturb: Unexpected Keyword Argument 'cache_position'

#3 opened by Gal-Lahat

Hi,

I’ve been trying to use BottleneckT5LMWithPerturb, which as far as I can tell is the only implementation of this model available online. However, when I run it either locally or on Google Colab using the provided notebook, I keep getting the following error:

```
TypeError: BottleneckT5LMWithPerturb.forward() got an unexpected keyword argument 'cache_position'
```

I tried manually providing this argument to fix the issue, but then I started getting other errors related to the attention mask.
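For context, this is roughly the kind of patch I also experimented with: intercepting keyword arguments that the custom forward() doesn't accept before they reach it. This is only a sketch based on my guess that generate() is what injects cache_position; the helper name below is mine, not anything from the repo.

```python
# Minimal sketch (my own helper, not part of the repo): wrap an existing
# model's forward() so that keyword arguments it cannot accept, such as
# `cache_position`, are dropped before the original forward() runs.
def drop_unsupported_kwargs(model, unsupported=("cache_position",)):
    original_forward = model.forward

    def forward(*args, **kwargs):
        for key in unsupported:
            kwargs.pop(key, None)  # silently discard the unsupported kwarg
        return original_forward(*args, **kwargs)

    model.forward = forward
    return model
```

Dropping the kwarg only avoids the TypeError; it doesn't address whatever the calling code expected to do with cache_position, so I suspect it just masks a deeper incompatibility.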

I’m not sure if I’m using the wrong version of the model or dependencies, or if there’s something else I’m missing. Has anyone encountered this before, or does anyone know how to fix it? Any help would be greatly appreciated!
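In case it's relevant, this is how I'm checking my environment. My working guess (not confirmed) is that a newer transformers release started passing cache_position from generate() into the model's forward(), which the older custom forward() signature doesn't expect:

```python
import transformers

# Print the installed version to compare against whatever release the
# notebook was originally written for.
print(transformers.__version__)

# Guess (not confirmed): if `cache_position` was introduced by a newer
# transformers release, pinning an older one might avoid the error, e.g.:
#   pip install "transformers<4.38"
```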

I have the same issue
