Only BF16 works. FP16 and INT8 generate nonsense for me currently.
#9 · by JulesGM · opened
JulesGM changed discussion title from "FP16 and INT8 generate nonsense for me currently" to "Only BF16 works. FP16 and INT8 generate nonsense for me currently."
Same issue here. BF16 seems to be working.
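For reference, a minimal sketch of how one might compare the three dtypes on the same prompt. MODEL_ID is a hypothetical placeholder for the checkpoint this thread is about (it isn't named here); the int8 path assumes bitsandbytes and accelerate are installed and a CUDA GPU is available:

```python
# Minimal sketch for comparing dtypes on the same prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "org/model"  # hypothetical placeholder, not from this thread
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
inputs = tokenizer("The capital of France is", return_tensors="pt")

def generate_with(**load_kwargs):
    # Reload the model with the given precision settings and generate greedily.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto", **load_kwargs)
    out = model.generate(**inputs.to(model.device), max_new_tokens=20)
    return tokenizer.decode(out[0], skip_special_tokens=True)

print("bf16:", generate_with(torch_dtype=torch.bfloat16))  # reported working in this thread
print("fp16:", generate_with(torch_dtype=torch.float16))   # reported broken before the fix
print("int8:", generate_with(load_in_8bit=True))           # reported broken before the fix
```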
I think it's because there's a bug in the config with the activation function. See #11.
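If the shipped config does point at the wrong activation function, one workaround before a fix lands is to override it at load time. A minimal sketch under stated assumptions: MODEL_ID is the same hypothetical placeholder as above, and "gelu" is an assumed value; check #11 for the attribute name and value that actually apply (some architectures call it `hidden_act`, others `activation_function`):

```python
# Minimal sketch of overriding the activation function in the config at load time.
from transformers import AutoConfig, AutoModelForCausalLM

MODEL_ID = "org/model"  # hypothetical placeholder

config = AutoConfig.from_pretrained(MODEL_ID)
config.hidden_act = "gelu"  # assumed attribute and value; not confirmed in this thread
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, config=config)
```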
That's cool. Thanks @michaelroyzen, I'll test it.
No, it still doesn't work with float16.
@michaelroyzen
We are investigating the issue in https://github.com/huggingface/transformers/issues/20287.
Feel free to continue the thread there, as you'll probably have the most up-to-date information about this issue.
This is now fixed; please use the latest version of transformers:

```
pip install git+https://github.com/huggingface/transformers.git@main
```
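To verify the fix, a minimal check under the same assumptions as the earlier sketches (MODEL_ID is still a hypothetical placeholder): with transformers installed from main, fp16 generation should produce coherent text again.

```python
# Minimal sketch to verify that fp16 generation works after the fix.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "org/model"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.float16, device_map="auto")
inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0], skip_special_tokens=True))
```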
ybelkada changed discussion status to closed