Issue for flash attention 2.0 support
#20
opened by xiongsy
This model uses custom code, so you should not use it the same way as other Transformers models. In addition, this model will not be updated. Please use Qwen2 instead.
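For reference, Qwen2 is natively supported by recent versions of `transformers`, where FlashAttention-2 can be requested through the `attn_implementation` argument of `from_pretrained` rather than through custom model code. The sketch below only assembles the loading keyword arguments; the model id `"Qwen/Qwen2-7B"` and the helper name are illustrative, and actually loading the model requires a GPU plus the `flash-attn` package.

```python
def fa2_load_kwargs(dtype: str = "bfloat16") -> dict:
    """Keyword arguments that enable FlashAttention-2 in
    transformers' from_pretrained (transformers >= 4.37)."""
    return {
        "attn_implementation": "flash_attention_2",  # standard transformers option
        "torch_dtype": dtype,                        # FA2 needs fp16/bf16 weights
    }

# Usage (needs GPU, flash-attn installed, and network access; not run here):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-7B", **fa2_load_kwargs())
```

Because Qwen2 is an in-library architecture, no `trust_remote_code=True` is needed, which is part of why the maintainers point users away from this custom-code model.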
jklj077 changed discussion status to closed