Model hallucinates
#23
by
hamidtech
- opened
Hi there, thank you for reporting this issue.
There are several possible causes:
- the default system prompt is 'You are a helpful assistant', which leaves the model's identity undefined, so it may produce an arbitrary but plausible-sounding answer about itself
- our post-training data is mostly in English and Chinese, so quality may degrade in Persian
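As a workaround for the first point, you can override the default system prompt with one that explicitly defines the model's identity. A minimal sketch using the OpenAI-compatible chat-message format (the identity string and helper name here are just placeholders, not part of any official API):

```python
def build_messages(user_prompt: str) -> list[dict]:
    """Build a chat request that replaces the default
    'You are a helpful assistant' system prompt with one
    that pins down the model's identity (placeholder text)."""
    system_prompt = (
        "You are an AI assistant developed by ExampleLab. "  # hypothetical identity
        "If asked who you are, state this identity. "
        "If you are unsure of an answer, say so instead of guessing."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("Who are you?")
```

The resulting `messages` list can be passed to any chat-completion endpoint; with the identity fixed in the system prompt, the model is less likely to improvise an answer about who it is.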