Problem with sampling strategy settings

#1
by thuzhizhi - opened

The generation_config file says:

{
  "do_sample": true,
  "max_new_tokens": 4096,
  "temperature": 0.0,
  "transformers_version": "4.41.0"
}

However, I ran into an error message that says:

ValueError: `temperature` (=0.0) has to be a strictly positive float, otherwise your next token scores will be invalid. If you're looking for greedy decoding strategies, set `do_sample=False`.

Is the default setting wrong?
Thank you for your help!

PowerInfer org

Thanks for your feedback. You can set `do_sample` to `false` to solve this problem.
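As a minimal sketch of that fix: with `do_sample` set to `false`, transformers uses greedy decoding and ignores `temperature`, so the `0.0` value no longer trips the validation check. This patches the quoted config in memory; writing it back to `generation_config.json` is left to you.

```python
import json

# The generation_config shipped with the model (quoted above).
cfg = json.loads("""{
  "do_sample": true,
  "max_new_tokens": 4096,
  "temperature": 0.0,
  "transformers_version": "4.41.0"
}""")

# Switch to greedy decoding; `temperature` is then ignored, so the
# ValueError about a non-positive temperature no longer fires.
cfg["do_sample"] = False
cfg.pop("temperature", None)  # optional: drop the now-unused field

print(json.dumps(cfg, indent=2))
```

Alternatively, you can override the setting per call, e.g. `model.generate(..., do_sample=False)`, without editing the file.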
