Model responses missing think tags when using PromptFormat_exaone

#1
by xldistance - opened

I added the think tag and it doesn't work.

Are you adding `-think` on the command line? That's what actually triggers prefixing the response with think tags; the format class just defines what the tags are, overriding the default `("<think>", "</think>")` when needed. I.e.:

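Something along these lines (a rough sketch only: the model path is a placeholder and the other flag names are assumptions about the chat example's usual options; `-think` is the part that matters):

```
# -think is the flag under discussion; model path and mode name are placeholders
python examples/chat.py -m /path/to/exaone-model -mode exaone -think
```

With `-think` set, the chat example prefixes the bot's reply with whatever opening tag `PromptFormat_exaone` defines (the default being `<think>` unless the class overrides it), so the model continues straight into its reasoning block.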

xldistance changed discussion status to closed
