Max Output Tokens of Llama-4
#46
by MengboZhou · opened
I am wondering what the maximum number of output tokens is for the Llama-4 model during inference. Also, is there a public document listing the output limits?
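For context, my understanding is that decoder-only models share one context window between the prompt and the generated completion, so the output budget is whatever the prompt leaves over. A minimal sketch of that relationship (the numbers are made up, since I could not find Llama-4's documented limits):

```python
def max_output_tokens(context_window: int, prompt_tokens: int) -> int:
    """For decoder-only models, prompt and generated tokens share the
    same context window, so the output budget is the remainder."""
    return max(context_window - prompt_tokens, 0)

# Purely illustrative numbers, NOT Llama-4's actual limits:
print(max_output_tokens(8192, 1500))  # 6692
```

Is this the right mental model for Llama-4, or is there a separate hard cap on generated tokens independent of the context window?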