Shouldn't CodeLlama 34B have 16K context and rope_theta 1M?
#3
by TheBloke - opened
No description provided.
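For context, the change suggested by the title would touch the model's `config.json`. A minimal sketch of the relevant fields, assuming a standard Llama-architecture config and the values named in the title (16K context, rope_theta of 1M):

```json
{
  "max_position_embeddings": 16384,
  "rope_theta": 1000000.0
}
```

These match the settings Meta shipped for the CodeLlama family, where the RoPE base frequency was raised from Llama 2's 10000 to 1000000 to support the longer 16K training context.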
ehartford changed pull request status to merged