Lachlan Cahill (lcahill)
AI & ML interests: None yet
Organizations: None yet
How to use it in PyCharm for auto completion? · 👍 3 · 3 · #6 opened 2 months ago by DrNicefellow
Max output tokens for Llama 3.1 · 8 · #6 opened 12 months ago by abhirup-sainapse
Training this model · ❤️ 3 · 4 · #33 opened over 1 year ago by ottogutierrez
Why 12b? Who could run that locally? · 😔👍 11 · 47 · #1 opened 11 months ago by kaidu88
GPU requirements · 7 · #32 opened about 1 year ago by jmoneydw
Align tokenizer with mistral-common · 3 · #39 opened about 1 year ago by Rocketknight1
feat/tools-in-chat-template · ❤️ 1 · 6 · #21 opened about 1 year ago by lcahill
You are truly a godsend! · #16 opened about 1 year ago by lcahill
thanks and question function-calling. · 10 · #17 opened about 1 year ago by NickyNicky
Not generating [TOOL_CALLS] · 3 · #20 opened about 1 year ago by ShukantP
Examples on usage · 4 · #7 opened over 1 year ago by dgallitelli
Can you add chat template to the `tokenizer_config.json` file? · 5 · #3 opened over 1 year ago by hdnh2006
add chat template jinja in tokenizer.json · 4 · #4 opened over 1 year ago by Jaykumaran17
Added Chat Template · 1 · #5 opened over 1 year ago by lcahill
Adding `safetensors` variant of this model · 👍 1 · 2 · #10 opened over 1 year ago by SFconvertbot
Adding `safetensors` variant of this model · #94 opened over 1 year ago by lcahill
Adding `safetensors` variant of this model · ❤️ 4 · 4 · #42 opened almost 2 years ago by nth-attempt