Where is config.json?
Chop chop
Need to wait for someone to HF-ify it, though I wonder if the weights are already proper (though weirdly named)?
Currently doing it, will be available very soon.
Please create a new repo for it. Don't make me download 123B twice.
@cyrilvallez ^ PLEASE.
Behemoth 2?
Oh wow, new finetune I can't run
Please create a new repo for it. Don't make me download 123B twice.
I'm guessing your download is done already, but for anyone else reading this: when using the HuggingFace Python API you can exclude the original weights from your download by passing `ignore_patterns='consolidated*'` to your call to `snapshot_download`.
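For anyone scripting it, here's a minimal sketch; the repo id below is a placeholder, swap in the actual model repo you're downloading.

```python
# Minimal sketch: download the repo while skipping the original-format
# `consolidated*` shards, so only the HF-format weights are fetched.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="your-org/your-123b-model",   # placeholder, replace with the real repo id
    ignore_patterns=["consolidated*"],    # skip the original (non-HF) weight files
)
print(local_dir)
```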
@MikeRoz No, it's not that simple. That doesn't work all the time. Even if you download the model beforehand, running something like Mergekit or Axolotl will re-download it without the ignore pattern. Downloading the entire repo is unavoidable in most cases.
Will be closing this discussion since it seems everything is good now!