
Solar-10.7B-SLERP-EXL2

Model: Solar-10.7B-SLERP
Made by: jan-hq

Based on original model: SOLAR-10.7B-Instruct-v1.0
Created by: upstage

List of quants:

4bpw h8 (main)
4.65bpw h8
5bpw h8
6bpw h8
8bpw h8

Quantization notes

In my experience, it's one of the best models in the ~13B range for Slavic languages, and its language skills are very good overall. Unlike most models I've tried, I haven't encountered cases where it randomly switches languages mid-reply.
It's also capable of understanding instructions written in other languages, which is a pretty rare feat.

30.01.2024: Replaced the old quants with newer ones made with exllamav2-0.0.12, calibrated with the default exllamav2 dataset.
This model uses far less memory than Llama2-13B models and almost fits into 12GB of VRAM even at 8bpw;
with the 8-bit cache enabled it stays just under 12GB.

How to run

This quantization format runs on GPU and requires the ExLlamaV2 loader, which is available in the following applications:

Text Generation Webui

KoboldAI

ExUI
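For scripted use, the quants can also be loaded directly with the exllamav2 Python library. A minimal sketch, assuming a 0.0.12-era exllamav2 install and a CUDA GPU; the model directory is a placeholder for wherever you downloaded the quant:

```python
# Sketch: load an EXL2 quant and generate text with the exllamav2 library.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "Solar-10.7B-SLERP-exl2"  # placeholder: path to the downloaded quant
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # lazy cache lets autosplit place layers
model.load_autosplit(cache)                # splits the model across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

print(generator.generate_simple("Hello,", settings, 64))
```

The GUI applications above wrap essentially this same loader, so their VRAM behavior matches the notes in the previous section.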

Original model card:

Jan banner

Jan - Discord

Model Description

This model uses the SLERP merge method to combine the two best models on the OpenLLM Leaderboard as of 14 Dec:

  1. upstage/SOLAR-10.7B-Instruct-v1.0
  2. janhq/Pandora-v1-10.7B

The YAML config file for this merge:

slices:
  - sources:
      - model: upstage/SOLAR-10.7B-Instruct-v1.0
        layer_range: [0, 48]
      - model: janhq/Pandora-v1-10.7B
        layer_range: [0, 48]
merge_method: slerp
base_model: upstage/SOLAR-10.7B-Instruct-v1.0
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
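For intuition, SLERP interpolates between two weight tensors along the arc between their directions rather than along a straight line; the `t` values above set the interpolation factor per layer group (0 takes the base model's weights, 1 the other model's). A minimal numeric sketch of the idea, not mergekit's actual implementation:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation: t=0 returns v0, t=1 returns v1,
    intermediate t follows the arc between their directions.
    Falls back to plain lerp for near-parallel inputs, where slerp is unstable."""
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = float(np.clip(np.dot(v0n, v1n), -1.0, 1.0))
    if 1.0 - abs(dot) < eps:              # nearly parallel: lerp is fine
        return (1.0 - t) * v0 + t * v1
    theta = np.arccos(dot)                # angle between the two vectors
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

v0 = np.array([1.0, 0.0])
v1 = np.array([0.0, 1.0])
print(slerp(0.5, v0, v1))  # midpoint on the arc: [0.7071..., 0.7071...]
```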

Prompt template

  • ChatML
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
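In code, the template above can be assembled with a small helper (the function name is my own, not part of any library):

```python
def chatml_prompt(system_message: str, prompt: str) -> str:
    """Build a ChatML prompt matching the template above.

    The trailing '<|im_start|>assistant\n' leaves the turn open
    for the model to complete."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(chatml_prompt("You are a helpful assistant.", "Hello!"))
```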

Run this model

You can run this model using Jan Desktop on Mac, Windows, or Linux.

Jan is an open-source ChatGPT alternative that is:

  • 💻 100% offline on your machine: your conversations remain confidential and visible only to you.
  • 🗂️ An open file format: conversations and model settings stay on your computer and can be exported or deleted at any time.
  • 🌐 OpenAI compatible: a local server on port 1337 with OpenAI-compatible endpoints.
  • 🌍 Open source & free: we build in public; check out our GitHub.


About Jan

Jan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.

Jan's long-term vision is to build a cognitive framework for future robots that are practical, useful assistants for humans and businesses in everyday life.

Jan Model Merger

This is a test project for merging models.

Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

Metric               Value
-------------------  -----
Avg.                 ?
ARC (25-shot)        ?
HellaSwag (10-shot)  ?
MMLU (5-shot)        ?
TruthfulQA (0-shot)  ?
Winogrande (5-shot)  ?
GSM8K (5-shot)       ?

Acknowledgement
