---
license: mit
library_name: mlx
pipeline_tag: text-generation
language:
  - en
  - zh
tags:
  - code
  - math
  - mlx
arxiv: 2412.17743
base_model: yulan-team/YuLan-Mini-Instruct
model-index:
  - name: YuLan-Mini-Instruct
    results:
      - task:
          type: text-generation
        dataset:
          name: HumanEval
          type: openai_humaneval
        metrics:
          - type: pass@10
            value: 0.866
            name: pass@10
            verified: false
      - task:
          type: text-generation
        dataset:
          name: MBPP
          type: mbpp
        metrics:
          - type: pass@10
            value: 0.857
            name: pass@10
            verified: false
      - task:
          type: text-generation
        dataset:
          name: MATH
          type: math
        metrics:
          - type: maj@1
            value: 0.552
            name: maj@1
            verified: false
      - task:
          type: text-generation
        dataset:
          name: GSM8K
          type: gsm8k
        metrics:
          - type: maj@1
            value: 0.717
            name: maj@1
            verified: false
---

# IvanHU/YuLan-Mini-Instruct-8bit

This model [IvanHU/YuLan-Mini-Instruct-8bit](https://huggingface.co/IvanHU/YuLan-Mini-Instruct-8bit) was converted to MLX format from [yulan-team/YuLan-Mini-Instruct](https://huggingface.co/yulan-team/YuLan-Mini-Instruct) using mlx-lm version **0.22.2**.
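
For reference, a conversion like this one can be reproduced with mlx-lm's `convert` helper. The sketch below is illustrative only: the exact invocation used for this repository is not recorded in the card, and the output directory name is an assumption.

```python
# Minimal sketch of an 8-bit MLX conversion (exact command for this repo not
# recorded; the output path is assumed).
from mlx_lm import convert

convert(
    "yulan-team/YuLan-Mini-Instruct",     # source Hugging Face repository
    mlx_path="YuLan-Mini-Instruct-8bit",  # local output directory (assumed)
    quantize=True,                        # quantize the weights
    q_bits=8,                             # 8 bits per weight, matching this repo
)
```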

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("IvanHU/YuLan-Mini-Instruct-8bit")

prompt = "hello"

# Apply the model's chat template if one is defined in the tokenizer config.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
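
`generate` returns the completion as a string; with `verbose=True` it also prints the generated text and throughput statistics as it runs. mlx-lm additionally installs a command-line entry point, so the model can be tried without writing any Python, e.g. `mlx_lm.generate --model IvanHU/YuLan-Mini-Instruct-8bit --prompt "hello"`.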