Training procedure

The following bitsandbytes quantization config was used during training:

  • load_in_8bit: False
  • load_in_4bit: True
  • llm_int8_threshold: 6.0
  • llm_int8_skip_modules: None
  • llm_int8_enable_fp32_cpu_offload: False
  • llm_int8_has_fp16_weight: False
  • bnb_4bit_quant_type: nf4
  • bnb_4bit_use_double_quant: True
  • bnb_4bit_compute_dtype: bfloat16
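
For reference, a minimal sketch of the same settings expressed with the transformers BitsAndBytesConfig class; the values simply mirror the list above (the llm_int8_* fields are the library defaults and are not used in 4-bit mode):

    import torch
    from transformers import BitsAndBytesConfig

    bnb_config = BitsAndBytesConfig(
        load_in_8bit=False,
        load_in_4bit=True,
        llm_int8_threshold=6.0,
        llm_int8_skip_modules=None,
        llm_int8_enable_fp32_cpu_offload=False,
        llm_int8_has_fp16_weight=False,
        bnb_4bit_quant_type="nf4",               # NormalFloat4 quantization
        bnb_4bit_use_double_quant=True,          # also quantize the quantization constants
        bnb_4bit_compute_dtype=torch.bfloat16,   # compute (matmuls) in bfloat16
    )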

Framework versions

  • PEFT 0.4.0
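
A minimal sketch of attaching this adapter to a 4-bit-quantized base model with PEFT 0.4.0, assuming the standard PeftModel.from_pretrained API; "base-model-id" and "adapter-repo-id" are placeholders, since the card does not name the base model:

    import torch
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    # Same quantization settings as listed under Training procedure.
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_use_double_quant=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    )

    # Placeholder ids: replace with the actual base model and this adapter repository.
    base_id = "base-model-id"
    adapter_id = "adapter-repo-id"

    base_model = AutoModelForCausalLM.from_pretrained(
        base_id, quantization_config=bnb_config, device_map="auto"
    )
    tokenizer = AutoTokenizer.from_pretrained(base_id)

    # Attach the LoRA adapter weights to the quantized base model.
    model = PeftModel.from_pretrained(base_model, adapter_id)
    model.eval()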

Prompt

The following prompt template was used for each training example, with the Description and Answer fields taken from the data point:

    prompt = f""" You are going to determine whether [{data_point["Description"]}] includes the business model. Don't use any prior knowledge, only base your answer off of what's given. It might not be explicitly stated but infer whether the class is B2C, B2B, B2G, or No business model. Respond in sentence form with the class and reasoning -> : {data_point['Answer']} """
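
A small sketch of how the template above might be filled from a single record; build_prompt and the example dictionary are illustrative and not part of the original training code:

    def build_prompt(data_point: dict) -> str:
        """Fill the training prompt template (reproduced verbatim from above) with one record."""
        return f""" You are going to determine whether [{data_point["Description"]}] includes the business model. Don't use any prior knowledge, only base your answer off of what's given. It might not be explicitly stated but infer whether the class is B2C, B2B, B2G, or No business model. Respond in sentence form with the class and reasoning -> : {data_point['Answer']} """

    # Invented example record, only to show what a filled-in prompt looks like.
    example = {
        "Description": "An online storefront selling handmade goods directly to individual shoppers.",
        "Answer": "B2C, because the company sells directly to individual consumers.",
    }
    print(build_prompt(example))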
