FinCreditPhi-3.5-mini

๋ชจ๋ธ ๊ฐœ์š”

FinCreditPhi-3.5-mini is a Korean-language model designed specifically for financial credit assessment.

๋ฒ ์ด์Šค ๋ชจ๋ธ: unsloth/Phi-3.5-mini-instruct ๋ฐ์ดํ„ฐ์…‹: himedia/financial_dummy_data_v4 ํ•™์Šต ๋ฐฉ๋ฒ•: LoRA (Low-Rank Adaptation) ํ•™์Šต ์ผ์‹œ: 20250622_131709

๐Ÿ“Š ํ•™์Šต ๊ฒฐ๊ณผ

  • Final Training Loss: 0.1521
  • Final Validation Loss: 0.1550
  • Best Validation Loss: 0.1550 (step 1000)
  • Overall Improvement: 87.0%
  • Training Time: 73.66 minutes
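As a rough consistency check, the 87.0% improvement figure implies a starting training loss of about 1.17. That initial value is inferred here, not reported in the card:

```python
# Sanity check on the reported "Overall Improvement: 87.0%".
# NOTE: initial_loss (~1.17) is an inferred assumption; the card does not report it.
initial_loss = 1.17
final_loss = 0.1521
improvement = (initial_loss - final_loss) / initial_loss * 100
print(f"{improvement:.1f}%")  # 87.0%
```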

ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ

  • Learning Rate: 0.0002
  • Max Steps: 1000
  • Batch Size: 4
  • Gradient Accumulation: 4
  • LoRA r: 32
  • LoRA alpha: 32
  • Max Sequence Length: 2048
  • Warmup Steps: 5
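Note that the repository name says bs16 while the per-device batch size above is 4; the two are consistent once gradient accumulation is factored in. A minimal sketch of the arithmetic:

```python
# Effective batch size = per-device batch size × gradient accumulation steps.
# This is why the repo name carries "bs16" while the trainer config uses 4.
per_device_batch_size = 4
gradient_accumulation_steps = 4
effective_batch_size = per_device_batch_size * gradient_accumulation_steps
print(effective_batch_size)  # 16
```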

๐Ÿ”ง ๋ฉ”๋ชจ๋ฆฌ ์‚ฌ์šฉ๋Ÿ‰

  • GPU: NVIDIA RTX A5000
  • Peak Memory: 6.381 GB
  • Memory Usage: 27.1%

์‚ฌ์šฉ ๋ฐฉ๋ฒ•

from transformers import AutoTokenizer, AutoModelForCausalLM

# ๋ชจ๋ธ๊ณผ ํ† ํฌ๋‚˜์ด์ € ๋กœ๋“œ
tokenizer = AutoTokenizer.from_pretrained("himedia/fincredit-Phi-3.5-mini-lr2e04-bs16-r32-steps1000-20250622_131709")
model = AutoModelForCausalLM.from_pretrained("himedia/fincredit-Phi-3.5-mini-lr2e04-bs16-r32-steps1000-20250622_131709")

# ๊ฐ„๋‹จํ•œ ์ถ”๋ก  ์˜ˆ์ œ
prompt = "๊ณ ๊ฐ์˜ ์‹ ์šฉ๋“ฑ๊ธ‰์„ ํ‰๊ฐ€ํ•ด์ฃผ์„ธ์š”:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)

๐Ÿ“Š ํ•™์Šต ๋ฐ์ดํ„ฐ ํŒŒ์ผ

์ด ๋ ˆํฌ์ง€ํ† ๋ฆฌ์—๋Š” ๋‹ค์Œ ํ•™์Šต ๊ด€๋ จ ํŒŒ์ผ๋“ค์ด ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค:

  • training_log.json: full training log (JSON format)
  • FinCreditPhi-3.5-mini_20250622_131709_training_curves.png: training-curve visualization
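A minimal sketch of inspecting training_log.json. The schema below (a "log_history" list with loss/eval_loss keys, as produced by the transformers Trainer) is an assumption; adjust it to the file's actual layout. A stand-in file is written here so the snippet is self-contained:

```python
import json

# Stand-in for the repository's training_log.json (the early-step values are
# illustrative; the final-step values match the losses reported in this card).
sample = {"log_history": [
    {"step": 500, "loss": 0.31, "eval_loss": 0.29},
    {"step": 1000, "loss": 0.1521, "eval_loss": 0.1550},
]}
with open("training_log.json", "w") as f:
    json.dump(sample, f)

# Read the log back and report the final recorded step.
with open("training_log.json") as f:
    log = json.load(f)

final = log["log_history"][-1]
print(final["step"], final["loss"], final["eval_loss"])
```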

๋ ˆํฌ์ง€ํ† ๋ฆฌ๋ช… ๊ตฌ์„ฑ

fincredit-Phi-3.5-mini-lr2e04-bs16-r32-steps1000-20250622_131709
  • fincredit-Phi-3.5-mini: base model name
  • lr2e04: learning rate (2e-4)
  • bs16: effective batch size (4 per-device × 4 gradient accumulation)
  • r32: LoRA rank
  • steps1000: training steps
  • 20250622_131709: training timestamp

์„ฑ๋Šฅ

์ด ๋ชจ๋ธ์€ ํ•œ๊ตญ์–ด ๊ธˆ์œต ํ…์ŠคํŠธ์— ๋Œ€ํ•ด ํŒŒ์ธํŠœ๋‹๋˜์–ด ์‹ ์šฉ ํ‰๊ฐ€ ๊ด€๋ จ ์งˆ์˜์‘๋‹ต์— ํŠนํ™”๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.

๋ผ์ด์„ ์Šค

Apache 2.0
