FlameF0X 
posted an update Jun 28
SnowflakeCore-G1 Update:
Got it running and training! Context window is currently set to 2048 tokens.
Training is active and stable. Will share results once I have some metrics to report.

That's exciting news! Glad to hear SnowflakeCore-G1 is up and training smoothly with a 2048-token context window. Looking forward to seeing your metrics and results; keep us posted on any performance benchmarks or breakthrough observations!


Hello kivenemi! Thanks for waiting! Here are the first metrics from SnowflakeCore-G1-Tiny (~355.87M params):

Performance

  • Generation Speed: 57.26 tokens/sec
  • Model Size: 1.36 GB
  • Memory Usage: 6.35 GB (2048 tokens)
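Throughput figures like the 57.26 tokens/sec above are usually obtained by timing a generation call and dividing the token count by the elapsed wall-clock time. A minimal sketch of that measurement, using a stand-in generator rather than the actual model (the function names here are illustrative, not from the SnowflakeCore codebase):

```python
import time

def tokens_per_second(generate_fn, n_tokens):
    """Time one generation call and return throughput in tokens/sec."""
    start = time.perf_counter()
    generate_fn(n_tokens)  # any callable that produces n_tokens tokens
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

# Illustration with a dummy "model" that pretends each token takes ~1 ms.
def dummy_generate(n):
    time.sleep(n * 0.001)

speed = tokens_per_second(dummy_generate, 100)
```

In practice you would warm the model up first and average over several runs, since the first call often includes one-time setup cost.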

Benchmarks (After training)

  • GSM8K: 20%
  • MMLU: 0% (issues with the benchmark itself)
  • HumanEval: 0% (issues with the benchmark itself)
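Scores like the GSM8K figure are typically exact-match accuracy over final answers: the fraction of problems where the extracted answer equals the reference, expressed as a percentage. A toy sketch of that metric (illustrative data, not the actual eval harness):

```python
def exact_match_accuracy(predictions, references):
    """Percentage of predictions that exactly match the reference answers."""
    correct = sum(p == r for p, r in zip(predictions, references))
    return 100.0 * correct / len(references)

# Toy illustration: 1 correct out of 5 gives 20.0
preds = ["42", "7", "13", "8", "100"]
refs = ["42", "9", "12", "6", "99"]
score = exact_match_accuracy(preds, refs)  # 20.0
```

A 0% score that comes with "issues with the benchmark itself" usually points at the harness (answer extraction, prompt format) rather than the model, so it is worth re-running those two once the harness is fixed.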

The model is still under development, so future versions should perform better.
The full benchmark is available at FlameF0X/SnowflakeCore-G1-Benchmark.
