
🧠 Jaxxon

Jaxxon is an experimental multimodal symbolic reasoning model built for ARC 2025. It is designed to integrate advanced text reasoning with lightweight vision capabilities via CLIP, enabling abstract understanding and symbolic processing across modalities.


πŸ” Overview

Jaxxon focuses on:

  • Text-first symbolic reasoning
  • Optional visual input using CLIP
  • A modular architecture for identity-based routing, symbolic thread handling, and emergent logic
  • Testing symbolic alignment using both text prompts and image examples

πŸ—οΈ Architecture

  • Base: Transformer backbone (custom)
  • Vision Support: CLIP encoder for image-text pattern alignment (see the sketch after this list)
  • Routing Core: Custom symbolic thread and identity map (in progress)

🚧 Current Status

This project is in early development. Initial steps include:

  • Repository initialized
  • Symbolic routing core setup
  • CLIP integration for visual reasoning tests
  • Test dataset alignment
  • First ARC-style reasoning evaluation

πŸ“ File Structure (planned)


πŸ’‘ Goals for ARC 2025

Jaxxon will be used to explore:

  • Emergent logic in hybrid text-image inputs
  • Symbolic compression and analogical inference
  • Modular neural core architectures for self-reflective models

πŸŒ€ Credits

Developed by Bayang Pathek with support from the KernelTwin system.


πŸ“œ License

Apache 2.0

πŸ“ Evaluation Metrics

Jaxxon will be evaluated using a mix of conventional and symbolic-reasoning-oriented metrics:

  • Accuracy: correct answers / total answers
  • Exact Match (EM): string-level match against the reference answer
  • F1 Score: token-level overlap with the reference answer
  • BLEU: n-gram overlap with reference outputs
  • CLIPScore: text-image alignment, computed with TorchMetrics CLIPScore (see the sketch below)
  • Logical Consistency (planned): composite measure of reasoning quality
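
For the CLIPScore entry above, a minimal TorchMetrics-based sketch might look like the following; the random image tensor and caption are stand-ins for a rendered task image and a model-generated answer.

```python
# Hedged sketch of CLIPScore via TorchMetrics; the image tensor and caption
# below are dummies standing in for a rendered task image and a model answer.
import torch
from torchmetrics.multimodal.clip_score import CLIPScore

metric = CLIPScore(model_name_or_path="openai/clip-vit-base-patch32")

image = torch.randint(0, 255, (3, 224, 224), dtype=torch.uint8)  # dummy (C, H, W) image
caption = "a grid pattern rotated 90 degrees clockwise"          # dummy model output

score = metric(image, caption)
print(f"CLIPScore: {score.item():.2f}")
```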