
raincandy-u/TinyChat-1776K

A tiny language model trained from scratch on the TinyChat dataset.

The aim is to achieve natural conversational responses with the smallest possible model. It was trained on a dataset of English conversations at the level of a three-year-old child.

Note: It has no world knowledge, so do not ask it factual or knowledge-based questions.

Model Spec

from transformers import AutoConfig

# Llama architecture scaled down to roughly 1.78M parameters
config = AutoConfig.for_model(
    model_type="llama",
    hidden_size=192,
    intermediate_size=640,
    num_attention_heads=16,
    num_hidden_layers=3,
    num_key_value_heads=4,      # grouped-query attention
    tie_word_embeddings=True,   # share input and output embeddings
    vocab_size=2048,
    max_position_embeddings=256
)
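To check that this configuration really lands at about 1.78M parameters (the 1776K in the model name), you can instantiate a randomly initialized model from the config above and count its weights. A minimal sketch, assuming the transformers library with its current Llama defaults (no attention or MLP biases):

from transformers import AutoModelForCausalLM

# Build untrained weights from the config above and count them.
# With tied embeddings the shared weight is counted once.
model = AutoModelForCausalLM.from_config(config)
print(f"{sum(p.numel() for p in model.parameters()):,}")  # expected ≈ 1,776,960 (≈ 1.78M)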

Template

<A>Hi, Tom. How are you? <end>
<B>I'm fine, thank you. And you? <end>
<A>Fine. What's your favorite color? <end>
<B>My favorite color is black. <end>
<A>Do you like cats? <end>
<B>

Example output:

Yes, I do. I like it too. They are good for me.

Generation Parameters

top_k=40,
top_p=0.8,
temperature=1
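A minimal generation sketch using these parameters, assuming the model and tokenizer load with the standard transformers Auto classes and that turns in the <A>/<B>/<end> template are separated by newlines (the exact turn separator is an assumption, not confirmed by the card):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "raincandy-u/TinyChat-1776K"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

# Build a prompt in the <A>/<B> turn format and let the model complete <B>'s reply.
prompt = (
    "<A>Hi, Tom. How are you? <end>\n"
    "<B>I'm fine, thank you. And you? <end>\n"
    "<A>Do you like cats? <end>\n"
    "<B>"
)
inputs = tokenizer(prompt, return_tensors="pt")

output = model.generate(
    **inputs,
    do_sample=True,     # sampling must be on for top_k/top_p/temperature to apply
    top_k=40,
    top_p=0.8,
    temperature=1.0,
    max_new_tokens=64,
)
# Print only the newly generated continuation of <B>'s turn.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))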
Model size: 1.78M parameters (BF16, Safetensors)
