---
language: en
tags:
  - text-generation
  - ollama
  - aya
  - llm
  - conversational
pipeline_tag: text-generation
library_name: transformers
inference: true
---

# Aya-8B

## Model Description

This is the Aya-8B model, originally packaged for Ollama and converted for use with the Hugging Face `transformers` library. Aya is an open-source language model known for its conversational ability and general text generation.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("Danna8/aya-8b")
model = AutoModelForCausalLM.from_pretrained("Danna8/aya-8b")

# Tokenize a prompt and generate a completion; passing **inputs also
# forwards the attention mask to generate()
inputs = tokenizer("Hello, how are you today?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
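
Since the model is tagged as conversational, you may prefer a chat-style prompt. The following is a minimal sketch that assumes the tokenizer ships a chat template, which has not been verified for this checkpoint; if `apply_chat_template` raises an error, fall back to the plain-text prompt above.

```python
# Hypothetical chat-style usage; assumes the tokenizer defines a chat
# template (unverified for this checkpoint).
messages = [
    {"role": "user", "content": "Summarize what a language model is in one sentence."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(input_ids, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```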

## Model Details

- **Model Type:** Transformer-based language model
- **Size:** 8 billion parameters (a quick way to check this is sketched below)
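
To confirm the parameter count yourself, here is a minimal sketch that reuses the `model` object loaded in the Usage section:

```python
# Sum the element counts of all parameter tensors in the loaded model
total = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {total / 1e9:.2f}B")
```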

## Limitations and Biases

Like all language models, Aya-8B may reproduce biases present in its training data. Users should keep these limitations in mind when deploying the model.

## License