Danna8 committed
Commit a6157d6 · verified · 1 Parent(s): 07b2e55

Update README.md

Files changed (1)
  1. README.md +45 -9
README.md CHANGED
@@ -1,9 +1,45 @@
- # Aya 8B Model
-
- This is the Aya 8B model, a multilingual model by Cohere that supports 23 languages.
-
- This model was uploaded directly from Ollama's Aya model files.
- ## Model Structure
-
- - `/blobs`: Contains the model weight files
- - `/manifests`: Contains the model manifests
+ ---
+ language: en
+ tags:
+ - text-generation
+ - ollama
+ - aya
+ - llm
+ - conversational
+ pipeline_tag: text-generation
+ library_name: transformers
+ inference: true
+ ---
+
+ # Aya-8B
+
+ ## Model Description
+
+ This is the Aya-8B model, originally designed for Ollama and converted to be compatible with Hugging Face. Aya is an open-source language model known for its conversational abilities and text generation capabilities.
+
+ ## Usage
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("Danna8/aya-8b")
+ model = AutoModelForCausalLM.from_pretrained("Danna8/aya-8b")
+
+ inputs = tokenizer("Hello, how are you today?", return_tensors="pt")
+ outputs = model.generate(inputs["input_ids"], max_length=100)
+ response = tokenizer.decode(outputs[0], skip_special_tokens=True)
+ print(response)
+ ```
+
+ ## Model Details
+
+ - **Model Type:** Transformer-based language model
+ - **Size:** 8 billion parameters
+
+
+ ## Limitations and Biases
+
+ Like all language models, Aya-8B may reproduce biases present in its training data. Users should be aware of these limitations when deploying the model.
+
+ ## License
+