Can't use it with vLLM, although gemma-2b from Google is supported
#8 opened about 1 year ago by yaswanth-iitkgp
Can't generate decent text out of it
#7 opened about 1 year ago by useless-ai
Compare with original gemma 2b?
#6 opened about 1 year ago by supercharge19
Tests & Eval
#5 opened about 1 year ago by segmond
Performance on long context benchmarks?
#4 opened about 1 year ago by odusseys
OOM on A100
#3 opened about 1 year ago by chuyi777
Is there any data showing inference-time performance?
#2 opened about 1 year ago by CMCai0104
Context window is only 8k???
#1 opened about 1 year ago by rombodawg
