Update README.md
README.md CHANGED
@@ -1,3 +1,13 @@
+# This model is from the paper arxiv.org/abs/2504.20966
+# Softpick: No Attention Sink, No Massive Activations with Rectified Softmax
+
+See code: https://github.com/zaydzuhri/softpick-attention
+
+This model is only usable through these repositories:
+https://github.com/zaydzuhri/flash-linear-attention/tree/softpick-attention
+https://github.com/zaydzuhri/flame/tree/softpick-attention
+
+
 <div align="center">
 
 # 🔥 Flame: Flash Linear Attention Made Easy
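
The added README lines cite the Softpick paper but do not restate its rectified-softmax idea. As a rough illustration only, here is a minimal PyTorch sketch of a rectified softmax, assuming a `relu(exp(x) - 1)` numerator normalized by the summed absolute values; this is an assumption for illustration, not the formulation or kernel from the linked repositories.

```python
import torch

def rectified_softmax(scores: torch.Tensor, dim: int = -1, eps: float = 1e-6) -> torch.Tensor:
    """Illustrative rectified softmax: relu(exp(x) - 1) / (sum |exp(x) - 1| + eps).

    Unlike softmax, rows whose scores are all <= 0 can produce near-zero weights,
    the kind of property the paper's title associates with avoiding attention sinks.
    Naive form; a real attention kernel would need a numerically stable rewrite.
    """
    e = torch.exp(scores)
    num = torch.relu(e - 1.0)
    den = (e - 1.0).abs().sum(dim=dim, keepdim=True) + eps
    return num / den


# Example: with all-negative scores, softmax still forces the weights to sum to 1,
# while the rectified variant can return (near) all zeros.
scores = torch.tensor([[-2.0, -3.0, -4.0]])
print(torch.softmax(scores, dim=-1))      # ~[0.67, 0.24, 0.09]
print(rectified_softmax(scores, dim=-1))  # ~[0.00, 0.00, 0.00]
```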