Not working (runtime error)

#2
by guidocelada - opened

runtime error
Exit code: 1. Reason: ->flash-attn) (2.21.5)
/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py:716: UserWarning: Can't initialize NVML
  warnings.warn("Can't initialize NVML")
Traceback (most recent call last):
  File "/home/user/app/app.py", line 24, in <module>
    import spaces
  File "/usr/local/lib/python3.10/site-packages/spaces/__init__.py", line 19, in <module>
    from .zero.decorator import GPU
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/__init__.py", line 15, in <module>
    raise RuntimeError(
RuntimeError: CUDA has been initialized before importing the spaces package
Container logs:

===== Application Startup at 2025-04-21 23:32:53 =====

WARNING: The directory '/home/user/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you should use sudo's -H flag.
Collecting flash-attn
Downloading flash_attn-2.7.4.post1.tar.gz (6.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 98.1 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: torch in /usr/local/lib/python3.10/site-packages (from flash-attn) (2.5.1)
Collecting einops (from flash-attn)
Downloading einops-0.8.1-py3-none-any.whl.metadata (13 kB)
Requirement already satisfied: filelock in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (3.17.0)
Requirement already satisfied: typing-extensions>=4.8.0 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (4.12.2)
Requirement already satisfied: networkx in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (3.4.2)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (3.1.5)
Requirement already satisfied: fsspec in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (2024.12.0)
Requirement already satisfied: nvidia-cuda-nvrtc-cu12==12.4.127 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.127)
Requirement already satisfied: nvidia-cuda-runtime-cu12==12.4.127 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.127)
Requirement already satisfied: nvidia-cuda-cupti-cu12==12.4.127 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.127)
Requirement already satisfied: nvidia-cudnn-cu12==9.1.0.70 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (9.1.0.70)
Requirement already satisfied: nvidia-cublas-cu12==12.4.5.8 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.5.8)
Requirement already satisfied: nvidia-cufft-cu12==11.2.1.3 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (11.2.1.3)
Requirement already satisfied: nvidia-curand-cu12==10.3.5.147 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (10.3.5.147)
Requirement already satisfied: nvidia-cusolver-cu12==11.6.1.9 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (11.6.1.9)
Requirement already satisfied: nvidia-cusparse-cu12==12.3.1.170 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.3.1.170)
Requirement already satisfied: nvidia-nccl-cu12==2.21.5 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (2.21.5)
Requirement already satisfied: nvidia-nvtx-cu12==12.4.127 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.127)
Requirement already satisfied: nvidia-nvjitlink-cu12==12.4.127 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (12.4.127)
Requirement already satisfied: triton==3.1.0 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (3.1.0)
Requirement already satisfied: sympy==1.13.1 in /usr/local/lib/python3.10/site-packages (from torch->flash-attn) (1.13.1)
Requirement already satisfied: mpmath<1.4,>=1.1.0 in /usr/local/lib/python3.10/site-packages (from sympy==1.13.1->torch->flash-attn) (1.3.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/site-packages (from jinja2->torch->flash-attn) (2.1.5)
Downloading einops-0.8.1-py3-none-any.whl (64 kB)
Building wheels for collected packages: flash-attn
Building wheel for flash-attn (setup.py): started
Building wheel for flash-attn (setup.py): finished with status 'done'
Created wheel for flash-attn: filename=flash_attn-2.7.4.post1-py3-none-any.whl size=187797312 sha256=b267f80a08e516292cdd748056a2178a45b8abedf7fca123292eb17c21c8c87c
Stored in directory: /tmp/pip-ephem-wheel-cache-oua85wpb/wheels/59/ce/d5/08ea07bfc16ba218dc65a3a7ef9b6a270530bcbd2cea2ee1ca
Successfully built flash-attn
Installing collected packages: einops, flash-attn
Successfully installed einops-0.8.1 flash-attn-2.7.4.post1
/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py:716: UserWarning: Can't initialize NVML
  warnings.warn("Can't initialize NVML")
Traceback (most recent call last):
  File "/home/user/app/app.py", line 24, in <module>
    import spaces
  File "/usr/local/lib/python3.10/site-packages/spaces/__init__.py", line 19, in <module>
    from .zero.decorator import GPU
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/__init__.py", line 15, in <module>
    raise RuntimeError(
RuntimeError: CUDA has been initialized before importing the spaces package
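For anyone hitting the same RuntimeError: on ZeroGPU Spaces, `spaces` must be imported before anything that initializes CUDA (touching `torch.cuda` at module load, or importing a library that does, is enough to trigger it). A minimal sketch of the safe ordering in `app.py` — the Gradio interface and function names here are illustrative, not this Space's actual code:

```python
# app.py — import spaces before any module that may initialize CUDA
import spaces          # must be the first CUDA-related import

import torch           # safe to import after `spaces`
import gradio as gr    # illustrative UI; the real app code differs

@spaces.GPU            # ZeroGPU attaches a GPU only while this function runs
def greet(prompt: str) -> str:
    device = "cuda" if torch.cuda.is_available() else "cpu"
    return f"[{device}] {prompt}"

gr.Interface(fn=greet, inputs="text", outputs="text").launch()
```

Keeping all CUDA-touching work inside the `@spaces.GPU`-decorated function (rather than at module top level) avoids this error on restarts.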

Should be fixed now, sorry about the issues!

mrfakename changed discussion status to closed