runtime error
Exit code: 1. Reason:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 406, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api-inference.huggingface.co/models/meta-llama/Llama-3.3-70B-Instruct/v1/chat/completions

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    completion = client.chat.completions.create(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 892, in chat_completion
    data = self.post(model=model_url, json=payload, stream=stream)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 306, in post
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 460, in hf_raise_for_status
    raise _format(BadRequestError, message, response) from e
huggingface_hub.errors.BadRequestError: (Request ID: k9NNcThBf2YNjOQUBg1xG) Bad request: Authorization header is correct, but the token seems invalid
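The BadRequestError indicates that an Authorization header was sent, but the token it carried was rejected by the Inference API. A minimal sketch of constructing the client with an explicit token, assuming app.py uses huggingface_hub.InferenceClient and reads the token from a Space secret named HF_TOKEN (the secret name is an assumption, not taken from the log):

    import os

    from huggingface_hub import InferenceClient

    # Read the access token from the environment; on Spaces this is usually
    # stored as a repository secret (HF_TOKEN is an assumed name).
    client = InferenceClient(
        model="meta-llama/Llama-3.3-70B-Instruct",
        token=os.environ["HF_TOKEN"],
    )

    # Same call style as the failing line in app.py.
    completion = client.chat.completions.create(
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=128,
    )
    print(completion.choices[0].message.content)

If the token was revoked, expired, or lacks inference permissions, generating a new token, updating the Space secret, and restarting the Space typically clears this error.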