Runtime error

Exit code: 1.

Locals at the point of failure (from the traceback):

    OLLAMA_BASE_URL = None
    OLLAMA_BASE_URL_COMPLETION = None
    OPENAI_BASE_URL = None
    OPENAI_BASE_URL_COMPLETION = None
    random = <module 'random' from '/usr/local/lib/python3.10/random.py'>
    TOKEN_INDEX = 1
    TOKENIZER_ID = None
    TOKENIZER_ID_COMPLETION = None
    VLLM_BASE_URL = None
    VLLM_BASE_URL_COMPLETION = None

Exception: Error loading InferenceEndpointsLLM: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct.

403 Client Error. (Request ID: Root=1-67fe6856-55572f52531b7e7c758730bf;82292b4c-853a-4273-97d7-f9d4a26bd63e)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct/resolve/main/config.json. Access to model meta-llama/Llama-3.2-3B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct to ask for access.
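A likely remediation, sketched under assumptions: the process is authenticating with no token (or with a token from an account that has not been granted access to the gated repo). After requesting and receiving access on the model page, a token from that account can be exposed to the process; huggingface_hub and libraries built on it generally read the HF_TOKEN environment variable. The token value below is a placeholder, not a real credential:

```shell
# Hypothetical setup: export a Hugging Face User Access Token from an account
# that has been granted access to meta-llama/Llama-3.2-3B-Instruct.
# On a Hugging Face Space, set HF_TOKEN as a secret in the Space settings
# instead of exporting it in the shell.
export HF_TOKEN=hf_xxxxxxxxxxxxxxxx   # placeholder value, replace with your token
```

If access to the gated model cannot be obtained, another option is pointing the pipeline at a non-gated instruct model of similar size.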
