runtime error
Exit code: 1. Reason:

/usr/local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:161: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(

100%|████████████████████████████████████████| 139M/139M [00:00<00:00, 203MiB/s]

TGI client has no function call support: Expecting value: line 1 column 1 (char 0)

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 974, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/local/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/local/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/lib/python3.10/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 79, in <module>
    Settings.llm = TextGenerationInference(
  File "/usr/local/lib/python3.10/site-packages/llama_index/llms/text_generation_inference/base.py", line 171, in __init__
    context_window = get_max_input_length(model_url) or DEFAULT_CONTEXT_WINDOW
  File "/usr/local/lib/python3.10/site-packages/llama_index/llms/text_generation_inference/utils.py", line 29, in get_max_input_length
    model_info = dict(requests.get(url).json())
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 978, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
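The final `JSONDecodeError: Expecting value: line 1 column 1 (char 0)` means the body that `requests.get(url).json()` tried to parse was not JSON at all: typically an empty response, or an HTML error page returned because the TGI endpoint at `model_url` is unreachable or not yet up. A minimal sketch reproducing the error locally (the example bodies are illustrative, not taken from the log):

```python
import json

# json.loads raises the exact error seen in the traceback whenever the
# input does not start with a JSON value, e.g. an empty body or an
# HTML error page served instead of the expected model-info JSON.
for body in ("", "<html>503 Service Unavailable</html>"):
    try:
        json.loads(body)
    except json.JSONDecodeError as e:
        print(repr(body[:20]), "->", e)
```

Since the failure happens inside `TextGenerationInference.__init__` while fetching the model info, a likely fix is to verify that `model_url` points at a running TGI server (e.g. by fetching the URL with `curl` and confirming it returns JSON) before constructing the client.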