Python error raised when deploying model to HF Inference Endpoint

#12
by alprielse - opened

Error being returned:

{"error":"'str' object has no attribute 'pop'"}

Steps to reproduce:

  • Spin up this model in an HF Inference Endpoint
  • Copy-paste the HF Inference Endpoint boilerplate
  • Customize the input field with the prompt template found in the Model Card

Here's the code I'm running (the "default" HF Inference Endpoint boilerplate), using the prompt template from the Usage example in the Model Card:

import requests

API_URL = "https://bkqo4fhhs95p3jmw.us-east-1.aws.endpoints.huggingface.cloud"
headers = {
    "Accept" : "application/json",
    "Authorization": "Bearer hf_XXXXX",
    "Content-Type": "application/json" 
}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": "<|input|>\n### Template:\n{template}\n### Text:\n{text}\n\n<|output|>",  // Used my own template and text strings
    "parameters": {
        "max_new_tokens": 300
    }
})

Here's the full prompt I'm using:
https://gist.github.com/AlpriElse/f35f0c911ef9cc4dcf2603220fd5f1b5
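For what it's worth, here's roughly how I'm building the inputs string before sending it, reusing the query() function from the snippet above (the template and text values below are illustrative placeholders, not my real ones -- those are in the gist):

import json

# Illustrative template/text -- my actual values are in the gist linked above
template = json.dumps({"Name": "", "Email": ""}, indent=4)
text = "Contact John Doe at john@example.com about the renewal."

prompt = "<|input|>\n### Template:\n{template}\n### Text:\n{text}\n\n<|output|>".format(
    template=template,
    text=text,
)

output = query({
    "inputs": prompt,
    "parameters": {"max_new_tokens": 300}
})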

Attached are the logs I'm seeing in the HF Dedicated Endpoint:
[Screenshot: Screenshot 2024-12-11 at 11.57.31 PM.png]

This looks to be more of an issue with the Hugging Face Inference Endpoints than with our model, so I'm not sure how much I can help, unfortunately.

From the trace, it seems like one of the inputs you are sending may be empty? Otherwise, something is a string where it should be a dictionary.
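If you want to dig further, a quick debugging sketch (nothing model-specific, just generic requests usage with the same API_URL and headers as in your snippet) would be to check the payload shape before posting and surface the raw HTTP status/body instead of only the parsed JSON:

def query_debug(payload):
    # Sanity-check the payload shape before sending
    assert isinstance(payload, dict), f"payload should be a dict, got {type(payload)}"
    assert isinstance(payload.get("inputs"), str) and payload["inputs"], "inputs should be a non-empty string"
    assert isinstance(payload.get("parameters", {}), dict), "parameters should be a dict"

    response = requests.post(API_URL, headers=headers, json=payload)
    # Print the raw status and body so server-side errors are visible as-is
    print(response.status_code, response.text)
    response.raise_for_status()
    return response.json()

That would at least tell you whether the 'str' error comes back with a 4xx/5xx status or inside a 200 response.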

Yeah, the logs don't make sense to me either, since it's not clear whether I should be invoking the API with different params. Probably just an HF Endpoints issue like you suggested.

Thanks for taking a look :)

alprielse changed discussion status to closed
