Unable to deploy to SageMaker via Studio notebook

#18
by dualblades - opened

Kernel specifications:

Image: Data Science 3.0
Kernel: Python 3
Instance type: ml.t3.medium
Start-up script: No script

This is my exact notebook code, copied from the "Deploy" button on this page:

import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()

# Hub Model configuration. https://huggingface.co/models
hub = {
    'HF_MODEL_ID':'HuggingFaceM4/idefics-80b',
    'HF_TASK':'text-generation'
}

# create Hugging Face Model Class
huggingface_model = HuggingFaceModel(
    transformers_version='4.26.0',
    pytorch_version='1.13.1',
    py_version='py39',
    env=hub,
    role=role, 
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
    initial_instance_count=1, # number of instances
    instance_type='ml.m5.xlarge' # ec2 instance type
)

data = {
 "inputs": "Can you please let us know more details about your "
}
predictor.predict(data)

I am able to deploy the model and I can see the endpoint. However, running the predict method always throws this error:

ModelError                                Traceback (most recent call last)
Cell In[17], line 1
----> 1 predictor.predict(data)

File /opt/conda/lib/python3.10/site-packages/sagemaker/base_predictor.py:185, in Predictor.predict(self, data, initial_args, target_model, target_variant, inference_id, custom_attributes)
    138 """Return the inference from the specified endpoint.
    139 
    140 Args:
   (...)
    174         as is.
    175 """
    177 request_args = self._create_request_args(
    178     data,
    179     initial_args,
   (...)
    183     custom_attributes,
    184 )
--> 185 response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
    186 return self._handle_response(response)

File /opt/conda/lib/python3.10/site-packages/botocore/client.py:535, in ClientCreator._create_api_method.<locals>._api_call(self, *args, **kwargs)
    531     raise TypeError(
    532         f"{py_operation_name}() only accepts keyword arguments."
    533     )
    534 # The "self" in this scope is referring to the BaseClient.
--> 535 return self._make_api_call(operation_name, kwargs)

File /opt/conda/lib/python3.10/site-packages/botocore/client.py:980, in BaseClient._make_api_call(self, operation_name, api_params)
    978     error_code = parsed_response.get("Error", {}).get("Code")
    979     error_class = self.exceptions.from_code(error_code)
--> 980     raise error_class(parsed_response, operation_name)
    981 else:
    982     return parsed_response

ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from primary with message "{
  "code": 400,
  "type": "InternalServerException",
  "message": "\u0027idefics\u0027"
}
".

What can I do to fix this issue and properly invoke the endpoint?

Hi,
I am not familiar with the SageMaker part, but at a quick glance I would say you need to update your transformers version. idefics was released as part of the 4.32.0 release (and some small details were fixed in 4.32.1 too).
If that doesn't entirely solve your problem, perhaps @philschmid knows?
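The version cutoff above is easy to check programmatically. A minimal sketch using plain tuple comparison (no dependency on having transformers installed; the helper names are illustrative):

```python
def parse_version(version: str) -> tuple:
    """Turn a release string like '4.32.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

# idefics support landed in transformers 4.32.0
IDEFICS_MIN = parse_version("4.32.0")

def supports_idefics(installed: str) -> bool:
    """True if the given transformers version should include idefics."""
    return parse_version(installed) >= IDEFICS_MIN

# The DLC pinned in the notebook above ships transformers 4.26.0,
# which predates idefics:
print(supports_idefics("4.26.0"))  # False
print(supports_idefics("4.32.1"))  # True
```

This also explains the cryptic `"message": "\u0027idefics\u0027"` in the 400 response: the container's transformers build has no `idefics` model type registered, so the lookup fails.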

Hi @VictorSanh , thank you for the suggestion. Unfortunately, 4.32.0 is not supported yet. I upgraded my SageMaker SDK using the command in the screenshot, but still no luck.
Screenshot 2023-08-28 at 10.17.09 AM.png

This seems like a concerning issue if the model simply cannot be run on SageMaker.

Got it! I will let @philschmid advise.

@philschmid Any update here? Are we able to run idefics on SageMaker?

You can customize the transformers version by providing a requirements.txt and updating manually until we have a new container. That being said, an m5 instance is not enough to run an 80B-parameter model, nor would the task text-generation be correct.
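A minimal sketch of the requirements.txt workaround mentioned above, assuming the container installs a `code/requirements.txt` packaged alongside the model artifacts (the file and directory names here are illustrative):

```shell
# Illustrative layout: pin a transformers release that includes idefics.
mkdir -p code
printf 'transformers==4.32.1\n' > code/requirements.txt

# Package it so it can be shipped to the endpoint with the model:
tar -czf model.tar.gz code/
tar -tzf model.tar.gz
```

In the SageMaker SDK this is typically wired up either by passing a `model.tar.gz` like this via `model_data`, or via the `source_dir`/`entry_point` arguments of `HuggingFaceModel`. On the sizing point: 80B parameters at fp16 is roughly 160 GB of weights alone, far beyond the 16 GB of RAM on an ml.m5.xlarge, so a large multi-GPU instance is required regardless.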

If that's the case, I'm confused as to why the "Deploy with SageMaker" button provides those configurations. Could you please update them?
Screenshot 2023-08-30 at 12.27.52 PM.png

Here is an example of how to deploy it: https://www.philschmid.de/sagemaker-idefics