Any plans to open source AI Sheets?

#3 opened by deshetti

Congratulations on the great product! I wanted to check if you plan to open source the solution behind AI Sheets, so it could be integrated into other data products.

deshetti changed discussion title from Any plans to open source AI Sheets to Any plans to open source AI Sheets?
Hugging Face Sheets org

Hi @deshetti , thanks!

We'd love to know more about the uses you have in mind.

Currently, it's possible to deploy the app locally or on your own server using a Docker image. Would this be useful for you?
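For reference, a rough sketch of what that looks like if you build from the repo's Dockerfile; the image tag, port, and HF_TOKEN variable below are illustrative rather than the exact values from our docs:

```bash
# Build the image from the repo's Dockerfile (the tag "aisheets" is just an example)
docker build -t aisheets .

# Run it, exposing the app and passing a Hugging Face token for inference calls
# (port and variable name are assumptions; check the README for the exact ones)
docker run -p 3000:3000 -e HF_TOKEN=hf_xxx aisheets
```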

@dvilasuero Thanks for the response. I tried running it locally using the Dockerfile, but nothing really works apart from the landing page. Some documentation for running it using Docker would be helpful.

We are currently working on a data product that we plan to open source soon. The app lets users perform analysis on datasets using AI. We currently use a simple grid to manage the datasets, and I was thinking the Sheets UI you've built would be a great way to manage them with AI features.

Hugging Face Sheets org

Hi @deshetti, thanks for your messages.

We've put together a step-by-step guide on how to run Sheets locally. Please check the Running Sheets locally article and let us know if you have any questions or need further assistance.

Do you also plan to make it possible in the future to run it with a locally deployed LLM?

Hugging Face Sheets org

Thanks for your question, @chschinner.

We're using the inference package to call the inference endpoints. The client supports calling local endpoints (see the docs), so it would be easy to allow this kind of setup.
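As a rough sketch of what calling a local endpoint looks like with the JS client (the URL and prompt are placeholders, and the exact class name and method may vary by @huggingface/inference version; see the package docs):

```ts
import { HfInference } from "@huggingface/inference";

// No token is needed when talking to your own server
const hf = new HfInference();

// Point the client at a local inference server instead of the hosted API
// (the URL is a placeholder for wherever your endpoint runs)
const local = hf.endpoint("http://localhost:8080");

const { generated_text } = await local.textGeneration({
  inputs: "The capital of France is",
});
console.log(generated_text);
```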

Thanks a lot, @frascuchon

I wasn't aware of this and will give it a try!

Did you manage to do it, @chschinner? Happy to help otherwise.

Thanks for offering help, @julien-c . I've successfully set up AI-Sheets locally with Docker, which was very straightforward. However, I'm facing a challenge getting it to use my local Ollama LLM (on port 11434) instead of relying on Hugging Face's cloud inference servers. My initial approach was to try configuring a local endpoint URL using an environment variable (like HF_INFERENCE_ENDPOINT or INFERENCE_API_URL), but this is not working. Could you provide guidance on how to properly direct AI-Sheets to use a local LLM endpoint like Ollama in this setup?

Hugging Face Sheets org

Hi @chschinner, that's exactly what you need to do. But I realized that there is no info about the environment variable name for using a custom endpoint. To properly set up Sheets with a local endpoint, you should use the MODEL_ENDPOINT_URL environment variable.

Since you're running Sheets in Docker, you need to configure it to reach a service running outside the container. For that, you can use host.docker.internal to reach the host.

If your Ollama LLM is running on http://localhost:11434, you should define the endpoint URL as MODEL_ENDPOINT_URL=http://host.docker.internal:11434 when running the Docker container.
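Putting it together, the run command would look something like this (image name and port are illustrative, as before):

```bash
# MODEL_ENDPOINT_URL points Sheets at the Ollama server running on the Docker host
docker run -p 3000:3000 \
  -e MODEL_ENDPOINT_URL=http://host.docker.internal:11434 \
  aisheets

# Note: on Linux, host.docker.internal is not defined by default; you may need
# to add --add-host=host.docker.internal:host-gateway to the docker run command.
```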

Let me know if this works for you. We're happy to help if you struggle with it.

Thanks a lot, @frascuchon! AI-Sheets is now successfully sending requests to my local Ollama LLM. However, I'm encountering 400 Bad Request errors in the Ollama console. I suspect this might be related to a missing model specification in the requests coming from AI-Sheets. I have gemma:7b installed in Ollama. Is there another environment variable in AI-Sheets to specify which model it should ask Ollama to use (e.g., gemma:7b)?
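For context, a direct request to Ollama only succeeds when the body names a model, which is what makes me suspect the missing model spec (gemma:7b here is just the model I have installed):

```bash
# Ollama requires a "model" field in the request body; omitting it fails,
# which would match the 400s I'm seeing from AI-Sheets' requests
curl http://localhost:11434/api/generate \
  -d '{"model": "gemma:7b", "prompt": "Hello", "stream": false}'
```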
