What is the actual context size of nomic embed text?
#51
by carlttt - opened
I am running nomic-embed-text v1.5 with Ollama. As far as I know, nomic-embed-text supports a context of up to 8192 tokens, but it only accepts 2048 tokens when I test it with a Postman HTTP request.
Here is the info I printed from Ollama:
And here is the result of the Postman POST request:
No matter how long my text is, the model only takes the first 2048 tokens to generate the embeddings. What should I configure to allow more than 2048 tokens?
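For reference, this is roughly what my Postman request looks like when translated into Python (the URL, port, and input text below are placeholders, and `num_ctx` is only my guess at the relevant option, I'm not sure whether it belongs here or in a Modelfile):

```python
import requests

# Rough equivalent of my Postman request to the local Ollama server.
# URL, port, and input text are placeholders for what I actually send.
OLLAMA_URL = "http://localhost:11434/api/embeddings"

long_text = "some long document paragraph " * 2000  # well over 2048 tokens

payload = {
    "model": "nomic-embed-text",
    "prompt": long_text,
    # Assumption: num_ctx is the option that controls the context window.
    # I don't know if passing it per request is enough to go past 2048 tokens.
    "options": {"num_ctx": 8192},
}

response = requests.post(OLLAMA_URL, json=payload)
response.raise_for_status()
print(len(response.json()["embedding"]))  # dimensionality of the returned vector
```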