Help: My Local Model Does Not Seem to Produce Correct Recognition Results

#3
by hjkcai - opened

I tried the typhoon-ocr-7b-GGUF quants (Q4_K_M, Q8_0, and F16) together with typhoon-ocr-7b.mmproj-f16.gguf in LM Studio (which is based on llama.cpp). When I tried to recognize some Thai text from a screenshot of a PDF file, it produced a result with lots of errors, so the output is not usable in production.

However, when I tried the online demo of typhoon-ocr, it worked perfectly: the result matched the original text exactly.

Could you please give me some clues on how to make it work locally? Thanks a lot!

LM Studio:

image.png

Input Image:

thai003-line1.jpg

Recognized Text:

ผู้เข้าร่วมโดยสิทธิการเช่าของตนเองเมื่อเรียนคณะสถาปัตยกรรม

Original Text:

(๘) ผู้เช่าจะไม่โอนสิทธิการเช่าของตนซึ่งมีอยู่เหนือทรัพย์สินที่เช่า ไม่ว่าทั้งหมดหรือบางส่วนให้แก่บุคคลอื่น

Not really, we are just providing the quants. Your problem could be a software issue (LM Studio often causes trouble where plain llama.cpp works), a problem with your setup or parameters, a problem with the model itself, or a problem with llama.cpp. Or just random variation. All of these are plausible :)
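One quick way to rule out prompt and sampling differences is to query the model programmatically with greedy decoding through LM Studio's OpenAI-compatible local server instead of the chat UI. This is only a rough sketch: it assumes the default port 1234, that the quant is loaded under the identifier typhoon-ocr-7b, and a generic OCR instruction; the online demo may well use a different prompt template, which alone can change output quality.

```python
import base64
from openai import OpenAI

# LM Studio exposes an OpenAI-compatible server, by default on port 1234.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# Encode the input image as a base64 data URL.
with open("thai003-line1.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="typhoon-ocr-7b",   # use whatever identifier LM Studio shows for the loaded quant
    temperature=0.0,          # greedy decoding, so sampling noise is ruled out
    messages=[
        {
            "role": "user",
            "content": [
                # Placeholder instruction; the official demo may send a different prompt.
                {"type": "text", "text": "Read and return the text in this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```

If the output is still wrong with temperature 0 and the original image (not a rescaled screenshot), the quant or the llama.cpp side is the more likely culprit; if it suddenly matches the demo, the difference was in the prompt or sampling setup.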
