Add `handler.py` and `requirements.txt`
This PR adds the `handler.py` and `requirements.txt` files required to run Microsoft OmniParser v2 on Inference Endpoints and on Azure ML. The code is adapted from both the Microsoft OmniParser v2 Gradio demo and the code shared by @ThomasDh-C on his fork of OmniParser v2.
Contributions are welcome. Note that the code has been slightly reorganized and some variables renamed, but it is otherwise unchanged. The code can also be modified at any point, so @ThomasDh-C feel free to jump in if you'd like to change anything. Thanks in advance 🤗
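For context, Inference Endpoints expects `handler.py` to expose an `EndpointHandler` class with an `__init__(self, path)` and a `__call__(self, data)` method. Below is a minimal sketch of that interface, not the actual handler in this PR: it assumes a YOLO-based icon-detection checkpoint at `icon_detect/model.pt` and a base64-encoded screenshot payload, and it omits the OCR and captioning steps.

```python
# Minimal sketch of the Inference Endpoints custom-handler interface.
# The checkpoint path and the input/output formats are assumptions for illustration.
import base64
import io

from PIL import Image
from ultralytics import YOLO


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` is the local repository directory on the endpoint;
        # the icon-detection weights are assumed to live under icon_detect/.
        self.detector = YOLO(f"{path}/icon_detect/model.pt")

    def __call__(self, data: dict) -> dict:
        # Inference Endpoints passes the request payload under the "inputs" key;
        # a base64-encoded screenshot is assumed here.
        image = Image.open(io.BytesIO(base64.b64decode(data["inputs"])))
        results = self.detector.predict(image, conf=0.05, verbose=False)
        boxes = results[0].boxes
        return {
            "boxes": boxes.xyxy.tolist(),   # [x1, y1, x2, y2] per detected element
            "scores": boxes.conf.tolist(),  # confidence per detection
        }
```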
This PR also updates the `README.md` to add the `custom_code` and `endpoints-template` tags, which identify this model as a custom model architecture and as one that runs with a custom `handler.py` on Inference Endpoints, respectively.
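For reference, the tags end up in the model card front matter along these lines (other metadata fields omitted):

```yaml
tags:
  - custom_code
  - endpoints-template
```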
This looks great! This will be very useful.
A couple of things (rough sketch of the fixes after the list):
- Add parentheses to the initial device one-liner (CUDA or not) to clarify the order of operations.
- The YOLO threshold should be 0.05, as per this file.
- The default argument for easyocr's `readtext` is `paragraph=False`, so I think this can be cleaned up.
- Finally, the commented-out lines can be removed.
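Roughly what I have in mind (illustrative names only, not the exact lines in `handler.py`):

```python
import easyocr
import numpy as np
import torch

# Parenthesize the device one-liner so the conditional reads unambiguously.
device = ("cuda" if torch.cuda.is_available() else "cpu")

# Match the YOLO box threshold used upstream.
BOX_THRESHOLD = 0.05

# easyocr's readtext already defaults to paragraph=False,
# so the explicit keyword argument can simply be dropped.
reader = easyocr.Reader(["en"], gpu=(device == "cuda"))
ocr_results = reader.readtext(np.zeros((64, 64, 3), dtype=np.uint8))  # dummy image for the sketch
```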
Looks great to me!