Update README.md (#2)
- Update README.md (be95c6b154df5b92a323d762e216e573d7f7a205)
Co-authored-by: Yu-Hui Chen <[email protected]>
README.md
CHANGED
@@ -72,6 +72,45 @@ for everyone.
}
```

## Use the models
### Colab

*Disclaimer: The target deployment surface for the LiteRT models is Android/iOS/Web and the stack has been optimized for performance on these targets. Trying out the system in Colab is an easier way to familiarize yourself with the LiteRT stack, with the caveat that the performance (memory and latency) on Colab could be much worse than on a local device.*

[](https://colab.sandbox.google.com/github/google-ai-edge/mediapipe-samples/blob/main/codelabs/litert_inference/gemma3_1b_tflite.ipynb)
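The notebook above handles tokenization, KV-cache management, and decoding for you. As a rough sketch of what sits underneath, the snippet below shows the standard LiteRT interpreter surface for a `.tflite` model, plus the Gemma chat-turn wrapper. The `ai_edge_litert` package name and the single-call `run_once` shape are assumptions for illustration, not the notebook's exact code.

```python
def gemma_prompt(user_text: str) -> str:
    """Wrap raw text in the chat-turn markers Gemma-family models expect."""
    return (
        "<start_of_turn>user\n"
        f"{user_text}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

def run_once(model_path: str, input_ids):
    """Sketch: one forward pass through a LiteRT model (returns raw outputs,
    e.g. next-token logits). Assumes `pip install ai-edge-litert numpy`."""
    import numpy as np
    from ai_edge_litert.interpreter import Interpreter  # assumed package name

    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], np.asarray(input_ids, dtype=inp["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])
```

A real generation loop would call `run_once` repeatedly, sampling a token from the logits and appending it to `input_ids` until an end-of-turn token is produced.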
### Android via Google AI Edge Gallery and MediaPipe

* Download and install [the apk](https://github.com/google-ai-edge/gallery/releases/latest/download/ai-edge-gallery.apk).
* Follow the instructions in the app.

To build the demo app from source, please follow the [instructions](https://github.com/google-ai-edge/gallery/blob/main/README.md) from the GitHub repository.
### Android or Desktop via LiteRT LM

Follow the LiteRT LM [instructions](https://github.com/google-ai-edge/LiteRT-LM/blob/main/README.md) to build our open-source LiteRT LM runtime and run LiteRT models.
### iOS via MediaPipe

* Clone the [MediaPipe samples](https://github.com/google-ai-edge/mediapipe-samples) repository and follow the [instructions](https://github.com/google-ai-edge/mediapipe-samples/tree/main/examples/llm_inference/ios/README.md) to build the LLM Inference iOS sample app using Xcode.
* Run the app via the iOS simulator or deploy to an iOS device.
### Web

* Build and run our [sample web app](https://github.com/google-ai-edge/mediapipe-samples/blob/main/examples/llm_inference/js/README.md).

To add the model to your web app, please follow the instructions in our [documentation](https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference/web_js).

## Model Data

Data used for model training and how the data was processed.