This is a "Hello World" for running ONNX models on Android.
The inference class is located in model.py; an example of its use is in main.py.
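Since the gradle dependency pulls in the ONNX Runtime Java package, the usual way to reach it from Kivy is through pyjnius. A minimal sketch of what such an inference call can look like is below; this is an illustration using the public ai.onnxruntime Java API, not the actual contents of model.py, and the model path, input name, and shape are placeholders. It only runs on a device/emulator with the dependency bundled, not on a desktop.

```python
# Sketch: driving the ONNX Runtime Java API from Kivy/Python via pyjnius.
# Assumptions: 'model.onnx' exists on the device and the model has one
# float32 input named 'input' of shape [1, 3, 224, 224].
from jnius import autoclass

OrtEnvironment = autoclass('ai.onnxruntime.OrtEnvironment')
OnnxTensor = autoclass('ai.onnxruntime.OnnxTensor')
FloatBuffer = autoclass('java.nio.FloatBuffer')
HashMap = autoclass('java.util.HashMap')

env = OrtEnvironment.getEnvironment()
session = env.createSession('model.onnx')  # path on the device

# Wrap a flat float32 array in a java.nio.FloatBuffer and attach a shape.
data = FloatBuffer.wrap([0.0] * (1 * 3 * 224 * 224))
tensor = OnnxTensor.createTensor(env, data, [1, 3, 224, 224])

# Run inference; the key must match the model's actual input name.
inputs = HashMap()
inputs.put('input', tensor)
result = session.run(inputs)
```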
You also need to update your buildozer.spec:
requirements = python3, kivy, numpy
android.api = 35
android.gradle_dependencies = com.microsoft.onnxruntime:onnxruntime-android:1.22.0
Performance is generally not great because onnxruntime-android only supports hardware (GPU/NPU) acceleration through NNAPI.
Only a few models are fully NNAPI-compatible, since NNAPI requires statically shaped models (fixed input sizes) and supports only a limited set of operators.
That's why it is often worth converting the model to TFLite and running it with LiteRT instead.
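One commonly used tool for that conversion is onnx2tf (an assumption on my part; other converters exist, and depending on the model you may need extra flags or manual graph surgery):

```shell
# Sketch: convert an ONNX model to TFLite with the onnx2tf tool.
# Assumes onnx2tf and its TensorFlow dependency install cleanly,
# and that 'model.onnx' is your model.
pip install onnx2tf
onnx2tf -i model.onnx -o tflite_out   # writes .tflite files into tflite_out/
```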
PRs welcome :)
MIT © Enno A