Using GPU for tflite #205
Comments
Same issue for me.
Hi, I encountered the same problem. I found a suggestion on tensorflow/tensorflow#60720 that resolved the issue for me: I added the following line to AndroidManifest.xml.
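For context, the fix discussed in tensorflow/tensorflow#60720 is a `uses-native-library` declaration that lets the GPU delegate load the device's OpenCL driver on Android 12 and later. The exact line the commenter added is not quoted above, so treat the snippet below as an assumed reconstruction rather than the verbatim fix:

```xml
<!-- Inside the <application> element of AndroidManifest.xml.
     Assumed reconstruction based on tensorflow/tensorflow#60720:
     declare the OpenCL driver so the TFLite GPU delegate can load it
     on Android 12+ (required="false" keeps the app installable on
     devices without this library). -->
<uses-native-library
    android:name="libOpenCL.so"
    android:required="false" />
```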
Hi, I want to use the mobile GPU to run inference on my model.
I followed the documentation and used this code to load the model,
but it logged an error like this.
So I changed GpuDelegateV2() to XNNPackDelegate() and it works.
What is the difference between these two, and does XNNPackDelegate() use the GPU to run inference?
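For comparison, here is a minimal sketch of the two delegate setups, assuming the tflite_flutter Dart API (class names and the asset path `assets/model.tflite` are illustrative and may differ across package versions):

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

// GPU delegate: dispatches supported ops to the mobile GPU
// via the device's OpenCL/OpenGL driver.
Future<Interpreter> loadWithGpu() async {
  final options = InterpreterOptions()..addDelegate(GpuDelegateV2());
  return Interpreter.fromAsset('assets/model.tflite', options: options);
}

// XNNPACK delegate: optimized CPU kernels, not the GPU.
Future<Interpreter> loadWithXnnPack() async {
  final options = InterpreterOptions()..addDelegate(XNNPackDelegate());
  return Interpreter.fromAsset('assets/model.tflite', options: options);
}
```

To the question itself: XNNPACK is a CPU backend, so switching to XNNPackDelegate() runs inference on optimized CPU kernels rather than the GPU, which is also why it sidesteps GPU-driver loading errors like the one above.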