Rebuild the Model Explorer backend to get the full weights values #82
So I built ai_edge_model_explorer_adapter 0.1.0 myself. I also have ai_edge_model_explorer_adapter 0.1.0 installed from a .whl file downloaded from the internet. Is this problem because of the glibc version?
Hi @tsantosh1098, can you share your modifications to direct_flatbuffer_to_json_graph_convert.cc, so that we may match your changes and test this out? Thanks for your help.
Hi @tsantosh1098, are you sure you need your changes? Have you tried adjusting the maximum element count for constant tensor values in the settings? If you enter -1 there, it will allow you to see unlimited tensor values. Let me know if you have any more questions.
Hi @pkgoogle, thanks for the information. I was able to get the full weights by setting config.const_element_count_limit = -1 in the builtin adapter files. Thanks once again.
Hi @tsantosh1098, just to be clear, the UI solution also works, right? (We'll need to add it to the wiki, so I wanted to verify.) Also, if you have no more open items regarding this issue, please feel free to close it as completed. Thanks!
Marking this issue as stale since it has been open for 7 days with no activity. This issue will be closed if no further activity occurs. |
This issue was closed because it has been inactive for 14 days. Please post a new issue if you need further assistance. Thanks! |
I am trying to modify the ConvertFlatbufferDirectlyToJson() part of the direct_flatbuffer_to_json_graph_convert.cc file.
To rebuild the builtin adapters I followed these steps: https://github.com/google-ai-edge/model-explorer/tree/main/src/builtin-adapter. But after generating the .so file, I am no longer able to load the extension module ".builtin_tflite_mlir_adapter":
```
! Failed to load extension module ".builtin_tflite_mlir_adapter":
/home/armnn/Documents/mcw/model_explorer/lib/python3.12/site-packages/ai_edge_model_explorer_adapter/_pywrap_convert_wrapper.so: undefined symbol:
_ZN10tensorflow11CSRMatMulOpIN5Eigen16ThreadPoolDeviceEfEC2EPNS_20OpKernelConstructionE
```
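The mangled name already identifies what is missing: in the Itanium C++ ABI, a `_ZN` nested name is a sequence of length-prefixed components, so the leading namespaces and class name can be read off directly. A minimal sketch (a standalone helper written for this thread, not part of any toolchain) that extracts those leading components from the symbol above:

```python
import re


def leading_components(mangled):
    """Read the length-prefixed components after an Itanium `_ZN` prefix.

    Stops at the first non-digit marker (e.g. template arguments),
    which is enough to see which class the symbol belongs to.
    """
    if not mangled.startswith("_ZN"):
        raise ValueError("not an Itanium-mangled nested name")
    pos, parts = 3, []
    while True:
        m = re.match(r"\d+", mangled[pos:])
        if not m:
            break
        length = int(m.group())
        pos += len(m.group())
        parts.append(mangled[pos:pos + length])
        pos += length
    return parts


sym = "_ZN10tensorflow11CSRMatMulOpIN5Eigen16ThreadPoolDeviceEfEC2EPNS_20OpKernelConstructionE"
print(leading_components(sym))  # ['tensorflow', 'CSRMatMulOp']
```

So the loader is looking for a constructor of `tensorflow::CSRMatMulOp<...>`, a TensorFlow sparse-matmul kernel that the rebuilt .so expects but cannot resolve at load time.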
When I check the path of the .so file, it is present in the expected directory.
Currently, I am building this on an Ubuntu 24 machine with Bazel 6.5.0, on an Intel(R) Core(TM) i7-8665U CPU @ 1.90GHz.
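One way to reproduce the failure outside Model Explorer is to dlopen the shared object directly with Python's `ctypes`; an unresolved symbol surfaces as an `OSError` with the same message. A sketch (the `try_load` helper is written for this thread; the commented path is the one from the error above):

```python
import ctypes


def try_load(path):
    """Attempt to dlopen a shared object; return the loader error string, or None on success."""
    try:
        ctypes.CDLL(path)
        return None
    except OSError as err:
        return str(err)  # e.g. "... undefined symbol: _ZN10tensorflow..."


# err = try_load("/home/armnn/Documents/mcw/model_explorer/lib/python3.12/"
#                "site-packages/ai_edge_model_explorer_adapter/_pywrap_convert_wrapper.so")
print(try_load("/nonexistent/lib.so"))  # prints the loader's error message
```

If this reproduces the undefined-symbol error, the problem is in how the .so was linked (a TensorFlow dependency mismatch in the build), not in how Model Explorer discovers the file.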