PatternAttribution for iNNvestigate 2.0 version #282

Open
moritzaugustin opened this issue Aug 9, 2022 · 6 comments

Comments

@moritzaugustin

The great method "PatternAttribution" was available in the previous iNNvestigate version (for TensorFlow 1) but is no longer available in the current version for TensorFlow 2.

I wonder whether it is planned to implement the method again, and if so, when it will be available? :-)

The README.md states: "pattern.net: PatternNet estimates the input signal of the output neuron. (Note: not available in iNNvestigate 2.0)"

@adrhill (Collaborator) commented Aug 10, 2022

Hi Moritz,

I currently don't have the bandwidth to implement "new" features, but I'd welcome contributions.
To check the correctness of a new implementation, data for reference tests can be generated via https://github.com/adrhill/test-data-innvestigate.

I'll leave this issue open as a feature request.

Note: For users who have the option to use TensorFlow 1, PatternAttribution and PatternNet are still available by installing iNNvestigate 1.0.8.
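
For reference, a minimal sketch of the fit-then-analyze workflow PatternAttribution used under the iNNvestigate 1.x API (analyzer names and calls as I remember them from the 1.x examples; `model`, `x_train`, and `x` are placeholders for your own Keras model and data):

```python
# Sketch of PatternAttribution with iNNvestigate 1.0.8 on TF1/Keras.
# Assumptions: `model` is a trained Keras classifier, `x_train`/`x` are numpy arrays.
import innvestigate
import innvestigate.utils as iutils

# Analyses are typically run on the pre-softmax output.
model_wo_softmax = iutils.model_wo_softmax(model)

# "pattern.attribution" (like "pattern.net") is a trainable analyzer:
# the patterns must first be fitted on training data.
analyzer = innvestigate.create_analyzer("pattern.attribution", model_wo_softmax)
analyzer.fit(x_train, batch_size=256, verbose=1)

# One attribution map per input sample.
attributions = analyzer.analyze(x)
```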

@mg97tud commented Aug 22, 2022

I am using tensorflow-directml 1.15 (to use my AMD GPU) and I can use iNNvestigate 2.0 without problems. But when I want to use iNNvestigate 1.0.8.3, I get the error "module 'tensorflow._api.v1.compat.v2' has no attribute 'internal'". Is this because iNNvestigate 1.0.8.3 was written and tested for TensorFlow 1.12? Sadly, I can't downgrade from 1.15.

@adrhill (Collaborator) commented Aug 22, 2022

Yes exactly, iNNvestigate 2.0.0 is a breaking release that switched from TF1 to TF2, dropping TF1 compatibility.
I'm not familiar with tensorflow-directml, but it would have to be compatible with TF1 for iNNvestigate 1.0.8 to work.

@mg97tud commented Aug 22, 2022

That's what's bugging me. TensorFlow 1.15 is the latest TensorFlow 1 version, right? tensorflow-directml is built on TF 1.15.
I can use iNNvestigate 2.0 with my 1.15 TensorFlow, but not iNNvestigate 1.0.8.

@adrhill (Collaborator) commented Aug 22, 2022

The latest stable release of TensorFlow by Google is 2.9.1. The major version number is what people refer to when they talk about TF1 and TF2.

The latest release of tensorflow-directml is 1.15.7. tensorflow-directml is not the "official" TensorFlow, but a fork by Microsoft. Since iNNvestigate builds upon Google's TensorFlow, our tests and CI don't guarantee any functionality for forks.

Since I'm not familiar with tensorflow-directml, I'm not able to tell you how compatible it is with packages like iNNvestigate that use internals of Google's TensorFlow 1 and 2. Maybe the tensorflow-directml FAQ has an answer to your question.
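
If it helps with debugging, a quick way to confirm which TensorFlow distribution and version a Python environment actually imports (standard TensorFlow attributes, nothing iNNvestigate-specific):

```python
# Print the version and install location of whichever TensorFlow gets imported.
import tensorflow as tf

print(tf.__version__)  # e.g. "1.15.7" for tensorflow-directml, "2.9.1" for current TF
print(tf.__file__)     # the path reveals which installed distribution is being used
```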

@adrhill (Collaborator) commented Aug 22, 2022

In case it helps: previous releases of iNNvestigate (1.0.8, 1.0.9) used Keras 2.2.4 and TensorFlow 1.12. The old README stated:

> Currently only the Tensorflow backend is supported. We test with Python 3.6, Tensorflow 1.12 and Cuda 9.x.

and the dependencies were pinned to "keras==2.2.4".

For iNNvestigate 2.0 we then jumped over all TensorFlow versions between 1.12 and ^2.6.
