
Keeps popping error: "/pytorch/torch/csrc/autograd/python_function.cpp:638: UserWarning: Legacy autograd function with non-static forward method is deprecated and will be removed in 1.3. Please use new-style autograd function with static forward method. (Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)" #29

Open
BarCodeReader opened this issue Sep 23, 2019 · 5 comments

Comments

@BarCodeReader

When running the notebook, it keeps popping this error:
/pytorch/torch/csrc/autograd/python_function.cpp:638: UserWarning: Legacy autograd function with non-static forward method is deprecated and will be removed in 1.3. Please use new-style autograd function with static forward method. (Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)

Has anyone else encountered this?
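
For reference, the warning is about the legacy torch.autograd.Function API (functions that are instantiated and called like modules, with a non-static forward). A minimal sketch of the new-style pattern the message asks for, using a made-up scaling function rather than this repo's actual code:

    import torch

    # New-style autograd Function: forward/backward are @staticmethods and
    # the function is invoked via .apply() instead of being instantiated.
    class Scale(torch.autograd.Function):
        @staticmethod
        def forward(ctx, input, factor):
            ctx.factor = factor  # stash non-tensor state on ctx
            return input * factor

        @staticmethod
        def backward(ctx, grad_output):
            # One gradient per forward input; the plain float gets None.
            return grad_output * ctx.factor, None

    x = torch.randn(4, requires_grad=True)
    y = Scale.apply(x, 2.0)   # new-style call site
    y.sum().backward()        # x.grad is now a tensor filled with 2.0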

DocMorg commented Jan 14, 2020

Me too. I'm trying to solve it, but no results yet.

mys007 commented Mar 3, 2020

As a workaround, one can replace

    return ConstantPad1d(target_size, dimension, value, pad_start)(input)

with

    # Flat pad list, two entries per dimension; only the requested side
    # of `dimension` is non-zero.
    pads = [0] * (input.ndim * 2)
    pads[2 * dimension + (1 if pad_start else 0)] = target_size - input.shape[dimension]
    # F.pad orders pad amounts from the last dimension backwards, hence [::-1].
    return torch.nn.functional.pad(input, pads[::-1], mode='constant', value=value)
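
To sanity-check the index math: torch.nn.functional.pad takes pad amounts as (start, end) pairs ordered from the last dimension backwards, which is why the list is built per dimension and then reversed. A quick check, assuming a 3-D input padded at the start of dimension 2:

    import torch

    x = torch.zeros(2, 3, 5)
    dimension, target_size, pad_start, value = 2, 8, True, 0.0

    pads = [0] * (x.ndim * 2)
    pads[2 * dimension + (1 if pad_start else 0)] = target_size - x.shape[dimension]
    out = torch.nn.functional.pad(x, pads[::-1], mode='constant', value=value)
    print(out.shape)  # torch.Size([2, 3, 8])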

@angeloyeo

The response from @mys007 works!

@neko-is-kitty

As far as I recall, I tried this both on a fork of this repo adapted for PyTorch 1.6 and on this repository itself, and the workaround from @mys007 doesn't work for me.

I'm running on Colab because that's the only platform I can use right now. This is frustrating.

drewads commented May 2, 2022

I'm also on Colab and am having an issue with this workaround as well.
