
ConstantPad1d Deprecated #44

Open
jaytimbadia opened this issue Mar 3, 2022 · 7 comments

Comments

@jaytimbadia

Hey,

Can you please help?

I am getting a runtime error from this function.

```python
class ConstantPad1d(Function):
    def __init__(self, target_size, dimension=0, value=0, pad_start=False):
        super(ConstantPad1d, self).__init__()
        self.target_size = target_size
        self.dimension = dimension
        self.value = value
        self.pad_start = pad_start

    def forward(self, input):
        self.num_pad = self.target_size - input.size(self.dimension)
        assert self.num_pad >= 0, 'target size has to be greater than input size'

        self.input_size = input.size()

        size = list(input.size())
        size[self.dimension] = self.target_size
        output = input.new(*tuple(size)).fill_(self.value)
        c_output = output

        # crop output
        if self.pad_start:
            c_output = c_output.narrow(self.dimension, self.num_pad, c_output.size(self.dimension) - self.num_pad)
        else:
            c_output = c_output.narrow(self.dimension, 0, c_output.size(self.dimension) - self.num_pad)

        c_output.copy_(input)
        return output

    def backward(self, grad_output):
        grad_input = grad_output.new(*self.input_size).zero_()
        cg_output = grad_output

        # crop grad_output
        if self.pad_start:
            cg_output = cg_output.narrow(self.dimension, self.num_pad, cg_output.size(self.dimension) - self.num_pad)
        else:
            cg_output = cg_output.narrow(self.dimension, 0, cg_output.size(self.dimension) - self.num_pad)

        grad_input.copy_(cg_output)
        return grad_input
```
RuntimeError: Legacy autograd function with non-static forward method is deprecated. Please use new-style autograd function with static forward method. (Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)
Can you please help?

@Timtti

Timtti commented Mar 5, 2022

I've had the same problem.

@Timtti

Timtti commented Mar 5, 2022

Here is another example of a custom autograd Function. I'm currently rewriting this function myself: https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
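For reference, here is a rough sketch of what the padding op could look like as a new-style Function with static methods. This is my own untested adaptation of the code above, not the repo's official fix (the name ConstantPad1dFn is mine):

```python
import torch
from torch.autograd import Function

class ConstantPad1dFn(Function):
    # New-style autograd: forward/backward are @staticmethod, and state is
    # carried on the ctx object instead of on self.
    @staticmethod
    def forward(ctx, input, target_size, dimension=0, value=0, pad_start=False):
        num_pad = target_size - input.size(dimension)
        assert num_pad >= 0, 'target size has to be greater than input size'

        ctx.num_pad = num_pad
        ctx.dimension = dimension
        ctx.pad_start = pad_start
        ctx.input_length = input.size(dimension)

        size = list(input.size())
        size[dimension] = target_size
        output = input.new_full(size, value)

        # copy the input into the non-padded region of the output
        start = num_pad if pad_start else 0
        output.narrow(dimension, start, input.size(dimension)).copy_(input)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        # gradient only flows through the region the input was copied into
        start = ctx.num_pad if ctx.pad_start else 0
        grad_input = grad_output.narrow(ctx.dimension, start, ctx.input_length).contiguous()
        # one return value per forward argument; non-tensor args get None
        return grad_input, None, None, None, None
```

New-style Functions are invoked through apply, e.g. ConstantPad1dFn.apply(x, target_size, ...), with positional arguments.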

@Timtti

Timtti commented Mar 5, 2022

Or maybe everyone encountering this problem should be using this fork instead: https://github.com/Vichoko/pytorch-wavenet
ConstantPad1d is now built into PyTorch:
https://pytorch.org/docs/stable/generated/torch.nn.ConstantPad1d.html
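If all you need is the padding itself, the built-in module covers the common case of padding the last (time) dimension. A quick usage example:

```python
import torch
import torch.nn as nn

x = torch.randn(1, 16, 100)               # (batch, channels, length)
pad_end = nn.ConstantPad1d((0, 24), 0.0)  # (left, right) padding, fill value
y = pad_end(x)
print(y.shape)                            # torch.Size([1, 16, 124])
```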

@MirkoDeVita98

Were you able to solve it? I tried the other branch, but I get the same error.

@YuLong-Liang

YuLong-Liang commented Apr 7, 2022

The problem we are all hitting can be solved with the following steps:

  1. Rewrite the ConstantPad1d class at line 80 of wavenet_modules.py as an nn.Module, i.e. ConstantPad1d(nn.Module) (a sketch follows below).
  2. Comment out the line loss = loss.data[0] at line 73 of wavenet_train.py.

By the way, I ran the wavenet_demo successfully.
My environment is as follows:
torch: 1.8.2+cu111
python: 3.7
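For step 1, a possible rewrite (my own sketch, assuming the class keeps the same constructor signature; it may differ from what the repo ends up with) is to express the padding with F.pad, which is differentiable, so no custom backward and no autograd.Function is needed:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConstantPad1d(nn.Module):
    def __init__(self, target_size, dimension=0, value=0, pad_start=False):
        super().__init__()
        self.target_size = target_size
        self.dimension = dimension
        self.value = value
        self.pad_start = pad_start

    def forward(self, input):
        num_pad = self.target_size - input.size(self.dimension)
        assert num_pad >= 0, 'target size has to be greater than input size'

        # F.pad pads the last dimension first; build a pad spec whose
        # pairs of zeros skip past the dimensions after self.dimension.
        pad = [0, 0] * (input.dim() - self.dimension - 1)
        pad += [num_pad, 0] if self.pad_start else [0, num_pad]
        return F.pad(input, pad, mode='constant', value=self.value)
```

For step 2, loss.data[0] stopped working once losses became 0-dim tensors; loss.item() is the usual replacement if you still want the scalar for logging.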

@YuLong-Liang

> (quoting the original post from @jaytimbadia above)

See my latest comment above.

@YuLong-Liang

> Were you able to solve it? I tried the other branch, but I get the same error.

You can refer to my comment above.
