
[BUG] FlopsProfiler upsample flops compute bug #5537

Open
xgbj opened this issue May 15, 2024 · 0 comments
Labels: bug (Something isn't working), training

Comments

xgbj commented May 15, 2024

Describe the bug
The upsample flops compute code:
```python
def _upsample_flops_compute(*args, **kwargs):
    input = args[0]  # bound earlier in the full profiler function; shown here for context

    scale_factor = kwargs.get('scale_factor', None)
    if scale_factor is None and len(args) > 2:
        scale_factor = args[2]
    assert scale_factor is not None, "either size or scale_factor should be defined"

    flops = input.numel()
    if isinstance(scale_factor, tuple) and len(scale_factor) == len(input):
        flops *= int(_prod(scale_factor))
    else:
        flops *= scale_factor**len(input)
    return flops, 0
```
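
For context, `len(tensor)` in PyTorch returns the size of the first dimension (`tensor.shape[0]`), not the number of dimensions, so both the tuple check and the scalar exponent use the wrong count. A quick illustration with a hypothetical shape:

```python
import torch

x = torch.randn(2, 3, 16, 16)  # hypothetical (N, C, H, W) input

print(len(x))         # 2 -> size of dim 0 (the batch size), what the code uses
print(len(x.size()))  # 4 -> number of dimensions, what the check intends
```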

Proposed fixes:

`len(scale_factor) == len(input)` -> `len(scale_factor) == len(input.size())`
`flops *= scale_factor ** len(input)` -> `flops *= scale_factor ** (len(input.size()) - 1)`

For comparison, the upsample flops computation in torchstat: https://github.com/Swall0w/torchstat/blob/master/torchstat/compute_flops.py#L83
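
Putting the proposed changes together gives something like the sketch below. This is non-authoritative: `input = args[0]` and the `_prod` helper are assumed from the surrounding profiler source, and the `size` branch of the real function is omitted as in the quote above.

```python
def _prod(dims):
    """Product of an iterable of ints (stand-in for the profiler's helper)."""
    p = 1
    for d in dims:
        p *= d
    return p


def _upsample_flops_compute(*args, **kwargs):
    input = args[0]

    scale_factor = kwargs.get('scale_factor', None)
    if scale_factor is None and len(args) > 2:
        scale_factor = args[2]
    assert scale_factor is not None, "either size or scale_factor should be defined"

    # FLOPs are estimated as the number of output elements
    flops = input.numel()
    if isinstance(scale_factor, tuple) and len(scale_factor) == len(input.size()):
        # one scale factor per dimension: output elements = prod(scale) * input elements
        flops *= int(_prod(scale_factor))
    else:
        # scalar scale factor raised to the number of non-batch dimensions,
        # per the exponent proposed in this issue
        flops *= scale_factor ** (len(input.size()) - 1)
    return flops, 0
```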
