
Hello — in the Dice_loss function defined in the unet_training code, what does temp_target[..., :-1] mean? What is the slice indexing? And what does the axis argument in the torch.sum calls mean? #88

Open
BaronDuan opened this issue Dec 29, 2023 · 0 comments

Comments

@BaronDuan

```python
import torch
import torch.nn.functional as F

def Dice_loss(inputs, target, beta=1, smooth=1e-5):
    n, c, h, w = inputs.size()
    nt, ht, wt, ct = target.size()
    if h != ht and w != wt:
        inputs = F.interpolate(inputs, size=(ht, wt), mode="bilinear", align_corners=True)

    # (n, c, h, w) -> (n, h*w, c), then softmax over the class channel
    temp_inputs = torch.softmax(inputs.permute(0, 2, 3, 1).contiguous().view(n, -1, c), -1)
    temp_target = target.view(n, -1, ct)

    # compute the dice loss
    tp = torch.sum(temp_target[..., :-1] * temp_inputs, axis=[0, 1])
    fp = torch.sum(temp_inputs, axis=[0, 1]) - tp
    fn = torch.sum(temp_target[..., :-1], axis=[0, 1]) - tp

    score = ((1 + beta ** 2) * tp + smooth) / ((1 + beta ** 2) * tp + beta ** 2 * fn + fp + smooth)
    dice_loss = 1 - torch.mean(score)
    return dice_loss
```
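For anyone landing on this issue: both pieces of syntax are standard tensor indexing and reduction, not anything specific to this repo. `temp_target[..., :-1]` keeps every axis unchanged except the last one (the channel axis of size `ct`) and drops its final entry — presumably the extra channel the data loader appends beyond the `num_classes` one-hot channels, so the slice leaves only the class channels that line up with `temp_inputs`. `axis=[0, 1]` (an alias for `dim` in `torch.sum`) reduces over the batch axis and the flattened-pixel axis at once, leaving one sum per class. A minimal sketch using NumPy, whose `...`/`:-1` slicing and `axis` semantics match the PyTorch calls above (the shapes here are made up for illustration):

```python
import numpy as np

# A toy "target" tensor shaped (n, h*w, ct): batch of 2, 3 pixels, 4 channels.
target = np.arange(24).reshape(2, 3, 4)

# `...` means "all leading axes"; `:-1` drops the last entry of the final axis.
# Equivalent to target[:, :, :-1] here: shape goes (2, 3, 4) -> (2, 3, 3).
sliced = target[..., :-1]
print(sliced.shape)          # (2, 3, 3)

# axis=(0, 1) sums over the batch axis (0) and the pixel axis (1) together,
# leaving one number per remaining channel — i.e. a per-class total.
per_class = np.sum(sliced, axis=(0, 1))
print(per_class.shape)       # (3,)
print(per_class)             # [60 66 72]
```

So in the loss, `tp`, `fp`, and `fn` each end up as a vector with one entry per foreground class, and the final `torch.mean(score)` averages the Dice score across those classes.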