about softmax_loss layer #3
Comments
Yes, it happens if all voxels within the patch have a zero weight or are labelled with the ignore value. If you use the weight image, they should be non-zero somewhere.
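A minimal sketch of the failure mode described above, assuming a weighted softmax loss that normalizes by the total per-pixel weight (this is an illustration in NumPy, not the repository's actual C++ implementation; `IGNORE_LABEL` and the function name are hypothetical):

```python
import numpy as np

IGNORE_LABEL = 255  # hypothetical ignore value; pixels with this label get weight 0


def weighted_softmax_loss(scores, labels, weights):
    """scores: (N, K) class scores, labels: (N,), weights: (N,) per-pixel weights."""
    valid = labels != IGNORE_LABEL
    w = np.where(valid, weights, 0.0)
    total_w = w.sum()
    if total_w == 0:
        # this is the situation behind "sum of pixel-wise loss weights is zero":
        # every pixel is either ignored or zero-weighted, so the normalizer vanishes
        raise ValueError("sum of pixel-wise loss weights is zero")
    # numerically stable log-softmax
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(labels)), np.where(valid, labels, 0)]
    return (w * nll).sum() / total_w
```

With uniform scores the loss is log(K); with an all-ignored patch the weight sum is zero and the error fires.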
But if we have two classes to classify, a training patch sometimes contains only class 1, so the weights for class 2 will be zero. Because the samples are imbalanced, it is not possible to make every training patch contain both class labels at the same time.
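A patch containing only one class is not a problem by itself, as long as some pixel still carries a non-zero weight. A hedged sketch of one common way to build weights from the labels actually present (the function name is illustrative, not from this repository):

```python
import numpy as np


def class_balanced_weights(labels, num_classes):
    """Inverse-frequency weights for the classes present in this patch.
    Absent classes contribute no pixels, which is fine: the weight sum
    over the patch stays non-zero, so the loss normalizer is valid."""
    counts = np.bincount(labels.ravel(), minlength=num_classes)
    weights = np.zeros(num_classes)
    present = counts > 0
    weights[present] = labels.size / (present.sum() * counts[present])
    return weights[labels]


labels = np.zeros((4, 4), dtype=int)  # a patch containing only class 0
w = class_balanced_weights(labels, num_classes=2)
assert w.sum() > 0  # patch is still trainable; class 1 simply has no pixels here
```

Patches whose total weight does come out as zero (e.g. fully ignore-labelled) are typically skipped or re-sampled rather than fed to the loss.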
@qinhaifangpku Could you solve the
Hi,
Thank you very much for releasing this code; it has been very helpful for my research. I do have a question about the softmax_loss layer, though: in some situations the batch data contains only C classes, where C < K and K is the total number of classes in the network, and the layer logs the error: "sum of pixel-wise loss weights is zero".
Has this ever happened to you? Does it affect the results?
Thank you very much in advance!