Description:
I am computing the N-dimensional integral of a multivariate normal distribution and its gradients with respect to the integration limits. However, I observed that the gradient values differ across dimensions, which seems strange. Below are the steps and code snippets demonstrating the issue.
Steps to Reproduce:
Compute the integral and gradients using the provided code.
Compare the results with finite difference approximations.
When I calculate the same derivatives with finite differences, all values are equal, as expected. We can see that the finite-difference values are close to the last two entries of delta_limit.
Additionally, if cov_matrix is an identity matrix, we do not observe this behavior: the gradient values are equal across dimensions and close to the finite-difference values.
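For the identity-covariance case the claim can be checked in closed form, since the MVN integral then factorizes into a product of univariate normal CDFs, which PyTorch can differentiate exactly. A minimal sketch of that sanity check (pure PyTorch, independent of MVNXPB; N_dim and the common upper limit are arbitrary choices, not values from the original snippet):

```python
import torch

N_dim = 4
# Common upper integration limit for every dimension.
upper = torch.full((N_dim,), 1.0, dtype=torch.float64, requires_grad=True)

# With an identity covariance, the MVN integral over (-inf, upper]
# factorizes into a product of univariate standard-normal CDFs.
normal = torch.distributions.Normal(0.0, 1.0)
prob = normal.cdf(upper).prod()
prob.backward()
grad_autograd = upper.grad.clone()

# Central finite differences on the same quantity.
eps = 1e-6
grad_fd = torch.empty(N_dim, dtype=torch.float64)
with torch.no_grad():
    for i in range(N_dim):
        up = upper.detach().clone()
        dn = upper.detach().clone()
        up[i] += eps
        dn[i] -= eps
        grad_fd[i] = (normal.cdf(up).prod() - normal.cdf(dn).prod()) / (2 * eps)

# Identical limits + identity covariance: all gradient components
# should coincide and agree with the finite differences.
print(torch.allclose(grad_autograd, grad_fd, atol=1e-6))
```

With a correlated cov_matrix there is no such factorization, which is where MVNXPB's approximation (and the dimension-dependent discrepancy reported above) enters.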
Hmm, interesting, thanks for flagging this. I'm not entirely sure what causes this, but it's got to be something in the implementation weeds. One thing to note is that MVNXPB only approximates the integral, so the gradients are also approximate; depending on the order in which the bivariate conditioning goes through the dimensions, I can see those errors being different for different dimensions.
I would assume that the errors here grow larger as the covariance gets more and more singular; I'm curious whether you've observed that in practice. E.g., what happens to the size of the errors as you let diag_val in cov_matrix = (0.5 * torch.ones((N_dim, N_dim), dtype=dtype)).fill_diagonal_(diag_val) approach 0.5?
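One way to quantify "approaching singular" in that experiment is to watch the smallest eigenvalue (or condition number) of cov_matrix as diag_val shrinks toward 0.5. A minimal sketch, reusing the cov_matrix construction quoted above (the particular diag_val values and N_dim = 4 are arbitrary choices for illustration):

```python
import torch

dtype = torch.float64
N_dim = 4

# cov_matrix from the issue: 0.5 off the diagonal, diag_val on it.
# Its eigenvalues are diag_val - 0.5 (multiplicity N_dim - 1) and
# diag_val - 0.5 + 0.5 * N_dim, so it becomes singular at diag_val = 0.5.
for diag_val in (2.0, 1.0, 0.75, 0.55, 0.501):
    cov_matrix = (0.5 * torch.ones((N_dim, N_dim), dtype=dtype)).fill_diagonal_(diag_val)
    eigvals = torch.linalg.eigvalsh(cov_matrix)
    cond = (eigvals.max() / eigvals.min()).item()
    print(f"diag_val={diag_val}: min eig={eigvals.min().item():.4g}, cond={cond:.4g}")
```

Plotting the gradient errors against this condition number would show directly whether they blow up as the covariance degenerates.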
Question:
Is this a bug or is there an explanation for this behavior?