Reset a model parameter value regularly during training #19788
anupsingh15 started this conversation in General · Replies: 0 comments
I have created a model in the following way. The model mimics the straight-through gradient estimation proposed in VQ-VAE, with the difference that I reset the `codewords` parameter after every epoch, for some training epochs only. The parameter gets updated with every training step, but I want to reset its values before the next training epoch starts, and repeat this process for those epochs.

The way I do it right now is to overwrite its `data` attribute (in the `run_kmeans` method) as shown below, but afterwards its gradient is `None`:

```python
self.codewords.data = torch.from_numpy(self.kmeans.centroids).to(device).detach().clone()
```

Could someone please let me know what the correct implementation is?
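For illustration, here is a minimal sketch (not the author's actual model; the class name `VQ`, the method `reset_codewords`, and all shapes are hypothetical) of one common pattern: copy the new centroids into the parameter in place under `torch.no_grad()`, so `codewords` stays the same leaf `nn.Parameter` that the optimizer and autograd already track. With the usual VQ-VAE codebook loss, its gradient is then populated again on the next backward pass:

```python
import torch
import torch.nn as nn


class VQ(nn.Module):
    """Minimal VQ layer with a straight-through estimator and a resettable codebook."""

    def __init__(self, num_codes=8, dim=4):
        super().__init__()
        self.codewords = nn.Parameter(torch.randn(num_codes, dim))

    def forward(self, z):
        # Nearest-codeword lookup: (batch, num_codes) distance matrix.
        d = torch.cdist(z, self.codewords)
        idx = d.argmin(dim=1)
        q = self.codewords[idx]
        # Codebook loss: pulls selected codewords toward the (detached) encoder
        # outputs. This is the term through which `codewords` receives gradients.
        codebook_loss = torch.mean((q - z.detach()) ** 2)
        # Straight-through estimator: forward uses q, backward passes grads to z.
        st = z + (q - z).detach()
        return st, codebook_loss

    def reset_codewords(self, new_centroids):
        # In-place copy under no_grad: `codewords` remains the same leaf
        # Parameter object, so optimizer state and autograd tracking survive
        # the reset (unlike rebinding `.data` to a new tensor).
        with torch.no_grad():
            self.codewords.copy_(
                torch.as_tensor(
                    new_centroids,
                    dtype=self.codewords.dtype,
                    device=self.codewords.device,
                )
            )
```

A typical use would be to call `reset_codewords(kmeans_centroids)` at the start of each of the early epochs; gradients for `codewords` then flow normally through `codebook_loss` during that epoch's training steps.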