Running main.lua for training gives reshape function exception #35
Comments
I changed the code just like you did: `for i = 1,#dim do ... x = torch.FloatTensor(torch.FloatStorage(s))`, then ran `./main.lua kitti slow -a train_tr`.
I have the same problem. Do you have a solution for this?
You should give `s` an initial value, like this: `s = 1` before `for i = 1,#dim do`.
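Spelled out, the suggested fix is to initialize `s` before the loop that accumulates the element count (a sketch; the `dim` values below are illustrative, not from the project):

```lua
-- Illustrative dimensions; in main.lua these come from the .dim file.
local dim = {2, 3, 4}

-- Initialize s BEFORE the loop, then accumulate the element count.
-- Without the initialization, s is nil (or resets each iteration if
-- the assignment sits inside the loop) and the multiply fails.
local s = 1
for i = 1, #dim do
   s = s * dim[i]
end

print(s)  -- 24: total number of elements for a 2x3x4 tensor
```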
It works. Thank you 👍
Is your problem fixed? Mine still fails at the reshape that comes after the type branches:
if type == 'float32' then
...
elseif type == 'int64' then
...
else
...
end
x = x:reshape(torch.LongStorage(dim))
The initialization of variable `s` should be outside of the for loop. If it is inside, `s` resets to value 1 on every repeat. Compute the size once, before the type branches:
if type == 'float32' then
...
elseif type == 'int32' then
...
elseif type == 'int64' then
...
else
...
I am able to run the test set correctly, but training gives me the following error. The images are in the correct folder. What might be the problem?
./main.lua kitti slow -a train_tr
luajit: ./main.lua:378: inconsistent tensor size, expected tensor [389 x 1 x 350 x 1242] and src [] to have the same number of elements, but got 169098300 and 0 elements respectively at /home/rohan140290/torch/pkg/torch/lib/TH/generic/THTensorCopy.c:86
stack traceback:
  [C]: in function 'reshape'
  ./main.lua:378: in function 'fromfile'
  ./main.lua:428: in main chunk
  [C]: at 0x00405d50
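As a sanity check on the error message, the expected element count is exactly the product of the reported tensor dimensions, while the empty `src` storage contributes 0 elements (a small illustrative computation):

```lua
-- Dimensions from the error message: [389 x 1 x 350 x 1242]
local dim = {389, 1, 350, 1242}
local s = 1
for i = 1, #dim do
   s = s * dim[i]
end
print(s)  -- 169098300, the element count the reshape expected
-- The src tensor had 0 elements (empty storage), which is why the
-- copy inside reshape fails with mismatched element counts.
```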
It seems like torch is not able to parse the file correctly. The line
x = torch.FloatTensor(torch.FloatStorage(fname))
seems to have issues. I instead tried to do this:
for i = 1,#dim do
   s = s * dim[i]
end
x = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile(fname,'r'):binary():readFloat(x:storage())
This seems to work for me.
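Putting the pieces of this thread together, a sketch of the full workaround for the float32 case, including the `s = 1` initialization the other comments point out (untested here; assumes `fname` names the `.bin` file and `dim` holds the dimensions read from the matching `.dim` file, as in `fromfile`):

```lua
require 'torch'

-- Initialize s outside the loop so it does not reset every iteration,
-- then accumulate the total element count (e.g. 389 * 1 * 350 * 1242).
local s = 1
for i = 1, #dim do
   s = s * dim[i]
end

-- Allocate a storage of the right size and read the raw floats into it.
local x = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile(fname, 'r'):binary():readFloat(x:storage())

-- Finally reshape the flat tensor to the recorded dimensions.
x = x:reshape(torch.LongStorage(dim))
```

The other type branches (`int32`, `int64`) would follow the same pattern with `IntStorage`/`readInt` and `LongStorage`/`readLong`, reusing the same `s`.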