
Trying to resize storage that is not resizable #37

Open
GoodStudyDayUpUp opened this issue Jul 28, 2017 · 8 comments

Comments

@GoodStudyDayUpUp

Hi everyone,

I'm trying to run this script: https://github.com/jzbontar/mc-cnn (I am trying to transform left.bin and right.bin to .png file which can be shown) which produces the following error:

$ ...luajit samples/bin2png.lua
Writing left.png
luajit: ...ocal/torch_update/install/share/lua/5.1/torch/Tensor.lua:462: Trying to resize storage that is not resizable at /usr/local/torch_update/pkg/torch/lib/TH/generic/THStorage.c:183
stack traceback:
[C]: in function 'set'
...ocal/torch_update/install/share/lua/5.1/torch/Tensor.lua:462: in function 'view'
samples/bin2png.lua:9: in main chunk
[C]: at 0x00406670

Does anyone know what is wrong? I found a comment that said:

"""This happens because of this commit torch/torch7#389
I think that the author of the mc-cnn code should update his normalize script to take into account this change in torch"""

If that is the reason, can I update the normalize script (Normalize2.lua?) by myself to make the code work? How?
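For readers unfamiliar with the quoted commit: it made storages constructed directly from a file non-resizable, so a later view/set that needs a different size fails. A loosely analogous situation can be reproduced with a NumPy memory map (file name synthetic, illustration only, not the Torch code itself):

```python
import os
import tempfile

import numpy as np

# Synthetic file, illustration only: write 12 floats, then memory-map them.
path = os.path.join(tempfile.mkdtemp(), 'left.bin')
np.zeros(12, dtype=np.float32).tofile(path)

m = np.memmap(path, dtype=np.float32, mode='r')

# Viewing the buffer with a compatible shape works, much like Tensor:view()
# on a storage whose element count already matches.
ok = m.reshape(3, 4)

# Growing the buffer in place fails: the array does not own its data, which
# mirrors "Trying to resize storage that is not resizable" in Torch.
try:
    m.resize(24, refcheck=False)
    resized = True
except ValueError:
    resized = False
```

The analogy is only about ownership: a buffer backed by a file can be viewed but not resized in place.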

I am new to machine learning and GPU programming! I use Ubuntu 16.04 and a server (on the intranet) with a GeForce GTX TITAN installed. I appreciate any hints! ^_^

@BroadDong

I have also run into the same issue. Have you solved it?

@rohanchabra

rohanchabra commented Aug 3, 2017

I guess recent updates to Torch cause these issues.
A fix for bin2png.lua:9:

s = d * h * w

left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin','r'):binary():readFloat(left:storage())

You might have to do a similar thing in the fromfile(fname) function in main.lua if you want to train the network. Here is my fix:
function fromfile(fname)
   local file = io.open(fname .. '.dim')
   local dim = {}
   for line in file:lines() do
      table.insert(dim, tonumber(line))
   end
   if #dim == 1 and dim[1] == 0 then
      return torch.Tensor()
   end

   local file = io.open(fname .. '.type')
   local type = file:read('*all')

   local d = torch.LongStorage(dim)

   local s = 1
   for i = 1, d:size() do
      s = s * d[i]
   end

   local x
   if type == 'float32' then
      --x = torch.FloatTensor(torch.FloatStorage(fname))
      x = torch.FloatTensor(torch.FloatStorage(s))
      torch.DiskFile(fname, 'r'):binary():readFloat(x:storage())
   elseif type == 'int32' then
      --x = torch.IntTensor(torch.IntStorage(fname))
      x = torch.IntTensor(torch.IntStorage(s))
      torch.DiskFile(fname, 'r'):binary():readInt(x:storage())
   elseif type == 'int64' then
      --x = torch.LongTensor(torch.LongStorage(fname))
      x = torch.LongTensor(torch.LongStorage(s))
      torch.DiskFile(fname, 'r'):binary():readLong(x:storage())
   else
      print(fname, type)
      assert(false)
   end

   --x = x:reshape(torch.LongStorage(dim))
   x = x:reshape(d)
   return x
end
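As a cross-check of the logic above (not part of the repository), the same read-dims/read-type/read-raw flow can be sketched in Python with NumPy. The `.dim`/`.type` sidecar layout is as described in this thread, and the demo file names are synthetic:

```python
import os
import tempfile

import numpy as np

# Map the type strings used by mc-cnn's preprocessing to NumPy dtypes.
DTYPES = {'float32': np.float32, 'int32': np.int32, 'int64': np.int64}

def read_bin(fname):
    """Read fname, given sidecar files fname + '.dim' and fname + '.type'."""
    with open(fname + '.dim') as f:
        dim = [int(line) for line in f if line.strip()]
    if dim == [0]:
        return np.empty(0)
    with open(fname + '.type') as f:
        dtype = DTYPES[f.read().strip()]
    # Equivalent of preallocating a storage of s = prod(dim) elements,
    # filling it from disk, then reshaping to the stored dimensions.
    return np.fromfile(fname, dtype=dtype, count=int(np.prod(dim))).reshape(dim)

# Synthetic demo: a 2 x 3 float32 tensor round-tripped through the format.
path = os.path.join(tempfile.mkdtemp(), 'x.bin')
np.arange(6, dtype=np.float32).tofile(path)
with open(path + '.dim', 'w') as f:
    f.write('2\n3\n')
with open(path + '.type', 'w') as f:
    f.write('float32')
x = read_bin(path)
```

The key point in both versions is the same: allocate (or read) exactly prod(dim) elements first, then reshape, rather than letting a file-backed storage be resized.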

@GoodStudyDayUpUp
Author

GoodStudyDayUpUp commented Aug 4, 2017

@BroadDong Thanks for your reply; unfortunately, I haven't solved the problem yet.
Following @rohanchabra's suggestion, I changed the code. However, the following problem appeared when I tried to run bin2png.lua:

luajit: samples/bin2png.lua:15: unexpected symbol near '='

So I added a "_" before the comma in line 15 of bin2png.lua, as it was originally (i.e. _, left = left:min(2)), but then another problem occurred:

Writing left.png
luajit: samples/bin2png.lua:16: attempt to index global 'left_' (a nil value)
stack traceback:
samples/bin2png.lua:16: in main chunk
[C]: at 0x00406670


@rohanchabra

rohanchabra commented Aug 7, 2017

@GoodStudyDayUpUp Sorry, I might not have given you the whole code that I updated.

s = d * h * w

print('Writing left.png')
left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin','r'):binary():readFloat(left:storage())
left = left:view(1, d, h, w):cuda()

The rest should be the same. Maybe this will help.

@GoodStudyDayUpUp
Author

GoodStudyDayUpUp commented Aug 14, 2017

@rohanchabra Thanks! Actually, last time you did give me the whole code. (I changed both main.lua and bin2png.lua according to your suggestion.)
For main.lua, I made it 100% the same as yours:

function fromfile(fname)
   local file = io.open(fname .. '.dim')
   local dim = {}
   for line in file:lines() do
      table.insert(dim, tonumber(line))
   end
   if #dim == 1 and dim[1] == 0 then
      return torch.Tensor()
   end

   local file = io.open(fname .. '.type')
   local type = file:read('*all')

   local d = torch.LongStorage(dim)

   local s = 1
   for i = 1, d:size() do
      s = s * d[i]
   end

   local x
   if type == 'float32' then
      --x = torch.FloatTensor(torch.FloatStorage(fname))
      x = torch.FloatTensor(torch.FloatStorage(s))
      torch.DiskFile(fname, 'r'):binary():readFloat(x:storage())
   elseif type == 'int32' then
      --x = torch.IntTensor(torch.IntStorage(fname))
      x = torch.IntTensor(torch.IntStorage(s))
      torch.DiskFile(fname, 'r'):binary():readInt(x:storage())
   elseif type == 'int64' then
      --x = torch.LongTensor(torch.LongStorage(fname))
      x = torch.LongTensor(torch.LongStorage(s))
      torch.DiskFile(fname, 'r'):binary():readLong(x:storage())
   else
      print(fname, type)
      assert(false)
   end

   --x = x:reshape(torch.LongStorage(dim))
   x = x:reshape(d)
   return x
end

Then I revised bin2png.lua as:

require 'cutorch'
require 'image'
require 'torch'

d = 70---48
h = 370---512
w = 1226---612

s = d * h * w

print('Writing left.png')
left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin','r'):binary():readFloat(left:storage())
left = left:view(1, d, h, w):cuda()
, left = left:min(2)
image.save('left.png', left_[1]:float():div(d))

print('Writing right.png')
right = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('right.bin','r'):binary():readFloat(right:storage())
right = right:view(1, d, h, w):cuda()
, right = right:min(2)
image.save('right.png', right_[1]:float():div(d))

print('Writing disp.png')
disp = torch.FloatTensor(torch.FloatStorage(h*w))
torch.DiskFile('disp.bin','r'):binary():readFloat(disp:storage())
disp = disp:view(1, 1, h, w)
image.save('disp.png', disp[1]:div(d))

The problem happened:

luajit: samples/bin2png.lua:15: unexpected symbol near '='

So I added a "_" before the comma in lines 15 and 22 myself, changing the code to:

_, left = left:min(2)
_, right = right:min(2)

Then the problem became:

Writing left.png
luajit: samples/bin2png.lua:16: attempt to index global 'left_' (a nil value)
stack traceback:
samples/bin2png.lua:16: in main chunk
[C]: at 0x00406670

I don't know why this error happens on my server.
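For what it's worth: Torch's left:min(2) returns two tensors (the minimum values and their indices), and the repository's original line is most likely `_, left_ = left:min(2)`, with GitHub's Markdown rendering having eaten the paired underscores in the pasted snippets. Writing `_, left = left:min(2)` puts the indices in `left` and leaves `left_` nil, which is exactly the error on line 16. The index-keeping step can be sketched with NumPy (toy sizes, illustration only; NumPy indices are 0-based where Torch's are 1-based):

```python
import numpy as np

# Toy cost volume shaped (1, d, h, w), matching left:view(1, d, h, w).
d, h, w = 4, 2, 3
rng = np.random.default_rng(0)
vol = rng.random((1, d, h, w)).astype(np.float32)

# Torch's min over dim 2 (1-indexed) returns (values, indices); the script
# keeps the indices -- the argmin over disparities -- as the disparity map.
values = vol.min(axis=1)
indices = vol.argmin(axis=1)

# Equivalent of left_[1]:float():div(d): normalize indices into [0, 1).
disp_img = indices[0].astype(np.float32) / d
```

So the fix for the pasted script is to assign the second return value to `left_` (and `right_`), not to `left`.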

@jzbontar
Owner

Hey guys, can you try again? The bin2png.lua script broke when the newer version of torch came out. It should work now.

@GoodStudyDayUpUp
Author

@jzbontar
Hi Jure Zbontar,

Thanks a lot for your update! I have written my own code to transform bin to png, but I will try your updated code later. Now I have run into an interesting problem:
When I use the Middlebury nets (I have tried both net_mb_fast_-a_train_all and net_mb_slow_-a_train_all), the left side of the disparity map is totally white. (I checked disp.bin: the leftmost several columns all have the same value, which is the largest in the whole binary file, and my own 'bin2png' code assigns 255 to these pixels for display in the black-and-white range.) Even more interesting: the number of white columns on the left is exactly disp_max - 1 (disp_max is the disparity range you defined, e.g. ./main.lua kitti fast -a predict -net_fname net/net_kitti_fast_-a_train_all.t7 -left samples/input/kittiL.png -right samples/input/kittiR.png -disp_max 70)! Do you know what is wrong?
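A plausible explanation (my reading, not the author's): for a left-image pixel at column x, the candidate matches in the right image sit at columns x - d for d = 0 .. disp_max - 1, so the first disp_max - 1 columns can never be tested over the full disparity range, and the cost volume is degenerate there. A toy count with the numbers from this thread:

```python
import numpy as np

# Image width and disparity range from the thread's KITTI example.
disp_max, w = 70, 1226

cols = np.arange(w)
# At column x, only disparities d with x - d >= 0 have a right-image match,
# i.e. min(x + 1, disp_max) testable hypotheses.
testable = np.minimum(cols + 1, disp_max)

# Columns that cannot test the full range -- the white band on the left.
partial = int((testable < disp_max).sum())
```

With disp_max = 70 this gives a 69-column band, which matches the observed width of disp_max - 1.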

Best regards!
