From: Nicholas Jankowski
Subject: Re: Loading a 318 MB matrix into Octave 32 bit
Date: Mon, 29 Jul 2013 07:57:56 -0400
I am running 32-bit Octave on a 64-bit system (I would really like to keep it that way if possible) and I have a .mat file with a double matrix of size 5492743 by 3. Octave says it takes 131,825,832 bytes. When I run the following script (with num=140000) I immediately get "error: memory exhausted or requested size too large for range of Octave's index type -- trying to return to prompt":

    data = [data, zeros(length(data), num*3)];
    for i = 1:length(data)-num
      data(i,:) = [data(i,1:3), vec(data((i+1):(i+num),1:3)')'];
      disp(i)
    end

Can I fix this without recompiling? If I have to recompile, how do I do it?

Thanks,
Elliot Gorokhovsky

Just trying to interpret what you did... you're saying you are loading a "5492743 by 3" double precision array into the variable 'data', then running the above script with num=140000? I get the same error message doing the following:
----------------------------------
octave:18> num=140000; data=...;
octave:19> whos
Variables in the current scope:
  Attr Name        Size            Bytes  Class
  ==== ====        ====            =====  =====
       data    5492743x3       131825832  double
       num           1x1               8  double

Total is 16478230 elements using 131825840 bytes
octave:20> data = [data, zeros(length(data), num*3)];
error: memory exhausted or requested size too large for range of Octave's index type -- trying to return to prompt
----------------------------------

So, I guess we need to look at what's going on in that first line of the script. I played around with it, and on Win7 64-bit running the Windows Octave 3.6.4 32-bit (MinGW version) from the precompiled archive, your script fails for num > 4.
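For a sense of scale, here is a back-of-the-envelope check (my arithmetic from the sizes in the transcript above, not part of the original script): with num=140000, that first line asks for an array whose element count is far beyond what a 32-bit index can address.

```octave
% Elements requested by: data = [data, zeros(length(data), num*3)];
rows = 5492743;          % rows of 'data', from the whos output above
num  = 140000;
cols = 3 + num*3;        % original 3 columns plus num*3 appended zero columns
elements  = rows * cols; % roughly 2.3e12 elements
max_index = 2^31 - 1;    % largest index a 32-bit-index build can handle
printf("%g elements vs. index limit %g\n", elements, max_index)
```

Roughly 2.3e12 elements against a limit of about 2.1e9, which is presumably why the message mentions the range of Octave's index type.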
-----------------------------------
octave:33> clear all; num=4; data=...; data = [data, zeros(length(data), num*3)];
octave:34> whos
Variables in the current scope:
  Attr Name        Size            Bytes  Class
  ==== ====        ====            =====  =====
       data    5492743x15      659129160  double
       num           1x1               8  double

Total is 82391146 elements using 659129168 bytes
-----------------------------------
num=5 throws the error. If I get a chance I'll look into why, but if you play with a much smaller 'data', you'll see that the first line alone grows 'data' from 3 to 3*(num+1) columns, so that script makes 'data' very large very quickly.
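To put rough numbers on "very large very quickly" (again my arithmetic, only cross-checked against the whos output above): every increment of num widens 'data' by three more columns of doubles, and the concatenation needs the old and new copies in memory at the same time.

```octave
% Bytes needed by the widened array for small values of num:
rows = 5492743;
for num = 3:6
  new_bytes = rows * 3*(num+1) * 8;   % 3*(num+1) columns of 8-byte doubles
  printf("num = %d -> %4.0f MB\n", num, new_bytes/2^20)
end
```

num=4 comes to about 629 MB (matching the whos output above) and num=5 to about 754 MB; since a 32-bit process has only ~2 GB of often-fragmented address space, needing a contiguous block that size on top of the existing 131 MB copy is a plausible reason num=5 tips over into "memory exhausted".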
NickJ