Re: csvread
From: c.
Subject: Re: csvread
Date: Thu, 1 Nov 2012 14:11:25 +0100
On 1 Nov 2012, at 09:37, address@hidden wrote:
> I am trying to read a CSV file with about 25M data points (250,000 rows, 100
> columns). I am on a MacBook Pro (OS X 10.8, 8 GB RAM, 2.3 GHz i7), and I used
> csvread('file'). The process has been running for 1.5 days; it is currently
> using 2.5 GB of RAM. The file is 230 MB.
>
> This seems too slow. I didn't preallocate a matrix of zeros before running the
> process. Also, I now have about 1 GB of RAM left.
>
> Can someone give me insight into what's happening? If I interrupt the process,
> will it keep the information that is already loaded, or will I lose
> everything? Should I start quitting other processes to free up RAM?
>
> Thanks,
> Matt
I tried creating a file with the same amount of data (though it ended up as a much
larger file) and loading it in Octave 3.7+ on OS X 10.6.8:
>> outdata = randn (250e3, 100);
>> csvwrite ('csvtest.csv', outdata);
>> tic, indata = csvread ('csvtest.csv'); toc
Elapsed time is 184.5596 seconds.
>> norm (indata - outdata, inf)
ans = 1.5730e-14
>> ls -larth csvtest.csv
-rw-r--r-- 1 carlo staff 457M 1 Nov 14:02 csvtest.csv
>>
This took me about 3 minutes, so I really doubt your process is doing anything
useful; I guess it's probably stuck ...
c.
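
A quick way to check whether such a read is making any progress is to time a
partial read and extrapolate. A minimal sketch, assuming the file is called
'file.csv' (a hypothetical name), contains only numeric fields, and has
250,000 rows by 100 columns; the range argument that csvread passes through to
dlmread is zero-based, and the extrapolation is only a rough linear estimate:

% time a partial read of the first 10,000 rows, all 100 columns
% (zero-based range [R0 C0 R1 C1])
tic;
sample = csvread ('file.csv', [0, 0, 9999, 99]);
t = toc;
% rough linear extrapolation to the full 250,000 rows
printf ("partial read: %.1f s, estimated full read: %.1f s\n", t, t * 25);

If even this partial read crawls, the slowdown is probably not csvread itself
but something else, e.g. the machine swapping once only ~1 GB of RAM is free.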