Optimal way to handle big data table ?
From: CdeMills
Subject: Optimal way to handle big data table ?
Date: Thu, 7 Mar 2013 02:07:19 -0800 (PST)
Hello,
I was recently doing lamp spectrum analysis to extract photometric
properties. This involves computing the integral of the spectrum weighted by
the CIE 1931 sensitivity functions; they are tabulated at 400 wavelengths,
with 4 values per wavelength. What is the best way to use these data inside a
function?
1) Encode them inside the function body? It will then be compiled only once
(see the sketch after this list).
CIE31Table = [360 0.000130 0.000004 0.000606
              361 0.000146 0.000004 0.000681
              362 0.000164 0.000005 0.000765 ... ];
2) Read them from a text file?
3) Read them from a binary file?
4) Other?
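
To make option 1 concrete, here is roughly what I have in mind (the function
name, the interp1 call and the persistent caching are just for illustration;
only the first three rows of the table are shown):

function y = cie1931_weight (lambda)
  ## Keep the table in a persistent variable so the matrix literal is only
  ## evaluated on the first call of the session.
  persistent CIE31Table
  if (isempty (CIE31Table))
    CIE31Table = [360 0.000130 0.000004 0.000606;
                  361 0.000146 0.000004 0.000681;
                  362 0.000164 0.000005 0.000765];  ## ... remaining rows
  endif
  ## Interpolate the three sensitivity curves at the requested wavelengths.
  y = interp1 (CIE31Table(:,1), CIE31Table(:,2:4), lambda);
endfunction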
The point is to minimise the computational load of refilling this matrix with
its 1600 entries each time the function is called.
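
For comparison, option 3 would be along these lines, assuming the table has
been saved once with "save -binary cie31table.mat CIE31Table" (the file name
is just a placeholder) and is then cached in a persistent variable:

function T = cie31_table ()
  persistent CIE31Table
  if (isempty (CIE31Table))
    ## Load the binary file only on the first call; later calls reuse the
    ## cached copy.
    S = load ("cie31table.mat");
    CIE31Table = S.CIE31Table;
  endif
  T = CIE31Table;
endfunction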
Regards
Pascal