Matlab loads part of huge file
Good day to all.
I'm loading a huge file of data. The first row contains station names and the first column contains dates; the data values are of type double. The CSV file I'm loading has a total of 35065 rows and 5000 columns. I use the following code:
TestData = importdata(filename, ';');
vars = fieldnames(TestData);
for i = 1:length(vars)
    assignin('base', vars{i}, TestData.(vars{i}));
end
MATLAB loads only 17504 rows and ignores the rest. I'm working on a relatively powerful Linux machine with an i7 processor and 24 GB of memory.
Does anyone know the nature of the problem here? Is it a MATLAB limitation? Can I do something to fix it?
Thank you all!
1 Comment
Stephen23
on 19 Sep 2014
Do not create or use variable names dynamically like this. Use a cell array or structure instead.
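A minimal sketch of that approach, using the struct that importdata already returns in the question's code (the field names match the importdata output shown in the answer below; the station name 'S001' is a made-up placeholder):

```matlab
% Keep everything inside the struct returned by importdata instead of
% pushing each field into the base workspace with assignin:
TestData = importdata(filename, ';');

M       = TestData.data;        % numeric block (rows x stations)
headers = TestData.colheaders;  % station names from the first row

% Access a column by index, or look it up by station name.
% 'S001' is a placeholder for one of your real station names:
firstStation = M(:, 1);
wanted       = M(:, strcmp(headers, 'S001'));
```

This keeps the data addressable in loops and functions, which dynamically created variable names make impossible.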
Answers (2)
per isakson
on 19 Sep 2014
Edited: per isakson
on 19 Sep 2014
My guess is that row number 17504 of your file contains something that causes importdata to stop reading.
A little experiment
>> importdata( 'cssm.txt' )
ans =
data: [4x5 double]
textdata: {'A' 'B' 'C' 'D' 'E'}
colheaders: {'A' 'B' 'C' 'D' 'E'}
>> ans.data
ans =
17 24 1 8 15
23 5 7 14 16
4 6 13 20 22
10 12 NaN NaN NaN
>>
where cssm.txt contains
A B C D E
17 24 1 8 15
23 5 7 14 16
4 6 13 20 22
10 12 BBB 21 3
11 18 25 2 9
"BBB" in the 4th data row causes importdata to
- stop reading,
- pad that row with NaN, and
- skip the remaining rows.
What is worse, importdata does this without warning the user.
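One way to confirm this guess without opening a multi-gigabyte file in an editor is to scan it line by line and report the first row whose fields do not all parse as numbers. This is an untested sketch; it assumes a ';' delimiter, one header row, and dates in the first column, as described in the question:

```matlab
fid = fopen(filename, 'r');
fgetl(fid);                           % skip the header row (station names)
row = 1;                              % count the header as row 1
while true
    line = fgetl(fid);
    if ~ischar(line), break, end      % fgetl returns -1 at end of file
    row = row + 1;
    fields = regexp(line, ';', 'split');
    vals   = str2double(fields(2:end));  % column 1 holds dates, skip it
    if any(isnan(vals))               % str2double yields NaN for non-numbers
        fprintf('First suspect row: %d\n', row);
        break
    end
end
fclose(fid);
```

Note that an empty field would also trigger the check, since str2double('') is NaN as well, so inspect the reported row before concluding what the offending value is.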
0 Comments
thankar
on 22 Sep 2014
1 Comment
per isakson
on 22 Sep 2014
I assume you have a 64-bit system.
>> 35065 * 5000 * 8 /1e9
ans =
1.4026
That's 1.4 GB for the array, plus less than 3 GB for the file itself.
It sounds a bit strange. I would like to run a test. Could you upload the file somewhere?