Out of Memory error in R2019a but not R2017b

Hi there,
I have been trying to move to the most up-to-date version of MATLAB, but many of the programs I wrote in R2017b can no longer run in R2019a due to out-of-memory errors. I have adjusted my ulimit settings and purchased a couple of new sticks of RAM, and still haven't had any luck. The system is Ubuntu 18.04 with 32 GB of RAM and an AMD Ryzen processor, so I don't think the hardware is the issue, especially given that everything works in the previous release.
The file I'm trying to import is around 600 MB, nothing crazy.
I don't think anything major has changed between versions that would cause this script to stop working, either.
If anyone has any ideas on how to resolve this, I would be grateful!
  4 Comments
Mitchell Morningstar
Mitchell Morningstar on 16 May 2019
Edited: Mitchell Morningstar on 16 May 2019
Hi Jan,
To clarify, I just use that GitHub repo; I didn't write it and don't know exactly what each part of the code does, but this is the documentation on the file type if you are curious: https://open-ephys.atlassian.net/wiki/spaces/OEW/pages/65667092/Open+Ephys+format.
The error message is as follows:
Error using fread
Out of memory. Type "help memory" for your options.
Error in load_open_ephys_data_faster/segRead (line 182)
seg = fread(fid, numIdx*dblock(segNum).Repeat, sprintf('%d*%s', ...
Error in load_open_ephys_data_faster (line 141)
data = segRead('data', 'b') .* info.header.bitVolts;
I was able to trace it to that specific segment of code which looks like this:
function seg = segRead_int16(segName, mf)
    %% This function is specifically for reading continuous data.
    % It keeps the data in int16 precision, which can drastically decrease
    % memory consumption.
    if nargin == 1, mf = 'l'; end
    segNum = find(strcmp({dblock.Str}, segName));
    fseek(fid, sum(blockBytes(1:segNum-1)) + NUM_HEADER_BYTES, 'bof');
    seg = fread(fid, numIdx*dblock(segNum).Repeat, [sprintf('%d*%s', ...
        dblock(segNum).Repeat, dblock(segNum).Types) '=>int16'], ...
        sum(blockBytes) - blockBytes(segNum), mf);
end

function seg = segRead(segName, mf)
    if nargin == 1, mf = 'l'; end
    segNum = find(strcmp({dblock.Str}, segName));
    fseek(fid, sum(blockBytes(1:segNum-1)) + NUM_HEADER_BYTES, 'bof');
    seg = fread(fid, numIdx*dblock(segNum).Repeat, sprintf('%d*%s', ...
        dblock(segNum).Repeat, dblock(segNum).Types), ...
        sum(blockBytes) - blockBytes(segNum), mf);
end
From there, I changed segRead to read int16 just like segRead_int16. That allowed me to import the data, but the rest of the function lost some utility and I had to convert back to double. That was all well and good, but then when I went to analyze the data with MATLAB's built-in pspectrum function, it spun its wheels for an hour or so before I terminated the process. In R2017b, analyzing the same data with the same function takes around 5-10 minutes.
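For context, one way to keep the int16 savings and still feed pspectrum a double is to convert only the piece you are working on. This is a minimal sketch, not the loader's actual code: the variable names raw and bitVolts stand in for the int16 vector from segRead_int16 and the scale factor from the file header.

```matlab
% Sketch: keep raw samples in int16 and expand to double lazily.
raw = zeros(1e6, 1, 'int16');   % placeholder for the int16 samples
bitVolts = 0.195;               % placeholder for info.header.bitVolts

% Full conversion quadruples the memory (2 bytes -> 8 bytes per sample):
% data = double(raw) * bitVolts;

% Chunked conversion keeps the peak overhead small if the signal can be
% processed piecewise:
chunk = 1e5;
for k = 1:chunk:numel(raw)
    idx = k:min(k + chunk - 1, numel(raw));
    seg = double(raw(idx)) * bitVolts;
    % ... process seg ...
end
```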
If you'd like any other information I'll do my best to provide it!
Mitch
Jan
Jan on 16 May 2019
@Mitchell: I cannot really follow your explanation. You used segRead_int16 in R2017b but now use segRead in R2019a for some reason. Importing the data as doubles requires 4 times more memory than int16. This can kill your machine if that memory is not available in a contiguous block.
Although the function originally crashed inside the fread command, you now explain that it ran, but the termination occurred in the pspectrum function. Therefore I'm confused and do not understand which function exhausts the memory.
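The factor of four Jan mentions (int16 is 2 bytes per element, double is 8) can be checked directly with whos:

```matlab
a16 = zeros(1e6, 1, 'int16');
a64 = zeros(1e6, 1);            % double by default
s16 = whos('a16');
s64 = whos('a64');
fprintf('int16: %d bytes, double: %d bytes, ratio: %g\n', ...
    s16.bytes, s64.bytes, s64.bytes / s16.bytes);   % ratio is 4
```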


Answers (0)
