Out of Memory error in R2019a but not R2017b
Hi there,
I have been trying to use the most up-to-date version of MATLAB, but many of the programs I wrote in R2017b can no longer run in R2019a due to out-of-memory errors. I've adjusted my ulimit settings and purchased a couple of new sticks of RAM, but still haven't had any luck. The system is Ubuntu 18.04 with 32 GB RAM and an AMD Ryzen processor, so I don't think the hardware is the issue, especially given that everything works in the previous release.
The bread and butter of most of my scripts is this function: https://github.com/open-ephys/analysis-tools/blob/master/load_open_ephys_data_faster.m
The file I'm trying to import is around 600 MB, nothing crazy.
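For scale, here is a rough estimate of the in-memory footprint, assuming the 600 MB file holds int16 samples (as Open Ephys continuous files typically do):

```matlab
% Hypothetical back-of-the-envelope memory estimate
fileBytes = 600e6;               % file size on disk
nSamples  = fileBytes / 2;       % int16 = 2 bytes per sample
asInt16   = nSamples * 2 / 1e9;  % 0.6 GB if kept in the native int16 class
asDouble  = nSamples * 8 / 1e9;  % 2.4 GB once converted to double
```

So even a "small" 600 MB file can balloon to several gigabytes once it is read into double-precision arrays.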
I don't think that anything major has changed between versions that would cause this script to stop working either.
If anyone has any other ideas on how to resolve this I would be grateful!
4 Comments
Mitchell Morningstar on 16 May 2019 (edited 16 May 2019)
Jan on 16 May 2019
@Mitchell: I cannot really follow your explanation. You used segRead_int16 in R2017b but now use segRead in R2019a for some reason. Importing the data as |doubles| requires four times more memory. This can kill your machine if that memory is not available in a contiguous block.
Although the function crashed inside the fread command, you explain that it ran, but that the termination occurred in the pspectrum function. Therefore I'm confused and do not understand which function exhausts the memory.
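The int16-versus-double difference can be sketched as follows (the file name and sample count are hypothetical; the key point is the precision string passed to fread):

```matlab
% Hypothetical sketch: reading the same samples as double vs. native int16
fid = fopen('continuous.dat', 'r');   % hypothetical file name
nSamples = 1e6;                       % hypothetical sample count

% Default precision: fread converts to double (8 bytes per element in memory)
dataDouble = fread(fid, nSamples, 'int16');

% '*int16' keeps the native class (2 bytes per element in memory)
frewind(fid);
dataInt16 = fread(fid, nSamples, '*int16');
fclose(fid);

% dataDouble occupies 4x the memory of dataInt16
whos dataDouble dataInt16
```

If the loader switched from a native-class read to a double read between your R2017b and R2019a setups, that alone could quadruple the memory demand for the same file.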
Answers (0)