Is there an array-size limit for filtfilt?
Just wondering if there is a hard size limit for the filtfilt function? I was looking at an ECG time series collected at 500 Hz over several hours, an array of roughly 14e7 double entries. To extract the QRS complex, I first run it through a Butterworth bandpass filter, using N = 3 and Wn = [5 15]*2/fs in butter(N,Wn). It runs fine up to 6810691 entries; anything beyond that gives all-NaN output. Have I missed something obvious? I am running on a machine with 16 GB RAM and an i7-4xxx, and the process never hits the memory/CPU limits. Thank you!
Regards,
Alan
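For reference, here is a minimal sketch of the filter design described above, plus a second-order-sections variant. The fs = 500 value and the signal name x are taken from the question; the zp2sos conversion is a common workaround rather than a confirmed diagnosis here. With passband edges this low relative to fs, the transfer-function coefficients returned by butter can be numerically ill-conditioned, and filtfilt can then diverge to Inf/NaN on long signals.

```matlab
fs = 500;                      % sampling rate from the question
Wn = [5 15]*2/fs;              % normalized passband edges
N  = 3;                        % design order (6th-order bandpass overall)

% Transfer-function form, as in the question:
[b, a] = butter(N, Wn);
y_tf = filtfilt(b, a, x);      % x is the ECG column vector

% More numerically robust: design in zero-pole-gain form and
% convert to second-order sections before filtering.
[z, p, k] = butter(N, Wn);
[sos, g]  = zp2sos(z, p, k);
y_sos = filtfilt(sos, g, x);
```

If y_tf comes back all NaN but y_sos does not, the problem is coefficient conditioning rather than array size.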
More Answers (1)
Walter Roberson
on 6 Mar 2019
There is no hard size limit in filtfilt itself. Data shorter than 10000 points is put through a different routine that handles the endpoints differently; I suspect the routine for larger arrays exists to be more space-efficient, but I am not positive.
Other than that, there is no built-in limit beyond what your memory can hold.
Note that several temporary arrays are needed during filtering, so do not expect to be able to process anything larger than roughly 1/5th of your available memory.
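As a rough sanity check on that 1/5th figure (the numbers below are back-of-envelope estimates, not taken from the filtfilt source): the ~14e7-sample recording in the question occupies about 1 GiB as doubles, so even five live copies would fit comfortably in 16 GB of RAM.

```matlab
n = 14e7;                 % samples in the question's recording
sigGB  = n*8/2^30         % one double-precision copy: ~1.04 GiB
peakGB = 5*sigGB          % ~5.2 GiB if ~5 temporaries are live at once
```

On this machine, then, the all-NaN output is unlikely to be a memory problem.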