Time complexity of neural network

David on 29 Dec 2012
MATLAB offers many training algorithms for ANN training, such as LM (Levenberg-Marquardt), CG (conjugate gradient), SD (steepest descent), etc. I am searching for the time complexity of the LM- and CG-based ANN implementations in MATLAB. Specifically, what is the time complexity of the LM and CG methods implemented in MATLAB? Are they faster than O(n^3)?
Thanks in advance for your help.

Accepted Answer

Greg Heath on 5 Jan 2013
Edited: Greg Heath on 5 Jan 2013
It depends on the network efficiency settings that control the speed/storage tradeoff.
For example, the current network creation functions for regression and classification are FITNET and PATTERNNET, respectively. BOTH call FEEDFORWARDNET.
The default settings are 1 but can be overridden:
net = feedforwardnet;
% To verify unity defaults
efficiency = net.efficiency
cacheDelayedInputs = net.efficiency.cacheDelayedInputs
flattentime = net.efficiency.flattenTime
memoryReduction = net.efficiency.memoryReduction
% To override defaults
net.efficiency.cacheDelayedInputs = cacheDelayedInputs0;
net.efficiency.flattenTime = flattentime0;
net.efficiency.memoryReduction = memoryReduction0;
Search the newsgroup and answers using
neural speed memory
Hope this helps.
Thank you for formally accepting my answer.
Greg

More Answers (1)

Matt J on 30 Dec 2012
Edited: Matt J on 30 Dec 2012
As a disclaimer, I have no background in neural networks. I do have some background in function optimization, though, and I think I can confidently say:
  1. The time complexity will depend on the structure of your network, i.e., how densely things are interconnected. If connections are sparse, then sparse math can be used for the gradient computations, etc... leading to reduced complexity.
  2. In general, the worst case complexity won't be better than O(N^3). LM requires the solution of an NxN system of equations in every iteration, which can be O(N^3). CG can require O(N^2) for each gradient computation per iteration and N iterations to converge for a total of O(N^3). These are worst case bounds, of course, and might not reflect what you see in practice.
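To make point 2 concrete, here is an illustrative sketch of where the O(N^3) cost in LM comes from. This is not the actual trainlm source; J, e, and mu are placeholder names for the Jacobian, residual vector, and damping parameter in the standard LM update:

```
% Illustrative LM weight update for N network weights (NOT the trainlm source).
% J is the M-by-N Jacobian of the residuals e with respect to the weights.
N  = 50;                 % number of network weights (example size)
M  = 200;                % number of training residuals
J  = randn(M, N);        % placeholder Jacobian
e  = randn(M, 1);        % placeholder residual vector
mu = 1e-3;               % LM damping parameter
% Solving this dense N-by-N linear system each iteration costs O(N^3):
dw = -(J'*J + mu*eye(N)) \ (J'*e);
```

Whether the dense bound is reached in practice depends on the sparsity of the connections, as noted in point 1.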
  5 Comments
Matt J on 5 Jan 2013
Since this information isn't given in the documentation, it stands to reason that TMW doesn't want the internal details of the code exposed.
However, you could test the complexity yourself just by running examples with increasing N and plotting the trend in the computation time as a function of N.
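A rough sketch of such an experiment, assuming the Neural Network Toolbox functions FEEDFORWARDNET, TRAIN, and the SIMPLEFIT_DATASET demo data are available (the hidden sizes and epoch budget are arbitrary choices for illustration):

```
% Rough empirical complexity check: time one training run for increasing
% hidden-layer sizes and inspect the trend on a log-log scale.
[x, t] = simplefit_dataset;           % small demo dataset from the toolbox
hiddenSizes = [5 10 20 40 80];
runtimes = zeros(size(hiddenSizes));
for k = 1:numel(hiddenSizes)
    net = feedforwardnet(hiddenSizes(k), 'trainlm');
    net.trainParam.epochs = 50;       % fixed iteration budget
    net.trainParam.showWindow = false;
    tic
    train(net, x, t);
    runtimes(k) = toc;
end
loglog(hiddenSizes, runtimes, 'o-')   % slope estimates the exponent p in O(N^p)
xlabel('hidden units'), ylabel('training time (s)')
```

Keeping the epoch count fixed isolates the per-iteration cost; otherwise differences in convergence speed would confound the measurement.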
Greg Heath on 5 Jan 2013
The current version of the source code is hard to decipher:
type trainlm
Although obsolete versions of the code may be easier to understand, they may not be relevant regarding what the new code does.

