I am trying to construct the input matrix to test with a DNN, but I have a problem.
I have feature vectors that vary in size from 200 to 1200 columns. Each vector represents the characteristics of one handwritten word (5000 words in total). What can I do to normalize the vectors to a fixed size for use with an RBM and a DBN?
For MNIST this is not a problem, because the images are all the same size, 28 * 28, which gives 784 columns at the network input.
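One possible approach (an illustration, not something from this thread) is to resample every vector to a fixed length by linear interpolation, so that 200-column and 1200-column vectors both become, say, 784-column inputs to match the MNIST example. A minimal sketch in Python/NumPy, where the target length of 784 is an assumed choice:

```python
import numpy as np

def resample_features(vec, target_len=784):
    """Resample a 1-D feature vector to a fixed length by linear interpolation."""
    vec = np.asarray(vec, dtype=float)
    old_x = np.linspace(0.0, 1.0, num=len(vec))
    new_x = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(new_x, old_x, vec)

# Vectors of 200 and 1200 columns both map to 784 columns:
a = resample_features(np.random.rand(200))
b = resample_features(np.random.rand(1200))
print(a.shape, b.shape)  # (784,) (784,)
```

Interpolation preserves the overall shape of the feature sequence, which often matters more for handwriting features than the raw column count.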
I hope to find a solution.
Thank you for your cooperation.
I am trying to use your code in my research work, but I get an error.
This is my problem:
After extracting the characteristics of online handwritten words from a database, I obtained feature vectors of real-valued data for each word, but they vary in size. I also prepared label vectors representing the label of each word.
Can I use your proposed DBN-DNN for my data?
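A DBN-DNN needs a fixed-width input matrix, so the variable-size vectors have to be brought to a common width first. One simple option (an illustrative sketch, not part of the code being discussed) is to zero-pad every vector to the maximum length and rescale to [0, 1], since standard binary/Bernoulli RBM units expect inputs in that range; the `max_len=1200` value is an assumption taken from the sizes mentioned above:

```python
import numpy as np

def pad_and_scale(vectors, max_len=1200):
    """Zero-pad variable-length vectors into one fixed-width matrix,
    then min-max scale to [0, 1] for Bernoulli RBM visible units."""
    X = np.zeros((len(vectors), max_len))
    for i, v in enumerate(vectors):
        X[i, :len(v)] = v
    lo, hi = X.min(), X.max()
    if hi > lo:
        X = (X - lo) / (hi - lo)
    return X

# Two words with 200 and 1200 features end up as rows of one matrix:
X = pad_and_scale([np.random.rand(200), np.random.rand(1200)])
print(X.shape)  # (2, 1200)
```

Padding is the crudest option; resampling to a fixed length usually behaves better, but either way the network then sees a constant input dimension.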
In the multicore package, the different Matlab processes communicate via the file system: all function input and output arguments are saved to, and read from, disk. In your example, the multicore master process saves 100 files of about 8 MB each (1 million doubles) to disk, which are then read by the slave processes. In this case the overhead is clearly larger than the benefit of the parallel processing.
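A rough back-of-the-envelope estimate makes the point. The 100 jobs x 8 MB figures come from the example above; the disk throughput is an assumed illustrative number:

```python
# Rough overhead estimate for file-based job dispatch.
n_jobs = 100                 # jobs in the example above
mb_per_job = 8               # ~1 million doubles per job
disk_mb_per_s = 100          # assumed sequential disk throughput

# Each argument set is written by the master and read back by a slave:
io_seconds = 2 * n_jobs * mb_per_job / disk_mb_per_s
print(io_seconds)  # 16.0 seconds of pure I/O before any computation runs
```

Unless each job computes for substantially longer than its share of that I/O time, the file-system round trip dominates and the parallel version is slower than running serially.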
For further discussion, please use the Yahoo group: http://groups.yahoo.com/group/multicore_for_matlab