Train shallow neural network – MATLAB train
Parallel Computing Toolbox™ allows Deep Learning Toolbox™ to simulate and train networks faster and on larger datasets
than can fit on one PC. Parallel training is currently supported for
backpropagation training only, not for self-organizing maps.
In the following example, training and simulation happen across parallel MATLAB
workers.
parpool
[X,T] = vinyl_dataset;
net = feedforwardnet(10);
net = train(net,X,T,'useParallel','yes','showResources','yes');
Y = net(X);
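If you want to control how many workers take part, open the pool explicitly before calling train. A minimal sketch, assuming Parallel Computing Toolbox is installed; the pool size of 4 is an arbitrary choice and not part of the example above.

pool = gcp('nocreate');            % return the current pool, or empty if none exists
if isempty(pool)
    pool = parpool(4);             % open a pool with four workers (hypothetical size)
end
fprintf('Training will run on %d workers.\n', pool.NumWorkers);
net = train(net,X,T,'useParallel','yes','showResources','yes');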
Use Composite values to distribute the data manually, and get back the
results as a Composite value. If the data is loaded as it is distributed,
then, while each piece of the dataset must fit in the RAM of a single worker,
the entire dataset is limited only by the total RAM of all the workers combined.
[X,T] = vinyl_dataset;
Q = size(X,2);
Xc = Composite;
Tc = Composite;
numWorkers = numel(Xc);
ind = [0 ceil((1:numWorkers)*(Q/numWorkers))];
for i = 1:numWorkers
    indi = (ind(i)+1):ind(i+1);   % columns assigned to worker i
    Xc{i} = X(:,indi);            % send this worker its slice of the inputs
    Tc{i} = T(:,indi);            % and the matching slice of the targets
end
net = feedforwardnet;
net = configure(net,X,T);
net = train(net,Xc,Tc);
Yc = net(Xc);
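The simulated outputs also come back as a Composite, with one element per worker that can be retrieved on the client with brace indexing. A minimal sketch of reassembling the full output matrix, reusing the ind vector from above; how you collect the results is an assumption here, not part of the original example.

Y = zeros(size(T));
for i = 1:numWorkers
    indi = (ind(i)+1):ind(i+1);
    Y(:,indi) = Yc{i};             % Yc{i} copies worker i's result back to the client
end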
Note that in the example above, the function configure was used to set the
dimensions and processing settings of the network’s inputs. This normally
happens automatically when train is called, but when providing Composite
data this step must be done manually, using non-Composite data.
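A small representative sample is enough for configure when the full dataset is too large to keep on the client. A hypothetical sketch; the sample size of 1000 columns is an assumption, not a requirement.

Xsample = X(:,1:1000);                   % small in-memory sample (hypothetical size)
Tsample = T(:,1:1000);
net = feedforwardnet(10);
net = configure(net,Xsample,Tsample);    % sets input/output sizes and processing settings
net = train(net,Xc,Tc);                  % then train on the Composite data as before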