

QUICK START GUIDE

Deep Learning with MATLAB


Deep Learning Toolbox™ provides built-in functionality for creating, training, and validating deep neural networks. This reference shows
some common use cases. For additional examples, visit the documentation: mathworks.com/help/deeplearning/examples.html

Choosing an Architecture

Convolutional Neural Network (CNN)
• Image data: classification, detection
• Common layers:
  • Convolution layer
  • Max pooling layer
  • ReLU layer
  • Batch normalization layer
• Train from scratch or use transfer learning with pretrained models

Long Short-Term Memory (LSTM) Network
• Sequential data: time series forecasting, signal classification, text prediction
• Common layers:
  • LSTM layer
  • BiLSTM layer
• Perform regression or classification tasks

Use the Deep Network Designer app to interactively create and evaluate networks.

Pretrained Networks

Import Networks
The toolbox provides several functions for importing and exporting models and layers. More can be found on GitHub and File Exchange.

Import layers    importCaffeLayers, importKerasLayers
Import network   importCaffeNetwork, importKerasNetwork
Export           exportONNXNetwork

Pretrained Models
From Add-on Explorer, use one of the following commands to import a network:

alexnet     vgg19      inceptionv3
googlenet   resnet50   squeezenet
vgg16       resnet101
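The transfer-learning workflow mentioned above can be sketched roughly as follows. This is a minimal illustration, not the definitive recipe: the 5-class problem, the `imdsTrain` datastore, and the choice of AlexNet are all hypothetical.

```matlab
% Minimal transfer-learning sketch (hypothetical 5-class image problem).
% Assumes Deep Learning Toolbox and the AlexNet support package are installed.
net = alexnet;                          % load a pretrained network
layersTransfer = net.Layers(1:end-3);   % keep all but the last three layers
numClasses = 5;                         % hypothetical number of new classes

layers = [
    layersTransfer
    fullyConnectedLayer(numClasses)     % new task-specific layers
    softmaxLayer
    classificationLayer];

% Retrain on new data (imdsTrain is a hypothetical imageDatastore):
% net2 = trainNetwork(imdsTrain,layers,trainingOptions('sgdm'));
```

Replacing only the final layers lets the pretrained convolutional features carry over, so far less data and training time are needed than training from scratch.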

Training Options

ExecutionEnvironment   parallel, GPU, multi-GPU, auto (default)
MaxEpochs              an epoch is one full pass over the entire training set
MiniBatchSize          subset of the training set used to evaluate the gradient and update the weights
InitialLearnRate       a higher initial rate will speed up training but may diverge
LearnRateSchedule      drop the learn rate over time by a factor
ValidationPatience     stop training if accuracy is repeated a certain number of times (saves time)
Verbose                set to true to display training progress each epoch
VerboseFrequency       how often to display training progress
OutputFcn              custom function to call during training
CheckpointPath         directory to save the model each epoch

Validation

Inference
predict    returns the probability of belonging to each class
classify   returns labels and the probability of belonging to each class

[Ypred,scores] = classify(net,X);

State
Network state can be captured and updated with predictAndUpdateState and classifyAndUpdateState.

Visualization
Several forms of validation and visualization can be specified through trainingOptions:

ValidationData   validate during training
Plots            visualize progress

Improving Accuracy

Improving model accuracy depends on the task and the data. Common approaches include:

Network architecture:
• Use pretrained models from community experts
• Update layers and adjust parameters

Data preparation:
• Add data
• Use a training/validation/test split
• Normalize data
• Remove outliers
• Balance classes (add weights)

Hyperparameter tuning:
• Tune the training parameters with Bayesian optimization
• Set up the problem with optimizableVariable
• Write a function calling the model and options
• Perform the optimization with bayesopt

obj = bayesopt(ObjFcn,OptVars,…);
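The hyperparameter-tuning steps above can be sketched as follows. The search ranges, the evaluation budget, and the `trainAndGetError` helper are all hypothetical; the helper is assumed to train a network with the given hyperparameters and return the validation error.

```matlab
% Hypothetical Bayesian-optimization sketch for the steps listed above.
optVars = [
    optimizableVariable('InitialLearnRate',[1e-4 1e-1],'Transform','log')
    optimizableVariable('MiniBatchSize',[32 256],'Type','integer')];

% trainAndGetError is a hypothetical user-supplied function that trains a
% network with the given hyperparameters and returns the validation error.
ObjFcn = @(vars) trainAndGetError(vars);

results = bayesopt(ObjFcn,optVars, ...
    'MaxObjectiveEvaluations',30);       % hypothetical evaluation budget
bestVars = bestPoint(results);           % hyperparameters with lowest observed error
```

Using a log transform for the learning rate lets the optimizer search orders of magnitude evenly rather than spending most evaluations near the top of the range.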

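Pulling the training, validation, and inference pieces above together, a minimal end-to-end sketch might look like this. The specific option values and the variables XTrain, YTrain, XVal, YVal, and layers are assumptions for illustration.

```matlab
% Hypothetical end-to-end sketch combining the options described above.
% XTrain/YTrain, XVal/YVal, and layers are assumed to already exist.
options = trainingOptions('sgdm', ...
    'MaxEpochs',20, ...
    'MiniBatchSize',64, ...
    'InitialLearnRate',0.001, ...
    'LearnRateSchedule','piecewise', ...   % drop the rate by a factor over time
    'ValidationData',{XVal,YVal}, ...
    'ValidationPatience',5, ...
    'Plots','training-progress', ...       % live training visualization
    'ExecutionEnvironment','auto');

net = trainNetwork(XTrain,YTrain,layers,options);
[Ypred,scores] = classify(net,XVal);       % labels plus per-class scores
```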
Learn more: mathworks.com/solutions/deep-learning

mathworks.com

© 2018 The MathWorks, Inc. MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See mathworks.com/trademarks for a list of additional trademarks.
Other product or brand names may be trademarks or registered trademarks of their respective holders.
