
Questions


DS07 Deep Learning

1. Every neural network must have at least ____.

Select one:

a. two layers
b. four layers
c. three layers 
d. one layer

2. The purpose of an activation function is ____

Select one:

a. to balance the weights of a hidden layer
b. to produce a probabilistic output
c. to convert an input neuron to an output neuron
d. to introduce non-linearity into the network 
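
A minimal sketch of the point in question 2, assuming PyTorch (the layer sizes are arbitrary): stacking linear layers without an activation still gives a purely linear map, and inserting a non-linearity such as ReLU is what lets the network represent non-linear functions.

```python
import torch
import torch.nn as nn

# Two linear layers with no activation: the composition is still a linear map.
linear_only = nn.Sequential(nn.Linear(2, 8), nn.Linear(8, 1))

# The same stack with a ReLU in between: the network is now non-linear.
with_activation = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))

x = torch.randn(5, 2)
print(linear_only(x).shape, with_activation(x).shape)  # both torch.Size([5, 1])
```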

3. Backpropagation computes the ___ and propagates it back to earlier layers.

Select one:

a. output
b. error 
c. prediction
d. data
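
A minimal sketch of question 3, assuming PyTorch: the error (loss) is computed at the output, and loss.backward() propagates its gradient back through the earlier layers.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(3, 4), nn.ReLU(), nn.Linear(4, 1))
x, target = torch.randn(10, 3), torch.randn(10, 1)

loss = nn.MSELoss()(model(x), target)  # error measured at the output
loss.backward()                        # backpropagation: the error gradient flows back to earlier layers
print(model[0].weight.grad.shape)      # torch.Size([4, 3]) -- the first layer received gradients
```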

4. Which activation function is shown in the following figure?

a. Hyperbolic tangent
b. Linear
c. Sigmoid
d. Rectified linear unit 
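
Question 4 refers to a plot of an activation function. As a reference for the four candidates, a minimal sketch assuming PyTorch:

```python
import torch

x = torch.linspace(-3, 3, steps=7)
print(x)                 # linear: the identity, f(x) = x
print(torch.sigmoid(x))  # sigmoid: squashes values into (0, 1)
print(torch.tanh(x))     # hyperbolic tangent: squashes values into (-1, 1)
print(torch.relu(x))     # rectified linear unit: max(0, x)
```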

5. Which of the following are regularization techniques?

Select one or more:

a. Batch normalization 
b. Early stopping 
c. Xavier initialization
d. Dropout 
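
A minimal sketch for question 5, assuming PyTorch (layer sizes and the patience value are illustrative): dropout and batch normalization are added as layers, while early stopping is a training-loop rule that halts once the validation loss stops improving. Xavier initialization, by contrast, is a weight-initialization scheme rather than a regularizer.

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),   # batch normalization
    nn.ReLU(),
    nn.Dropout(p=0.5),    # dropout: randomly zeroes activations during training
    nn.Linear(64, 1),
)

def should_stop(val_losses, patience=3):
    """Early stopping: stop once the best validation loss is `patience` epochs old."""
    best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)
    return len(val_losses) - 1 - best_epoch >= patience

print(should_stop([0.9, 0.7, 0.71, 0.72, 0.73]))  # True -- no improvement for 3 epochs
```
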
6. Which of the following statements are True about a convolutional layer?

Select one or more:

a. The filter (kernel) must be square.
b. Leverages spatial or temporal structure of the data.
c. The convolution operation is a summation of weighted inputs. 
d. The depth defines the number of filters. 
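
A minimal sketch for question 6, assuming PyTorch (the channel counts and image size are illustrative): the kernel does not have to be square, and the number of output channels (the depth) equals the number of filters.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=(3, 5))  # non-square kernel
x = torch.randn(1, 3, 32, 32)  # one RGB image of size 32x32
y = conv(x)                    # each output value is a sum of weighted inputs (plus a bias)
print(y.shape)                 # torch.Size([1, 16, 30, 28]) -- depth 16 = number of filters
```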

7. The purpose of a pooling layer is ____

Select one:

a. to extract salient features
b. to emphasize the local features
c. to remove irrelevant features
d. to reduce the size of feature map 
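
A minimal sketch for question 7, assuming PyTorch: max pooling reduces the spatial size of the feature map while leaving its depth unchanged.

```python
import torch
import torch.nn as nn

pool = nn.MaxPool2d(kernel_size=2, stride=2)
feature_map = torch.randn(1, 16, 28, 28)
print(pool(feature_map).shape)  # torch.Size([1, 16, 14, 14]) -- height and width halved
```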

8. Which of the following is the motivation of transfer learning?

Select one:

a. To reduce the variance of the Neural Network 
b. Later layers in a Neural Network are the most difficult (i.e. slowest) to train
c. Early layers in a Neural Network are the most difficult (i.e. slowest) to train
d. To reduce the bias of the Neural Network

9. Which of the following statements is False about transfer learning?

Select one:

a. In transfer learning, a model is trained on one kind of problem and then used on a different but related problem
b. Transfer learning keeps the later layers of a pre-trained network and re-trains the early layers for a specific application 
c. Transfer learning allows developers to circumvent the need for lots of new data
d. Transfer learning decreases the training time for a neural network model.
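
A minimal transfer-learning sketch for questions 8 and 9, assuming PyTorch and torchvision 0.13 or newer (resnet18 and the 10-class output layer are illustrative choices): the usual recipe freezes the early, generic layers of a pre-trained network and re-trains only the later, task-specific layers.

```python
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")  # pre-trained on ImageNet (downloaded on first use)
for param in model.parameters():
    param.requires_grad = False                   # freeze the early, generic layers

model.fc = nn.Linear(model.fc.in_features, 10)    # new final layer for a 10-class task
# Only the parameters of model.fc are now trainable, so training is fast and needs little new data.
```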

10. Which of the following are popular pre-trained models?

Select one or more:

a. AlexNet 
b. VGG 
c. GoogleNet 
d. ResNet 
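
A minimal sketch, assuming torchvision 0.13 or newer: all four models listed in question 10 are available with pre-trained ImageNet weights (downloaded on first use).

```python
from torchvision import models

alexnet   = models.alexnet(weights="IMAGENET1K_V1")
vgg16     = models.vgg16(weights="IMAGENET1K_V1")
googlenet = models.googlenet(weights="IMAGENET1K_V1")
resnet50  = models.resnet50(weights="IMAGENET1K_V1")
```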
