Machine Learning Week 4 Quiz 1 (Neural Networks: Representation) Stanford Coursera
Question 1

Statement: A two layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function.
Answer: False
Explanation: XOR is not linearly separable, so we must compose multiple logical operations, using a hidden layer, to represent the XOR function.

Statement: Any logical function over binary-valued (0 or 1) inputs x1 and x2 can be (approximately) represented using some neural network.
Answer: True
Explanation: Since we can build the basic AND, OR, and NOT functions with a two layer network, we can (approximately) represent any logical function by composing these basic functions over multiple layers.

Statement: Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let a(3)1 = (hΘ(x))1 be the activation of the first output unit, and similarly a(3)2 = (hΘ(x))2 and a(3)3 = (hΘ(x))3. Then for any input x, it must be the case that a(3)1 + a(3)2 + a(3)3 = 1.
Answer: False
Explanation: The outputs of this network are independent sigmoid activations, not normalized probabilities, so their sum need not be 1.

Statement: The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).
Answer: True
Explanation: None needed; the sigmoid maps every real input into (0, 1).
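As a sketch of the reasoning behind the first two rows (the weight values below are the illustrative ones from the lectures, not part of the quiz), each basic gate can be computed by a single sigmoid unit, while XOR only emerges by composing those gates through a hidden layer:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single sigmoid units (two layer networks, no hidden layer) for basic gates.
# Large weight magnitudes saturate the sigmoid, so outputs are close to 0 or 1.
def AND(x1, x2):
    return sigmoid(-30 + 20 * x1 + 20 * x2)

def OR(x1, x2):
    return sigmoid(-10 + 20 * x1 + 20 * x2)

def NOR(x1, x2):  # (NOT x1) AND (NOT x2)
    return sigmoid(10 - 20 * x1 - 20 * x2)

# XNOR = (x1 AND x2) OR ((NOT x1) AND (NOT x2)) -- requires a hidden layer.
def XNOR(x1, x2):
    a1, a2 = AND(x1, x2), NOR(x1, x2)  # hidden-layer activations
    return OR(a1, a2)

def XOR(x1, x2):
    return 1.0 - XNOR(x1, x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(XOR(x1, x2)))
```

Rounding the saturated sigmoid outputs recovers the exact XOR truth table, which is what "approximately represented" means in the second statement.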

Question 2

Answer: AND
Explanation: With the weights shown in the question, the large negative bias is overcome only when both inputs contribute, so the output is close to 1 only when x1 = x2 = 1; the network computes the logical AND function.

Question 3

Answer: a(3)1 = g(Θ(2)1,0 a(2)0 + Θ(2)1,1 a(2)1 + Θ(2)1,2 a(2)2)
Explanation: This correctly uses the first row of Θ(2) and includes the bias unit a(2)0.
Question 4

Answer: a2 = sigmoid(Theta1 * x);
Explanation: In the lecture's notation a(2) = g(Θ(1)x), so this version computes it directly; sigmoid acts element-wise on the vector Θ(1)x.
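A minimal NumPy sketch of the same computation (the course uses Octave; the shapes and input values here are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative shapes (assumed): 2 input features plus a bias unit, 3 hidden units.
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((3, 3))  # maps [x0; x1; x2] (x0 = 1) to 3 hidden units

x = np.array([1.0, 0.5, -1.2])        # input vector with bias unit x0 = 1 prepended
a2 = sigmoid(Theta1 @ x)              # a(2) = g(Theta(1) x); sigmoid is element-wise
print(a2)
```

Because sigmoid is applied element-wise, the single matrix-vector product computes all hidden activations at once, and each lands in (0, 1) as Question 1 noted.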

Question 5

Answer: It will stay the same.
Explanation: Swapping the rows of Θ(1) swaps the hidden layer's outputs a(2), but swapping the corresponding columns of Θ(2) cancels that change, so the network's output remains unchanged.
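This cancellation can be checked numerically; a minimal sketch with random weights (all shapes and values assumed for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, Theta1, Theta2):
    a2 = sigmoid(Theta1 @ x)     # hidden-layer activations (bias units omitted for brevity)
    return sigmoid(Theta2 @ a2)  # output-layer activation

rng = np.random.default_rng(1)
Theta1 = rng.standard_normal((2, 3))  # 3 inputs -> 2 hidden units
Theta2 = rng.standard_normal((1, 2))  # 2 hidden units -> 1 output
x = rng.standard_normal(3)

# Swap the two hidden units: the rows of Theta1, and correspondingly the columns of Theta2.
Theta1_sw = Theta1[[1, 0], :]
Theta2_sw = Theta2[:, [1, 0]]

out = forward(x, Theta1, Theta2)
out_sw = forward(x, Theta1_sw, Theta2_sw)
```

Permuting hidden units merely relabels them: as long as the output weights are permuted the same way, every product Θ(2)a(2) is a sum over the same terms in a different order.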
