
Perceptron can learn:
(A) AND
(B) XOR
(C) Both A and B
(D) None of these

Answers

Answered by akashkumarpraj13

Explanation:

To build up towards the (useful) multi-layer Neural Networks, we will start by considering the (not really useful) single-layer Neural Network. This is called a Perceptron.

The Perceptron

Input is multi-dimensional (i.e. input can be a vector):

input x = (I1, I2, ..., In)

Input nodes (or units) are connected (typically fully) to a node (or multiple nodes) in the next layer.

A node in the next layer takes a weighted sum of all its inputs:

Summed input = Σi wi Ii = w1 I1 + w2 I2 + ... + wn In

Example

input x = ( I1, I2, I3) = ( 5, 3.2, 0.1 )

Summed input = 5 w1 + 3.2 w2 + 0.1 w3
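
For concreteness, here is a minimal Python sketch of that weighted sum. The input values come from the example above; the weight values are assumed purely for illustration, since the text leaves w1, w2, w3 unspecified.

```python
# Minimal sketch of the weighted sum above.
inputs = [5, 3.2, 0.1]      # I1, I2, I3 from the example
weights = [0.5, -1.0, 2.0]  # w1, w2, w3 -- assumed values for illustration

summed_input = sum(w * i for w, i in zip(weights, inputs))
print(summed_input)         # 0.5*5 + (-1.0)*3.2 + 2.0*0.1 = -0.5
```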

The rule

The output node has a "threshold" t.

Rule: If summed input ≥ t, then it "fires" (output y = 1).

Else (summed input < t) it doesn't fire (output y = 0).
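
The threshold rule can be written out the same way. This is a sketch, not any particular library's API; the weights and the threshold t = 0.0 are again assumed values, only the rule itself comes from the text.

```python
def perceptron_output(inputs, weights, t):
    """Fire (return 1) if the weighted sum reaches the threshold t, else 0."""
    summed_input = sum(w * i for w, i in zip(weights, inputs))
    return 1 if summed_input >= t else 0

# Using the example inputs with assumed weights and an assumed threshold:
print(perceptron_output([5, 3.2, 0.1], [0.5, -1.0, 2.0], t=0.0))  # -0.5 < 0.0 -> 0
```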

This implements a function

Obviously this implements a simple function from multi-dimensional real input to binary output. What kind of functions can be represented in this way? Only linearly separable ones, i.e. those where a single line (hyperplane) separates the inputs that fire from those that don't. AND is linearly separable, so a perceptron can learn it; XOR is not, so a perceptron cannot. Hence the answer is (A) AND.
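
To make that concrete, here is a hedged sketch of a perceptron computing AND. The choice w1 = w2 = 1 and t = 1.5 is one of many weight/threshold settings that work; it is an illustrative assumption, not something fixed by the question.

```python
def perceptron_output(inputs, weights, t):
    summed_input = sum(w * i for w, i in zip(weights, inputs))
    return 1 if summed_input >= t else 0

# AND is linearly separable: w1 = w2 = 1, t = 1.5 is one working choice.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, perceptron_output([x1, x2], [1, 1], t=1.5))
# Prints 1 only for (1, 1), i.e. AND.
# XOR would need output 1 for (0,1) and (1,0) but 0 for (0,0) and (1,1);
# no single weight/threshold choice satisfies all four constraints.
```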

We can imagine multi-layer networks, where the output node becomes one of the inputs to the next layer.

A perceptron has just 2 layers of nodes (input nodes and output nodes). It is often called a single-layer network on account of having 1 layer of links, between input and output.
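
As an aside on that multi-layer idea, the sketch below chains the same rule, feeding two first-layer outputs (an OR node and a NAND node) into a second-layer AND node, which is enough to get XOR. All weights and thresholds here are assumed illustrative values.

```python
def perceptron_output(inputs, weights, t):
    summed_input = sum(w * i for w, i in zip(weights, inputs))
    return 1 if summed_input >= t else 0

def two_layer_xor(x1, x2):
    # First layer: OR and NAND nodes (illustrative weights/thresholds).
    h_or = perceptron_output([x1, x2], [1, 1], t=0.5)
    h_nand = perceptron_output([x1, x2], [-1, -1], t=-1.5)
    # Second layer: AND of the two first-layer outputs.
    return perceptron_output([h_or, h_nand], [1, 1], t=1.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, two_layer_xor(x1, x2))  # outputs 0, 1, 1, 0 -> XOR
```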
