Explain neural networks.
How does an LSTM network work?
What are conjugate gradients, Levenberg-Marquardt, and similar training methods?
What can you do with a neural network (NN), and what can you not?
A network of multiple perceptrons can actually solve the XOR problem satisfactorily, because each perceptron can partition off a linearly separable part of the space, and their results can then be combined.
a) True – this always works, and these multiple perceptrons can learn to classify even complex problems.
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do.
c) True – perceptrons can do this but are unable to learn to do it; they have to be explicitly hand-coded.
d) False – a single perceptron is enough.
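A minimal hand-coded sketch of the idea in this question (not from the source; all weights and thresholds below are illustrative choices): two threshold perceptrons each carve off one linearly separable region of the input space, and a third combines their outputs to compute XOR.

```python
def perceptron(inputs, weights, threshold):
    # Classic threshold unit: output 1 if the weighted sum exceeds the threshold.
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

def xor_net(x1, x2):
    # Each hidden perceptron partitions off one linear region of the input space.
    h1 = perceptron((x1, x2), weights=(1, -1), threshold=0.5)  # fires only for (1, 0)
    h2 = perceptron((x1, x2), weights=(-1, 1), threshold=0.5)  # fires only for (0, 1)
    # The output perceptron combines the two regions (an OR of the hidden units).
    return perceptron((h1, h2), weights=(1, 1), threshold=0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # prints 0, 1, 1, 0
```

Note that the weights here are wired by hand rather than learned, which is exactly the distinction option c) draws.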
What are neural networks? What are the types of neural networks?
A perceptron adds up all the weighted inputs it receives, and if the sum exceeds a certain value, it outputs a 1; otherwise it just outputs a 0.
a) True
b) False
c) Sometimes – it can also output intermediate values as well
d) Can't say
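The rule described in this question, as a runnable sketch (the example weights and threshold are illustrative, not from the source):

```python
def perceptron_output(inputs, weights, threshold):
    # Sum the weighted inputs; output 1 if the sum exceeds the threshold, else 0.
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

# Worked example with two inputs, weights 0.6 and 0.4, and threshold 0.5:
print(perceptron_output([1, 0], [0.6, 0.4], 0.5))  # 0.6 > 0.5, so outputs 1
print(perceptron_output([0, 1], [0.6, 0.4], 0.5))  # 0.4 <= 0.5, so outputs 0
```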
Why are linearly separable problems of interest to neural network researchers?
a) Because they are the only class of problem that a network can solve successfully
b) Because they are the only class of problem that a perceptron can solve successfully
c) Because they are the only mathematical functions that are continuous
d) Because they are the only mathematical functions you can draw
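To make the contrast in these options concrete, here is a small sketch (not from the source; the learning rate and epoch count are arbitrary choices) of the classic perceptron learning rule: it converges on the linearly separable AND function but can never settle on XOR.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    # Classic perceptron learning rule: nudge weights by the prediction error.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # linearly separable
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not separable

for name, data in [("AND", AND), ("XOR", XOR)]:
    w, b = train_perceptron(data)
    preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
    print(name, "predictions:", preds)  # AND matches its targets; XOR cannot
```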
What is backpropagation?
a) It is another name for the curvy (sigmoid) function in the perceptron
b) It is the transmission of error back through the network to adjust the inputs
c) It is the transmission of error back through the network so that the weights can be adjusted and the network can learn
d) None of the above
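A minimal sketch of what option c) describes (the architecture, learning rate, and iteration count are illustrative assumptions, not from the source): errors computed at the output are propagated backward through the network and used to adjust the weights by gradient descent, here on the XOR task.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)    # 2 inputs -> 4 hidden units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)    # 4 hidden units -> 1 output

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error back toward the inputs
    d_out = (out - y) * out * (1 - out)           # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)            # error transmitted to hidden layer
    # Weight updates: the adjustment step that lets the network learn
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # typically converges toward [0, 1, 1, 0]
```

Unlike the hand-wired XOR network above, here the hidden-layer weights are discovered by the training loop itself, which is what backpropagation adds over the plain perceptron rule.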
How did neural networks become universal function approximators?
What are artificial neural networks?
Who is concerned with NNs?
Are neural networks helpful in medicine?