How did neural networks become universal function approximators?
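One way to make the universal-approximation idea concrete: a one-hidden-layer ReLU network can represent |x| exactly, and wider networks stitch many such pieces together to approximate any continuous function on a compact set. A minimal sketch (the network weights here are hand-chosen for illustration, not learned):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# |x| = relu(x) + relu(-x): a hidden layer with weights +1 and -1,
# and output weights both +1.  The universal approximation theorem says
# wider networks of this kind can approximate any continuous function
# on a compact set to arbitrary accuracy.
def tiny_net(x):
    hidden = relu(np.array([x, -x]))  # two hidden ReLU units
    return hidden.sum()               # linear output layer

for x in (-2.0, -0.5, 0.0, 1.5):
    print(x, tiny_net(x))
```

Here the representation is exact; in general the theorem only guarantees approximation to within any chosen tolerance.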
How are neural networks related to statistical methods?
List some practical commercial applications of artificial neural networks.
What can you do with a neural network, and what can you not do?
How does ill-conditioning affect neural network training?
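The effect of ill-conditioning can be demonstrated on a toy quadratic loss: gradient descent's step size is limited by the largest curvature, so when the condition number is large, progress along the shallow direction becomes extremely slow. A small sketch, assuming a quadratic loss f(w) = ½ wᵀHw with a hand-picked Hessian H:

```python
import numpy as np

# Gradient descent on f(w) = 0.5 * w^T H w.  Stability requires
# lr < 2 / lambda_max, so the step size is set by the steepest direction,
# and an ill-conditioned H leaves the shallow direction barely moving.
def descend(H, steps=200, lr=0.18):
    w = np.array([1.0, 1.0])
    for _ in range(steps):
        w = w - lr * (H @ w)      # gradient of f is H @ w
    return np.linalg.norm(w)      # distance from the optimum at 0

well = np.diag([10.0, 9.0])       # condition number ~1.1: converges fast
ill  = np.diag([10.0, 0.01])      # condition number 1000: crawls

print(descend(well))
print(descend(ill))
```

After 200 steps the well-conditioned problem is essentially at the optimum, while the ill-conditioned one is still far away along its flat direction.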
Explain Generative Adversarial Networks (GANs).
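The core of a GAN is the minimax value function V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))], which the discriminator maximizes and the generator minimizes. The sketch below only evaluates this objective once with hypothetical toy linear models (no training loop), purely to make the two terms concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy models for illustration: the generator maps noise z
# to a sample; the discriminator outputs the probability a sample is real.
def generator(z, w_g=0.5):
    return w_g * z

def discriminator(x, w_d=1.0, b_d=0.0):
    return sigmoid(w_d * x + b_d)

real = rng.normal(loc=2.0, size=100)    # "real" data from N(2, 1)
fake = generator(rng.normal(size=100))  # samples from the generator

# GAN value function V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))].
# D ascends this objective; G descends the second term.
v = np.mean(np.log(discriminator(real))) + \
    np.mean(np.log(1.0 - discriminator(fake)))
print(v)
```

In an actual GAN both models are neural networks and the two gradient updates alternate; this snippet only shows what each step is optimizing.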
Having multiple perceptrons can solve the XOR problem satisfactorily, because each perceptron can partition off a linearly separable region of the space and their results can then be combined.
a) True – this always works, and the multiple perceptrons learn to classify even complex problems
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do
c) True – perceptrons can do this but cannot learn to do it; they have to be explicitly hand-coded
d) False – a single perceptron is enough
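The hand-coded construction behind this question can be sketched directly: two hidden threshold perceptrons each carve out one half-space (OR and NAND), and an output perceptron ANDs them. The weights below are hand-chosen, which is the point of option (c): the classic single-perceptron learning rule cannot discover the hidden-layer weights on its own.

```python
# A two-layer network of threshold perceptrons computing XOR.
def perceptron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

def xor(a, b):
    h_or   = perceptron((a, b), (1, 1),  -0.5)       # fires unless both 0
    h_nand = perceptron((a, b), (-1, -1), 1.5)       # fires unless both 1
    return perceptron((h_or, h_nand), (1, 1), -1.5)  # AND of the two

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

Modern multilayer networks do learn such weights, but via backpropagation through differentiable activations, not the original perceptron rule on hard thresholds.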
Neural networks are complex ______________ with many parameters.
a) Linear Functions
b) Nonlinear Functions
c) Discrete Functions
d) Exponential Functions
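A quick way to see why the nonlinearity matters: without it, stacking layers gains nothing, because two linear layers compose into a single linear map. A minimal sketch with random weight matrices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two stacked linear layers with no activation function...
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x  = rng.normal(size=3)

deep   = W2 @ (W1 @ x)    # "two-layer" linear network
single = (W2 @ W1) @ x    # ...equal one linear layer with weights W2 @ W1

print(np.allclose(deep, single))
```

Depth only pays off when each layer applies a nonlinear activation such as tanh or ReLU between the matrix multiplications.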
What learning rate should be used for backpropagation?
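There is no single correct value; the rate must be tuned to the problem's curvature. A toy sweep on f(w) = w² (gradient 2w) shows the usual regimes: too small is slow, a moderate rate converges quickly, and too large diverges. The specific rates below are illustrative choices, not recommendations for real networks:

```python
# Gradient descent on f(w) = w^2 for several learning rates.
# The update is w <- w * (1 - 2*lr), so any lr above 1.0 here
# (lr * curvature > 2) makes |w| grow instead of shrink.
def final_w(lr, steps=50, w0=1.0):
    w = w0
    for _ in range(steps):
        w -= lr * 2.0 * w   # gradient of w^2 is 2w
    return abs(w)

for lr in (0.001, 0.1, 0.5, 1.1):
    print(lr, final_w(lr))
```

In practice, rates around 1e-3 to 1e-1 are common starting points, adjusted by watching the training loss (and often annealed or set adaptively by optimizers such as Adam).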
Which of the following is not a promise of artificial neural networks?
a) It can explain results
b) It can survive the failure of some nodes
c) It has inherent parallelism
d) It can handle noise
Why use artificial neural networks? What are their advantages?
How many kinds of neural networks exist?
How does the human brain work?