What is the most used activation function?
Answer / Ramanand Kumar
The most widely used activation function is the rectified linear unit (ReLU), thanks to its simplicity and its ability to introduce non-linearity while mitigating the vanishing gradient problem. Other common activation functions include sigmoid, tanh, and softmax, along with more recent variants such as leaky ReLU and Swish.
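The functions named in the answer above are easy to sketch. Here is a minimal NumPy illustration of ReLU and leaky ReLU (the `alpha=0.01` slope for leaky ReLU is just a common default, not something specified in the answer):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) -- outputs zero for negative inputs, identity for positive
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keeps a small slope alpha for negative inputs instead of
    # zeroing them out, which avoids "dead" units with zero gradient
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))        # [0.  0.  0.  1.5 3. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.  1.5  3. ]
```

Note that ReLU's gradient is exactly 1 for positive inputs, which is why deep networks using it suffer less from vanishing gradients than networks built on sigmoid or tanh.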
Are CUDA cores important?
What do you understand by autoencoder?
What is data normalization and why do we need it?
What is an RNN?
What are the main benefits of mini-batch gradient descent?
What is matrix element-wise multiplication? Explain with an example.
What do you understand by deep learning?
What do you understand by perceptron?
Which laptop is best for research?
What is Gradient Descent?
What is the ReLU function?
What is a GPU used for?