Answer Posted / Ramanand Kumar
The most widely used activation function is the rectified linear unit (ReLU), owing to its simplicity and its ability to introduce non-linearity while largely avoiding the vanishing gradient problem (its gradient is 1 for all positive inputs). Other common activation functions include sigmoid, tanh, and softmax, along with more recently proposed variants such as leaky ReLU and Swish.
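As a minimal NumPy sketch of the activations mentioned above (the function names, the alpha parameter for leaky ReLU, and the sample inputs are illustrative choices, not part of the original answer):

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through, zeroes out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs avoids "dead" units
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    # Sigmoid: squashes inputs into (0, 1); gradients vanish for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # Swish: x * sigmoid(x), a smooth, non-monotonic alternative to ReLU
    return x * sigmoid(x)

def softmax(x):
    # Softmax: turns a vector of logits into a probability distribution
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:      ", relu(x))
print("leaky relu:", leaky_relu(x))
print("sigmoid:   ", sigmoid(x))
print("tanh:      ", np.tanh(x))   # tanh squashes into (-1, 1)
print("swish:     ", swish(x))
print("softmax:   ", softmax(x))
```

Note how ReLU and leaky ReLU differ only on negative inputs, which is exactly where the "dying ReLU" issue arises and why the leaky variant was introduced.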