What are the main benefits of mini-batch gradient descent?
Answer posted by Prandeep Kaur
Mini-batch gradient descent is an optimization technique that updates model parameters using small random subsets (mini-batches) of the training data, sitting between full-batch gradient descent (one update per pass over all data) and stochastic gradient descent (one update per sample). Its main benefits are: lower memory usage than full-batch training, since only one mini-batch must fit in memory at a time; efficient vectorized and parallel computation, because a batch of samples can be processed together on hardware such as GPUs; and faster, more stable convergence, with more frequent updates than full-batch descent but less noisy gradient estimates than single-sample updates. The gradient noise from sampling can also help the optimizer escape shallow local minima.
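The idea can be sketched with a minimal NumPy example that fits a linear model by looping over shuffled mini-batches; the data sizes, learning rate, and batch size below are illustrative assumptions, not values from the answer:

```python
import numpy as np

# Minimal sketch of mini-batch gradient descent for linear regression.
# All hyperparameters here (lr, batch_size, epochs) are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))             # 1000 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=1000)

w = np.zeros(3)                            # parameters to learn
lr, batch_size, epochs = 0.1, 32, 20

for _ in range(epochs):
    idx = rng.permutation(len(X))          # reshuffle once per epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]  # indices of one mini-batch
        Xb, yb = X[b], y[b]
        # Gradient of mean squared error, computed on the batch only
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(b)
        w -= lr * grad                     # update after every mini-batch

print(w)  # should end up close to true_w
```

Because each update touches only `batch_size` rows, memory use stays bounded regardless of dataset size, and the `Xb @ w` products are exactly the kind of dense matrix operations that parallelize well.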