What are the differences between batch gradient descent, stochastic gradient descent, and mini-batch gradient descent?
Answer / Shrikant Gupta
Batch gradient descent computes the gradient of the loss function over the entire dataset at each iteration, giving a stable but computationally expensive update. Stochastic gradient descent (SGD) estimates the gradient from a single randomly chosen data point per iteration, making each update cheap but noisy. Mini-batch gradient descent is a compromise between the two: it uses a small random subset (mini-batch) of the data for each update, reducing gradient noise relative to SGD while keeping updates far cheaper than full-batch and allowing efficient vectorized computation.
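The three update rules can be sketched side by side. Below is a minimal illustration using linear regression with mean-squared-error loss; the function names and the tiny synthetic dataset are my own choices for the example, not from any particular library.

```python
import numpy as np

def gradient(X, y, w):
    """Gradient of the MSE loss (1/n)*||Xw - y||^2 with respect to w."""
    return 2 * X.T @ (X @ w - y) / len(y)

def batch_gd_step(X, y, w, lr):
    """Batch GD: one update using the entire dataset."""
    return w - lr * gradient(X, y, w)

def sgd_step(X, y, w, lr, rng):
    """SGD: one update using a single randomly chosen sample."""
    i = rng.integers(len(y))
    return w - lr * gradient(X[i:i + 1], y[i:i + 1], w)

def minibatch_gd_step(X, y, w, lr, rng, batch_size=8):
    """Mini-batch GD: one update using a small random subset."""
    idx = rng.choice(len(y), size=batch_size, replace=False)
    return w - lr * gradient(X[idx], y[idx], w)

# Tiny demo: recover y = 3x from noise-free synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 1))
y = 3 * X[:, 0]

w = np.zeros(1)
for _ in range(500):
    w = minibatch_gd_step(X, y, w, lr=0.1, rng=rng)
```

After these mini-batch updates `w` lands close to the true coefficient 3; swapping in `batch_gd_step` or `sgd_step` in the loop shows the same convergence with, respectively, smoother and noisier trajectories.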