What is data normalization and why do we need it?
Answer / Lalit Prasad Maurya
Data normalization is a preprocessing step in machine learning where feature values are rescaled so that all features share a comparable range. The two most common forms are min-max normalization, which maps each feature into a fixed range such as [0, 1], and standardization (z-score scaling), which transforms each feature to have a mean of zero and a standard deviation of one. This is important because many machine learning algorithms, such as gradient-based optimizers, k-nearest neighbors, and SVMs, converge faster and perform better when their input features are on comparable scales; without it, features with large magnitudes can dominate the learning process. Standardization also reduces the influence of differing units, though it does not by itself remove outliers.
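As a minimal sketch of the two scalings described above, using NumPy (the array values here are made-up example data):

```python
import numpy as np

# Toy feature matrix: two features on very different scales.
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

# Min-max normalization: map each feature column into [0, 1].
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Standardization (z-score): zero mean, unit standard deviation per column.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_minmax.min(axis=0), X_minmax.max(axis=0))  # columns span [0, 1]
print(X_std.mean(axis=0), X_std.std(axis=0))       # columns have mean 0, std 1
```

In practice, libraries such as scikit-learn provide `MinMaxScaler` and `StandardScaler`, which also remember the training-set statistics so the same transform can be applied to test data.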