Question: Explain the concept of batch normalization and its advantages in training deep neural networks.

Answer: Batch normalization normalizes a layer's inputs across each mini-batch, reducing internal covariate shift. This stabilizes and accelerates training, permits higher learning rates, and acts as a mild regularizer, reducing the reliance on techniques such as dropout.
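The normalization step described above can be sketched in a few lines of NumPy. This is a minimal illustration of the forward pass at training time only (the function name `batch_norm` and the synthetic data are for illustration; a real layer would also track running statistics for inference and learn `gamma` and `beta` by gradient descent):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift."""
    mean = x.mean(axis=0)           # per-feature mean over the batch
    var = x.var(axis=0)             # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta     # learnable scale (gamma) and shift (beta)

# Mini-batch of 32 examples with 4 features, deliberately off-center.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With `gamma = 1` and `beta = 0`, each output feature has (approximately) zero mean and unit variance within the batch, regardless of the input's original scale.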
Related questions rated most helpful by users:
- Explain the purpose of an activation function in a neural network.
- What is transfer learning, and how is it used in deep learning?
- What is a convolutional neural network (CNN), and how is it different from a fully connected neural network?