Question: What is the vanishing gradient problem, and how does it affect deep neural networks?

Response: The vanishing gradient problem occurs when gradients shrink toward zero as they are propagated backward through many layers, because backpropagation repeatedly multiplies small derivative terms (for example, the derivative of the sigmoid is at most 0.25). As a result, early layers receive negligible weight updates and fail to learn meaningful representations, which makes deep networks slow or impossible to train.
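The effect can be seen directly with a minimal NumPy sketch: backpropagating through a stack of sigmoid layers and printing the gradient norm at the last versus the first layer. The depth, width, and weight scale below are illustrative assumptions, not taken from any particular network.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative toy setup: 20 sigmoid layers, 32 units wide,
# weights drawn with a common 1/sqrt(width) scale.
n_layers, width = 20, 32
weights = [rng.normal(0.0, 1.0 / np.sqrt(width), (width, width))
           for _ in range(n_layers)]

# Forward pass, caching activations for backprop.
a = rng.normal(size=(width, 1))
activations = []
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass with a dummy loss gradient of ones.
grad = np.ones((width, 1))
grad_norms = []  # grad_norms[0] = last layer, grad_norms[-1] = first layer
for W, a in zip(reversed(weights), reversed(activations)):
    grad = grad * a * (1.0 - a)   # multiply by the sigmoid derivative
    grad_norms.append(float(np.linalg.norm(grad)))
    grad = W.T @ grad             # propagate to the previous layer

print(f"gradient norm at last layer:  {grad_norms[0]:.3e}")
print(f"gradient norm at first layer: {grad_norms[-1]:.3e}")
```

Because each layer multiplies the gradient by a factor well below 1, the norm printed for the first layer is many orders of magnitude smaller than for the last, which is exactly why the early layers barely update.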
Most helpful according to users:
- Explain the purpose of an activation function in a neural network.
- What is transfer learning, and how is it used in deep learning?
- What is a convolutional neural network (CNN), and how is it different from a fully connected neural network?