Question: Explain the difference between batch gradient descent and stochastic gradient descent.

Answer: Batch gradient descent computes the gradient over the entire dataset for each parameter update, giving an exact but computationally expensive step. Stochastic gradient descent (SGD) updates the parameters using one randomly selected data point at a time, so each step is cheap but noisy. Mini-batch gradient descent is a compromise between the two, using a small subset of the data for each update to balance gradient accuracy against per-update cost.
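The three update schemes differ only in how much data feeds each gradient step. Below is a minimal NumPy sketch on a made-up linear-regression problem with a mean-squared-error loss; the dataset, learning rate, epoch count, and batch size are illustrative assumptions, not values from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features (toy data)
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def gradient(w, Xb, yb):
    # Gradient of the mean-squared-error loss for linear regression.
    return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

lr, epochs = 0.1, 50

# Batch gradient descent: one update per epoch, using the full dataset.
w_batch = np.zeros(3)
for _ in range(epochs):
    w_batch -= lr * gradient(w_batch, X, y)

# Stochastic gradient descent: one update per randomly ordered sample.
w_sgd = np.zeros(3)
for _ in range(epochs):
    for i in rng.permutation(len(y)):
        w_sgd -= lr * gradient(w_sgd, X[i:i + 1], y[i:i + 1])

# Mini-batch gradient descent: one update per small batch (here 16 samples).
w_mb = np.zeros(3)
batch_size = 16
for _ in range(epochs):
    order = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        b = order[start:start + batch_size]
        w_mb -= lr * gradient(w_mb, X[b], y[b])

# All three estimates should land near true_w; SGD fluctuates the most.
print(w_batch, w_sgd, w_mb)
```

Note that the batch variant performs one update per pass over the data, while SGD performs one hundred; the mini-batch variant sits in between, which is why it dominates in practice on large datasets.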
Most helpful, as rated by users:
- Explain the concept of feature engineering.
- What is the purpose of regularization in machine learning?
- Explain the term 'hyperparameter' in the context of machine learning.
- What is the purpose of the activation function in a neural network?
- Explain the term 'precision' in the context of classification.