What happens when you increase the regularization hyperparameter lambda?

  1. What happens when you increase the regularization hyperparameter lambda?
    •  **Weights are pushed toward becoming smaller (closer to 0)** ✔ Correct
    •  Weights are pushed toward becoming bigger (further from 0)
    •  Doubling lambda should roughly result in doubling the weights
    •  Gradient descent taking bigger steps with each iteration (proportional to lambda)
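You can verify the correct answer empirically. The minimal sketch below (all names and data are illustrative, not from the quiz) trains a linear model by gradient descent on a mean-squared-error loss plus an L2 penalty `(lam / 2) * ||w||^2`, then compares the learned weight norm for increasing values of lambda: the larger the penalty, the smaller the weights.

```python
import numpy as np

def train_ridge(X, y, lam, lr=0.1, steps=500):
    """Gradient descent on MSE + (lam/2)*||w||^2 (L2 regularization)."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        # Gradient of the data term plus the L2 penalty term lam * w
        grad = X.T @ (X @ w - y) / n + lam * w
        w -= lr * grad
    return w

# Synthetic regression problem (illustrative data, fixed seed)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# Weight norm shrinks monotonically as lambda grows
norms = [np.linalg.norm(train_ridge(X, y, lam)) for lam in (0.0, 1.0, 10.0)]
print(norms)
```

With `lam = 0` the weights approach the unregularized least-squares solution; as lambda increases, the `lam * w` term in the gradient pulls every weight toward 0, which is exactly the first option above.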

Get All Week Quiz Answer:

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization Coursera Quiz Answer
