
  1. Every hyperparameter, if set poorly, can have a huge negative impact on training, and so all hyperparameters are about equally important to tune well. True or False?
    •  True
    •  False ✔

Correct. We’ve seen in lecture that some hyperparameters, such as the learning rate, are more critical than others, so not all hyperparameters deserve equal tuning effort.
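To see why the learning rate is so critical, here is a minimal sketch (an illustrative example, not from the course) of gradient descent on the simple function f(w) = w², where a small change in the learning rate flips training from converging to diverging:

```python
# Minimal sketch: gradient descent on f(w) = w^2, whose minimum is at w = 0.
# The learning rate `lr` and starting point `w0` are illustrative choices.
def train(lr, steps=50, w0=1.0):
    w = w0
    for _ in range(steps):
        grad = 2 * w       # derivative of w^2
        w = w - lr * grad  # gradient descent update
    return abs(w)          # distance from the minimum after training

print(train(lr=0.1))  # small learning rate: converges close to 0
print(train(lr=1.1))  # too-large learning rate: the update overshoots and |w| blows up
```

With lr = 0.1 each step multiplies w by 0.8, so the iterate shrinks toward the minimum; with lr = 1.1 each step multiplies w by -1.2, so the iterate oscillates with growing magnitude. A hyperparameter like mini-batch size, by contrast, mostly trades off speed and noise rather than making training diverge outright.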

Get all week quiz answers:

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization Coursera Quiz Answer
