  1. Which of the following statements about Adam is False?
    •  Adam combines the advantages of RMSProp and momentum
    •  We usually use “default” values for the hyperparameters β1, β2 and ε in Adam (β1 = 0.9, β2 = 0.999, ε = 10⁻⁸)
    •  The learning rate hyperparameter α in Adam usually needs to be tuned.
    •  Adam should be used with batch gradient computations, not with mini-batches.
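The false statement is the last one: Adam is routinely used with mini-batch gradient descent, not only with full-batch gradients. The other three statements are true: Adam combines a momentum-style first-moment estimate with an RMSProp-style second-moment estimate, β1, β2 and ε are usually left at their defaults, and the learning rate α is the hyperparameter that typically needs tuning. Below is a minimal NumPy sketch of a single Adam update on a mini-batch gradient; the function and variable names here are illustrative, not taken from the course notebooks.

```python
import numpy as np

def adam_update(w, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step on parameters `w` given a mini-batch gradient `grad`.

    alpha (the learning rate) is the hyperparameter you usually tune;
    beta1, beta2 and eps are normally left at these default values.
    """
    m = beta1 * m + (1 - beta1) * grad          # first moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (RMSProp term)
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Illustrative usage: call once per mini-batch, with m and v initialized to zeros.
w = np.zeros(3)
m, v = np.zeros_like(w), np.zeros_like(w)
for t, grad in enumerate(np.random.randn(5, 3), start=1):  # 5 fake mini-batch gradients
    w, m, v = adam_update(w, grad, m, v, t)
```

In practice this update runs once per mini-batch, which is exactly why the fourth option is false.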

