# Which of these statements about mini-batch gradient descent do you agree with?


- **One iteration of mini-batch gradient descent (computing on a single mini-batch) is faster than one iteration of batch gradient descent.** *(Correct: a single mini-batch contains far fewer examples than the full training set, so each update is cheaper to compute.)*
- Training one epoch (one pass through the training set) using mini-batch gradient descent is faster than training one epoch using batch gradient descent. *(Incorrect: both methods process every example once per epoch; mini-batch gradient descent adds per-batch overhead, though it makes many more parameter updates.)*
- You should implement mini-batch gradient descent without an explicit for-loop over different mini-batches, so that the algorithm processes all mini-batches at the same time (vectorization). *(Incorrect: vectorization applies within a mini-batch; you still need a for-loop over mini-batches, because each update depends on the parameters produced by the previous one.)*
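The points above can be illustrated with a minimal sketch (the function name, hyperparameters, and the linear-regression objective are illustrative assumptions, not part of the quiz): computation is vectorized *within* each mini-batch, but an explicit for-loop still runs *over* mini-batches, and each iteration touches only `batch_size` examples rather than the whole training set.

```python
import numpy as np

def mini_batch_gradient_descent(X, y, lr=0.1, batch_size=64, epochs=5, seed=0):
    """Mini-batch GD on a least-squares objective (illustrative example)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        order = rng.permutation(n)             # reshuffle once per epoch
        for start in range(0, n, batch_size):  # explicit loop over mini-batches
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]            # one mini-batch
            # Gradient is vectorized over the examples *inside* this batch.
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
            w -= lr * grad                     # update before the next batch
    return w
```

Because one update reads only `batch_size` examples, a single mini-batch iteration costs a fraction of a full-batch iteration; but one epoch still visits all `n` examples, so an epoch is not cheaper than batch gradient descent.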
