
Huawei Exam H13-311_V3.5 Topic 7 Question 18 Discussion

Actual exam question for Huawei's H13-311_V3.5 exam
Question #: 18
Topic #: 7

Which of the following are common gradient descent methods?

A. Batch Gradient Descent (BGD)
B. Mini-batch Gradient Descent (MBGD)
C. Multi-dimensional gradient descent
D. Stochastic Gradient Descent (SGD)

Suggested Answer: A, B, D

Gradient descent is a core optimization technique in machine learning, particularly for training neural networks and deep learning models. The common variants are:

Batch Gradient Descent (BGD): Updates the model parameters after computing the gradients from the entire dataset.

Mini-batch Gradient Descent (MBGD): Updates the model parameters using a small batch of data, combining the benefits of both batch and stochastic gradient descent.

Stochastic Gradient Descent (SGD): Updates the model parameters for each individual data point, leading to faster but noisier updates.

"Multi-dimensional gradient descent" is not a recognized gradient descent method in machine learning, so option C is incorrect.
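The three variants above differ only in how many samples feed each parameter update. A minimal sketch, assuming a toy one-parameter least-squares fit (the data, learning rate, and function names below are illustrative, not from the exam material):

```python
import random

# Toy data for a one-parameter fit y = w * x (true w is 3.0).
xs = [float(i) for i in range(1, 11)]
ys = [3.0 * x for x in xs]
data = list(zip(xs, ys))

def grad(w, batch):
    # Mean gradient of the squared error over the batch:
    # d/dw (w*x - y)^2 = 2 * x * (w*x - y)
    return sum(2 * x * (w * x - y) for x, y in batch) / len(batch)

def descend(batch_size, steps=200, lr=0.005, seed=0):
    # batch_size == len(data) -> batch gradient descent (BGD)
    # batch_size == 1         -> stochastic gradient descent (SGD)
    # anything in between     -> mini-batch gradient descent (MBGD)
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        batch = rng.sample(data, batch_size)
        w -= lr * grad(w, batch)
    return w

print(descend(len(data)))  # BGD: all 10 points per update
print(descend(2))          # MBGD: 2-point mini-batches
print(descend(1))          # SGD: one point per update
```

All three runs converge to roughly w = 3.0; SGD takes noisier intermediate steps because each update sees only one sample, which is exactly the trade-off the explanation describes.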

