
HP Exam HPE2-N69 Topic 3 Question 36 Discussion

Actual exam question for HP's HPE2-N69 exam
Question #: 36
Topic #: 3

A customer mentions that the ML team wants to avoid overfitting models. What does this mean?

Suggested Answer: C

Overfitting occurs when a model fits the training data too closely: it performs very well on the training data but poorly on new data, because it has memorized specifics of the training set rather than learning patterns that generalize. To avoid overfitting, the ML team needs to ensure their models are not over-trained on the training data and retain enough generalization capacity to perform well on new, unseen data.
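The gap between training-set and new-data performance described above can be sketched numerically. This is a minimal illustration (assuming NumPy is available; the polynomial setup and all numbers are illustrative, not part of the exam question): a high-degree polynomial drives training error toward zero while its error on held-out data gets worse than a simple model's.

```python
# Illustrative sketch of overfitting: fit polynomials of increasing
# degree to noisy data whose true relationship is a straight line.
import numpy as np

rng = np.random.default_rng(0)

# Small noisy training set; larger held-out test set from the same line.
x_train = np.linspace(0, 1, 15)
y_train = 2 * x_train + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test + rng.normal(0, 0.2, x_test.size)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

results = {}
for degree in (1, 14):
    coeffs = np.polyfit(x_train, y_train, degree)
    results[degree] = (mse(coeffs, x_train, y_train),
                       mse(coeffs, x_test, y_test))
    print(f"degree {degree:2d}: train MSE={results[degree][0]:.4f}, "
          f"test MSE={results[degree][1]:.4f}")
```

The degree-14 polynomial nearly interpolates the 15 training points, so its training error is tiny, yet it performs worse on the held-out data than the simple linear fit. That widening train/test gap is the signature of overfitting that answer C describes.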


Contribute your Thoughts:

Carla
4 months ago
Hmm, D sounds like they want to avoid the headache of hardware management. If only training models was as simple as picking the right CPU!
Jutta
4 months ago
I'm going with C. Overfitting is like when you cram for an exam - you might ace the test, but you won't remember anything afterwards.
Jerry
2 months ago
C) The team wants to avoid training models to the point where they perform less well on new data.
Dong
3 months ago
Exactly, it's important to find the right balance in model training to avoid overfitting.
Renea
3 months ago
That's a great analogy! Overfitting can definitely lead to poor performance on new data.
Carmen
3 months ago
C) The team wants to avoid training models to the point where they perform less well on new data.
Von
4 months ago
Hah, obviously not B. Who wants to spend less time on the actual model creation? That's the fun part!
Ernie
4 months ago
C) The team wants to avoid training models to the point where they perform less well on new data.
Aaron
4 months ago
A) The team wants to avoid wasting resources on training models with poorly selected hyperparameters.
Vilma
4 months ago
I see your point, Rana. It's important to optimize resources for better model performance.
Rana
5 months ago
A) The team wants to avoid wasting resources on training models with poorly selected hyperparameters.
Katie
5 months ago
Definitely C - overfitting means the model learns the training data too well and performs poorly on new, unseen data. The team wants to avoid this and focus on generalization.
Tamala
4 months ago
A) The team wants to avoid wasting resources on training models with poorly selected hyperparameters.
Jeanice
4 months ago
C) The team wants to avoid training models to the point where they perform less well on new data.
Regenia
5 months ago
I agree with C. Overfitting can lead to poor performance on new data.
Bo
5 months ago
C) The team wants to avoid training models to the point where they perform less well on new data.
