Welcome to Pass4Success


Amazon Exam MLS-C01 Topic 3 Question 108 Discussion

Actual exam question for Amazon's MLS-C01 exam
Question #: 108
Topic #: 3

An insurance company is creating an application to automate car insurance claims. A machine learning (ML) specialist used an Amazon SageMaker Object Detection - TensorFlow built-in algorithm to train a model to detect scratches and dents in images of cars. After the model was trained, the ML specialist noticed that the model performed better on the training dataset than on the testing dataset.

Which approach should the ML specialist use to improve the performance of the model on the testing data?

Suggested Answer: D

The machine learning model in this scenario shows signs of overfitting, as evidenced by better performance on the training dataset than on the testing dataset. Overfitting indicates that the model is capturing noise or details specific to the training data rather than general patterns.

One common approach to reducing overfitting is L2 regularization, which adds a penalty proportional to the squared magnitude of the weights to the loss function. This discourages large weights and pushes the model toward simpler solutions that generalize better. By increasing the value of the L2 hyperparameter, the ML specialist strengthens this penalty, which mitigates overfitting and improves performance on the testing dataset.
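To make the mechanism concrete, here is a minimal NumPy sketch (an illustration, not the SageMaker implementation) of how an L2 penalty is added to a base loss. The `l2` argument plays the role of the L2 regularization hyperparameter: raising it raises the penalty on large weights.

```python
import numpy as np

def l2_regularized_loss(base_loss, weights, l2=0.01):
    """Add an L2 (weight decay) penalty to a base loss value.

    A larger l2 value penalizes large weights more heavily,
    nudging training toward smaller weights that generalize better.
    """
    penalty = l2 * sum(float(np.sum(w ** 2)) for w in weights)
    return base_loss + penalty

# Toy weights: the same weights incur a larger penalty as l2 grows.
weights = [np.array([0.5, -1.0]), np.array([2.0])]
loss_small = l2_regularized_loss(1.0, weights, l2=0.01)  # 1.0 + 0.01 * 5.25
loss_large = l2_regularized_loss(1.0, weights, l2=0.1)   # 1.0 + 0.1  * 5.25
```

Because the penalty scales linearly with the hyperparameter, increasing it directly increases the pressure on the optimizer to keep weights small.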

Increasing the momentum hyperparameter affects optimization dynamics rather than regularization, so it does not address overfitting. Reducing the dropout rate would weaken regularization and likely make the overfitting worse, since dropout is itself a technique for preventing overfitting.
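For reference, this is roughly how a regularization hyperparameter could be passed to a SageMaker built-in algorithm through the SageMaker Python SDK. This is a hedged sketch only: the placeholder values and the exact hyperparameter keys (such as the L2 key shown here) vary by algorithm and version, so check the algorithm's hyperparameter documentation before relying on them.

```python
# Sketch only: assumes the SageMaker Python SDK (v2) and a resolved
# container image URI for the Object Detection - TensorFlow algorithm.
# The hyperparameter names below are illustrative placeholders, not
# confirmed keys for this algorithm.
from sagemaker.estimator import Estimator

estimator = Estimator(
    image_uri="<algorithm-image-uri>",   # placeholder
    role="<execution-role-arn>",         # placeholder
    instance_count=1,
    instance_type="ml.p3.2xlarge",
)
estimator.set_hyperparameters(
    regularizers_l2=0.001,  # hypothetical key; increase to strengthen L2 regularization
    dropout_rate=0.2,       # keep dropout in place; lowering it worsens overfitting
)
# estimator.fit({"training": "s3://<bucket>/<prefix>/"})
```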


Contribute your Thoughts:

Shawnda
21 days ago
Wait, did someone say 'dents and scratches'? I'm just picturing a bunch of car insurance adjusters playing bumper cars to test the model. Now that's dedication!
Kristofer
22 days ago
Increasing the momentum hyperparameter? Sounds like the model is already moving too fast and leaving the testing data in the dust. Slow it down!
Berry
24 days ago
Reducing the dropout_rate? That's just asking for trouble! Dropout is key to preventing overfitting, my friend.
Nohemi
11 days ago
A: I think reducing the dropout_rate might not be the best idea.
Gennie
1 month ago
Increasing the L2 hyperparameter could add more regularization and prevent overfitting on the training data. Worth a try!
India
1 day ago
B: I disagree, I believe increasing the value of the L2 hyperparameter would be more effective in preventing overfitting.
Trina
9 days ago
A: I think reducing the value of the learning_rate hyperparameter could help improve the model's performance on the testing data.
Franchesca
1 month ago
I think reducing the learning_rate hyperparameter is the way to go. Slower learning can help the model generalize better to the testing data.
Sophia
15 days ago
C: I agree, slowing down the learning process might improve the model's performance on the testing data.
Remona
1 month ago
B: Maybe increasing the value of the L2 hyperparameter could also help.
Virgie
1 month ago
A: I think reducing the learning_rate hyperparameter is a good idea.
Carlene
2 months ago
Why do you think that?
Micaela
2 months ago
I disagree, I believe option D would be more effective.
Carlene
2 months ago
I think the ML specialist should choose option C.
