
CertNexus Exam AIP-210 Topic 1 Question 28 Discussion

Actual exam question for CertNexus's AIP-210 exam
Question #: 28
Topic #: 1

You have a dataset with thousands of features, all of which are categorical. Using these features as predictors, you are tasked with creating a prediction model to accurately predict the value of a continuous dependent variable. Which of the following would be appropriate algorithms to use? (Select two.)

Suggested Answer: C, E

Lasso regression and ridge regression are both linear regression models that can handle high-dimensional data, including (suitably encoded) categorical predictors, and both predict a continuous dependent variable. They use regularization to reduce model complexity and avoid overfitting. Lasso regression uses L1 regularization, which adds a penalty term proportional to the absolute value of the coefficients to the loss function; this can shrink some coefficients exactly to zero, performing feature selection. Ridge regression uses L2 regularization, which adds a penalty term proportional to the square of the coefficients to the loss function; this shrinks all coefficients toward zero and reduces multicollinearity. Reference: [Lasso (statistics) - Wikipedia], [Ridge regression - Wikipedia]
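To make the ridge idea concrete, here is a minimal NumPy sketch (synthetic data, hypothetical feature names): categorical features are one-hot encoded into a numeric design matrix, and the closed-form ridge solution w = (XᵀX + αI)⁻¹Xᵀy is computed for a small and a large penalty. Increasing α shrinks the coefficient vector toward zero, which is the L2 behavior described above (lasso differs only in using an L1 penalty, which can zero out coefficients entirely).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 200 samples, two categorical features
n = 200
colors = rng.choice(["red", "green", "blue"], size=n)
sizes = rng.choice(["S", "M", "L"], size=n)

def one_hot(values):
    """One-hot encode a 1-D array of category labels."""
    cats = sorted(set(values))
    return np.array([[v == c for c in cats] for v in values], dtype=float)

# Encode each categorical feature and stack into one design matrix
X = np.hstack([one_hot(colors), one_hot(sizes)])

# Continuous target that depends on some of the categories, plus noise
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(0, 0.1, size=n)

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression: w = (X^T X + alpha*I)^{-1} X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

w_small = ridge_fit(X, y, alpha=0.1)
w_large = ridge_fit(X, y, alpha=100.0)

# The larger penalty shrinks the whole coefficient vector toward zero
shrunk = np.linalg.norm(w_large) < np.linalg.norm(w_small)
print(shrunk)  # True
```

In practice you would reach for a library implementation (e.g. scikit-learn's `Ridge` and `Lasso` with a one-hot encoder) rather than the closed form, but the shrinkage effect on the coefficients is the same.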


Contribute your Thoughts:

Bong
1 month ago
Wait, I thought K-means was for clustering, not prediction. Maybe I should have paid more attention in class.
upvoted 0 times
Jani
1 month ago
Logistic regression? For a continuous dependent variable? I think someone needs to go back to Stats 101.
upvoted 0 times
Chandra
1 month ago
C and E are the way to go. Gotta love that L1 and L2 regularization.
upvoted 0 times
Craig
1 month ago
K-nearest neighbors? Really? That's like using a hammer to fix a computer.
upvoted 0 times
Kristin
6 days ago
E) Ridge regression
upvoted 0 times
Argelia
8 days ago
D) Logistic regression
upvoted 0 times
Craig
24 days ago
A) K-means
upvoted 0 times
Chantay
2 months ago
I would also consider using K-nearest neighbors for this task.
upvoted 0 times
Miesha
2 months ago
I agree with Antonio. Logistic regression is suitable for categorical features.
upvoted 0 times
Antonio
2 months ago
I think logistic regression would be a good choice.
upvoted 0 times
Tawny
2 months ago
I think Lasso regression and Ridge regression would be the best choices here. With so many categorical features, we need some form of regularization to avoid overfitting.
upvoted 0 times
Gladis
19 days ago
I think using Lasso and Ridge regression is a good idea to prevent overfitting.
upvoted 0 times
Paola
27 days ago
Logistic regression could be useful for binary classification.
upvoted 0 times
Annice
28 days ago
K-means and K-nearest neighbors wouldn't work well with categorical features.
upvoted 0 times
Ulysses
1 month ago
I agree, Lasso and Ridge regression would help with regularization.
upvoted 0 times
