Welcome to Pass4Success


CertNexus Exam AIP-210 Topic 5 Question 35 Discussion

Actual exam question for CertNexus's AIP-210 exam
Question #: 35
Topic #: 5

You have a dataset with thousands of features, all of which are categorical. Using these features as predictors, you are tasked with creating a prediction model to accurately predict the value of a continuous dependent variable. Which of the following would be appropriate algorithms to use? (Select two.)

A) K-means
B) K-nearest neighbors
C) Lasso regression
D) Logistic regression
E) Ridge regression

Suggested Answer: C, E

Lasso regression and ridge regression are both linear regression models that can handle high-dimensional data, including categorical features once they are one-hot encoded. Both use regularization to reduce model complexity and avoid overfitting. Lasso regression uses L1 regularization, which adds a penalty term proportional to the absolute value of the coefficients to the loss function; this can shrink some coefficients exactly to zero, effectively performing feature selection. Ridge regression uses L2 regularization, which adds a penalty term proportional to the square of the coefficients; this shrinks all coefficients toward zero and reduces multicollinearity. Reference: [Lasso (statistics) - Wikipedia], [Ridge regression - Wikipedia]
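As an illustration of the suggested answer, here is a minimal scikit-learn sketch of fitting Lasso and Ridge on one-hot-encoded categorical predictors. All data, category values, and alpha settings below are invented for the example:

```python
# Sketch: one-hot encode categorical predictors, then fit Lasso (L1) and
# Ridge (L2) regression on a continuous target. The data and the alpha
# values are invented for illustration only.
import numpy as np
from sklearn.linear_model import Lasso, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
X = rng.choice(["red", "green", "blue"], size=(200, 5))  # categorical features
y = rng.normal(size=200)                                 # continuous target

# One-hot encoding turns each categorical column into indicator columns,
# which the linear models can consume directly.
lasso = make_pipeline(OneHotEncoder(handle_unknown="ignore"), Lasso(alpha=0.1))
ridge = make_pipeline(OneHotEncoder(handle_unknown="ignore"), Ridge(alpha=1.0))

lasso.fit(X, y)
ridge.fit(X, y)
print(lasso.predict(X[:3]))
print(ridge.predict(X[:3]))
```

With the L1 penalty, Lasso can drive some one-hot coefficients exactly to zero (the feature-selection behavior described above), while Ridge shrinks all coefficients toward zero without eliminating any.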


Contribute your Thoughts:

Izetta
2 months ago
I'm just picturing a room full of data scientists arguing over the 'correct' answer like it's the meaning of life. Let's just put 'C' and 'E' and call it a day, shall we?
upvoted 0 times
Yuette
23 days ago
Sounds good to me. Let's keep it simple.
upvoted 0 times
Carylon
27 days ago
I agree, let's go with Lasso regression and Ridge regression.
upvoted 0 times
Kenny
1 month ago
C and E are good choices. They work well with categorical data.
upvoted 0 times
Terrilyn
2 months ago
Logistic regression? I must have missed the part where the question said the dependent variable was categorical. Somebody needs to brush up on their algorithm selection skills!
upvoted 0 times
Laurene
15 days ago
Logistic regression is not appropriate for this scenario.
upvoted 0 times
Tenesha
16 days ago
C) Lasso regression
upvoted 0 times
Rosalind
16 days ago
E) Ridge regression
upvoted 0 times
Myrtie
17 days ago
B) K-nearest neighbors
upvoted 0 times
Valentin
19 days ago
D) Logistic regression
upvoted 0 times
Hannah
19 days ago
C) Lasso regression
upvoted 0 times
Andra
28 days ago
B) K-nearest neighbors
upvoted 0 times
Malcom
1 month ago
A) K-means
upvoted 0 times
Ryan
2 months ago
I believe K-nearest neighbors can handle categorical data effectively.
upvoted 0 times
Tamra
2 months ago
I would also consider using K-nearest neighbors for this task.
upvoted 0 times
Estrella
2 months ago
K-nearest neighbors? Really? I don't think that's the best fit for a continuous dependent variable. Maybe I'm missing something here.
upvoted 0 times
Angella
1 month ago
D) Logistic regression
upvoted 0 times
Vashti
2 months ago
A) K-means
upvoted 0 times
Noble
2 months ago
I agree with Van, logistic regression is suitable for categorical features.
upvoted 0 times
Patrick
2 months ago
I'm feeling the vibe of Lasso regression and Ridge regression. They seem like they'd be great for handling all those categorical features.
upvoted 0 times
Hubert
3 months ago
Hmm, I'm not sure K-means is the way to go here. Isn't that more for clustering rather than prediction?
upvoted 0 times
Rusty
1 month ago
D) Logistic regression
upvoted 0 times
Lauran
1 month ago
C) Lasso regression
upvoted 0 times
Lennie
1 month ago
B) K-nearest neighbors
upvoted 0 times
Jamal
1 month ago
A) K-means
upvoted 0 times
Van
3 months ago
I think logistic regression would be a good choice.
upvoted 0 times
