
SAS Exam A00-405 Topic 9 Question 53 Discussion

Actual exam question for SAS's A00-405 exam
Question #: 53
Topic #: 9

Which option is the correct activation function for the output layer in a CNN model trained to classify an image as belonging to one of n classes (C1, C2, C3, ..., Cn)?

Suggested Answer: A
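The commenters below converge on the right idea: softmax maps the output layer's n raw scores (logits) to a probability distribution over the n classes, which is why it is the standard choice for multi-class classification (sigmoid suits binary or multi-label outputs, and ReLU belongs in hidden layers). A minimal sketch of softmax in plain Python, with illustrative logit values chosen here for demonstration:

```python
import math

def softmax(logits):
    """Convert raw class scores into probabilities that sum to 1."""
    # Subtract the max logit before exponentiating for numerical stability.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Example: three-class output layer with hypothetical logits.
probs = softmax([2.0, 1.0, 0.1])
# The probabilities sum to 1, and the largest logit gets the highest probability.
```

Because the outputs form a valid probability distribution, the predicted class is simply the index with the highest probability.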

Contribute your Thoughts:

Shawnta
5 months ago
Yes, ReLU is commonly used for hidden layers to introduce non-linearity.
upvoted 0 times
Olga
5 months ago
What about ReLU? Isn't it commonly used for hidden layers in CNNs?
upvoted 0 times
Dong
5 months ago
I agree, Softmax is used for multi-class classification tasks like in CNN models.
upvoted 0 times
Shawnta
5 months ago
I think the correct activation function is Softmax.
upvoted 0 times
William
6 months ago
I prefer Sigmoid as the activation function for the output layer; it helps in binary classification tasks.
upvoted 0 times
Elden
6 months ago
I believe Softmax is the correct activation function because it gives probabilities for each class.
upvoted 0 times
Glory
6 months ago
I would go with ReLU as the correct activation function for the output layer.
upvoted 0 times
Mable
7 months ago
I think the correct activation function for the output layer in a CNN model is Softmax.
upvoted 0 times