How does the temperature setting in a decoding algorithm influence the probability distribution over the vocabulary?
Comprehensive and Detailed In-Depth Explanation:
Temperature rescales the logits before the softmax during decoding. Increasing it (e.g., to 2.0) flattens the distribution, giving lower-probability words a better chance of being sampled and thus increasing diversity; Option C is correct. Option A exaggerates: high-probability words still have strong influence, just less dominance. Option B is backwards: decreasing temperature sharpens the distribution rather than broadening it. Option D is false: temperature directly alters the probability distribution, not decoding speed. This parameter controls output creativity.
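The effect can be sketched directly: logits are divided by the temperature before the softmax, so a larger temperature compresses differences between logits and flattens the resulting probabilities. A minimal illustration (the logit values below are hypothetical, not taken from any real model):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide logits by the temperature, then apply a numerically stable softmax."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max before exp() to avoid overflow
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical token logits

sharp = softmax_with_temperature(logits, temperature=0.5)  # low T: sharper, top token dominates
flat = softmax_with_temperature(logits, temperature=2.0)   # high T: flatter, more diverse sampling
```

At temperature 2.0 the top token's probability drops and the tail tokens gain mass, which is exactly why sampling becomes more diverse; at 0.5 the distribution concentrates on the top token.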
Reference: The OCI 2025 Generative AI documentation likely covers temperature effects under its decoding-parameters section.