
Discussion Forum

Que. Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} with associated probabilities {1/2, 1/2}. The channel matrix, where p is the transition probability, is

    P(Y|X) = | 1-p    p  |
             |  p    1-p |

Then the conditional entropy H(Y|X) is given by:
a. 1
b. -p log(p) - (1-p) log(1-p)
c. 1 + p log(p) + (1-p) log(1-p)
d. 0
Answer: b. -p log(p) - (1-p) log(1-p)
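
Explanation: each row of the BSC channel matrix has the same entropy H(p), so averaging over the inputs gives H(Y|X) = sum over x of P(x) H(Y|X=x) = -p log(p) - (1-p) log(1-p), independent of the input distribution. Below is a minimal Python sketch that checks this numerically; the function name conditional_entropy_bsc and the example value p = 0.1 are illustrative and not part of the original question.

import math

def conditional_entropy_bsc(p):
    """H(Y|X) in bits for a BSC with transition probability p.

    Each row of the channel matrix [[1-p, p], [p, 1-p]] has the same
    entropy H(p), so averaging over the (equiprobable) inputs gives
    H(Y|X) = -p*log2(p) - (1-p)*log2(1-p).
    """
    if p in (0.0, 1.0):  # 0*log(0) is taken as 0 by convention
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Example: p = 0.1 gives H(Y|X) ~ 0.469 bits. For equiprobable inputs
# H(Y) = 1 bit, so the mutual information is I(X;Y) = 1 - H(p).
print(conditional_entropy_bsc(0.1))      # ~ 0.4690
print(1 - conditional_entropy_bsc(0.1))  # I(X;Y) ~ 0.5310 bits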
