## Discussion Forum

Que. | Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} with probabilities {1/2, 1/2}. The channel matrix is the standard BSC matrix with transition probability p, i.e. P(y|x) = 1 − p when y = x and p when y ≠ x. Then the conditional entropy H(Y|X) is given by:

a. 1

b. -p log(p) - (1-p) log(1-p)

c. 1 + p log(p) + (1-p) log(1-p)

d. 0

Answer: b. -p log(p) - (1-p) log(1-p)

For equiprobable inputs over a BSC, each row of the channel matrix has the same entropy H(p) = -p log(p) - (1-p) log(1-p), so H(Y|X) = H(p). Option c, 1 - H(p), is the mutual information I(X;Y), not the conditional entropy.
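A minimal sketch verifying the answer numerically, assuming the standard BSC channel matrix [[1-p, p], [p, 1-p]] and equiprobable inputs; the function names are illustrative, not from the question:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with 0 log 0 taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_conditional_entropy(p):
    """H(Y|X) = sum_x P(x) * sum_y P(y|x) * (-log2 P(y|x))."""
    channel = [[1 - p, p], [p, 1 - p]]  # standard BSC matrix (assumed)
    px = [0.5, 0.5]                     # equiprobable inputs, as stated
    h = 0.0
    for x, row in enumerate(channel):
        for py in row:
            if py > 0:
                h -= px[x] * py * math.log2(py)
    return h

p = 0.1
print(bsc_conditional_entropy(p))  # matches binary_entropy(0.1) ≈ 0.4690
print(binary_entropy(p))
```

Because both rows of the matrix are permutations of (1-p, p), the input distribution drops out and H(Y|X) reduces to the binary entropy of p, confirming option b.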