Q1(i). Find the parity check matrix.
A: The given generator matrix G is in standard form, G = [I | P], so we can read off the submatrix P directly and take its transpose P^T.
Since the code is over GF(4), a field of characteristic 2, every element is its own additive inverse, so -P^T = P^T.
The parity check matrix is therefore H = [-P^T | I] = [P^T | I].
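The question's G is not reproduced in this post, so as a sanity check the sketch below uses a hypothetical P over GF(4) (it happens to generate the (5,3) Hamming code over GF(4)) and verifies that H = [P^T | I] satisfies G * H^T = 0:

```python
import numpy as np

# GF(4) = {0, 1, 2, 3} with 2 = w and 3 = w + 1, where w^2 = w + 1.
# Addition is bitwise XOR; multiplication uses a lookup table.
MUL = np.array([[0, 0, 0, 0],
                [0, 1, 2, 3],
                [0, 2, 3, 1],
                [0, 3, 1, 2]])

def gf4_matmul(A, B):
    """Matrix product over GF(4)."""
    C = np.zeros((A.shape[0], B.shape[1]), dtype=int)
    for i in range(A.shape[0]):
        for j in range(B.shape[1]):
            acc = 0
            for k in range(A.shape[1]):
                acc ^= MUL[A[i, k], B[k, j]]   # addition in GF(4) is XOR
            C[i, j] = acc
    return C

# Hypothetical P; the real P comes from the G given in the question.
P = np.array([[1, 1],
              [1, 2],
              [1, 3]])
G = np.hstack([np.eye(3, dtype=int), P])    # G = [I | P]
H = np.hstack([P.T, np.eye(2, dtype=int)])  # H = [-P^T | I] = [P^T | I] in characteristic 2
print(gf4_matmul(G, H.T))                   # all zeros, so H is a valid parity check matrix
```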
Q1(ii) How many errors can this code detect?
A: According to the Singleton bound, d* <= n - k + 1.
Applying n=5, k=3, the minimum distance d* satisfies
d* <= 5 - 3 + 1
d* <= 3
so the minimum distance is at most 3; the remaining parts take d* = 3, i.e. the bound is attained.
Under nearest neighbour decoding, a code detects t errors provided
d* >= t + 1.
Applying d* = 3:
3 >= t + 1
t <= 2
The code can detect up to 2 errors.
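Part (ii)'s conclusion rests on d* actually equalling 3. A brute-force sketch, reusing the hypothetical P from part (i)'s example, enumerates all 4^3 = 64 codewords and confirms it:

```python
from itertools import product
import numpy as np

MUL = np.array([[0, 0, 0, 0],
                [0, 1, 2, 3],
                [0, 2, 3, 1],
                [0, 3, 1, 2]])
P = np.array([[1, 1], [1, 2], [1, 3]])   # same hypothetical P as in part (i)

def encode(m):
    """Systematic encoding c = [m | m.P] over GF(4); addition is XOR."""
    parity = [0, 0]
    for i in range(3):
        for j in range(2):
            parity[j] ^= MUL[m[i], P[i, j]]
    return list(m) + parity

# For a linear code, d* is the minimum weight over all nonzero codewords.
dmin = min(sum(x != 0 for x in encode(m))
           for m in product(range(4), repeat=3) if any(m))
print(dmin, "-> detects up to", dmin - 1, "errors")   # 3 -> detects up to 2 errors
```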
Q1(iii) How many errors can this code correct?
A: A code corrects t errors provided d* >= 2t + 1.
Applying d* = 3, the number of correctable errors t satisfies
3 >= 2t + 1
t <= 1
The code can correct up to 1 error.
Q1(iv) How many erasures can this code correct?
A: A code corrects e erasures provided d* >= e + 1.
Applying d* = 3, the number of correctable erasures e satisfies
3 >= e + 1
e <= 2
The code can correct up to 2 erasures.
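All three counts in parts (ii)-(iv) follow mechanically from the minimum distance; collected in one line each:

```python
d = 3                                          # minimum distance from part (ii)
print("detectable errors:   ", d - 1)          # 2
print("correctable errors:  ", (d - 1) // 2)   # 1
print("correctable erasures:", d - 1)          # 2
```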
Q1(v) Is this a perfect code?
A: A code is perfect when it satisfies the sphere-packing (Hamming) bound with equality:
M * [C(n,0)(q-1)^0 + C(n,1)(q-1)^1 + ... + C(n,r)(q-1)^r] = q^n
Apply n=5, q=4 for GF(4), d*=3, k=3.
M = q^k = 4^3 = 64
r = (d* - 1)/2 = (3-1)/2 = 1
Left Hand Side: 64 * [C(5,0)*3^0 + C(5,1)*3^1] = 64 * (1 + 15) = 64 * 16 = 1024
Right Hand Side: q^n = 4^5 = 1024
Since Left Hand Side = Right Hand Side, it is a perfect code.
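The two sides are easy to confirm numerically; a minimal check with the values above:

```python
from math import comb

n, k, q, r = 5, 3, 4, 1
lhs = q**k * sum(comb(n, i) * (q - 1)**i for i in range(r + 1))
rhs = q**n
print(lhs, rhs, lhs == rhs)   # 1024 1024 True -> perfect code
```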
2. Consider the channel model shown in figure 1, which accepts three symbols. The first symbol, A, is not affected by noise, while the second (B) and third (C) symbols have a probability p of not being corrupted and a probability q of being changed into the other of the pair. Let a = -p log p - q log q, and let P be the probability that the first symbol is chosen. Furthermore, let Q be the probability that either of the other two is chosen, so that P + 2Q = 1.
(a) Show that H(X) = -P log P - 2Q log Q.
(b) Show that H(X|Y) = 2Qa.
(c) State how you would determine the capacity of this channel.
X is the set of input symbols, while Y is the set of output symbols.
A:
(a) The input probabilities are:
P(X=A) = P
P(X=B) = Q
P(X=C) = Q
Each symbol contributes -P(x) log P(x) to the entropy of the input, so
H(X) = -P log P - Q log Q - Q log Q = -P log P - 2Q log Q
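A quick numerical check of the identity; the value of P below is an arbitrary example (any P with P + 2Q = 1 behaves the same):

```python
from math import log2

P = 0.4                    # example value; Q follows from P + 2Q = 1
Q = (1 - P) / 2
H_direct = -sum(px * log2(px) for px in [P, Q, Q])
H_formula = -P * log2(P) - 2 * Q * log2(Q)
print(H_direct, H_formula)  # identical
```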
(b) Input probabilities: P(X=A) = P, P(X=B) = Q, P(X=C) = Q.
Since B and C are each either received correctly (probability p) or swapped with the other (probability q), p + q = 1.
Output probabilities:
P(Y=A) = P
P(Y=B) = Qp + Qq = Q(p+q)
P(Y=C) = Qq + Qp = Q(p+q)
The conditional (backward) probabilities follow from P(X|Y) = P(X,Y) / P(Y):
P(X=A|Y=A) = P(X=A,Y=A) / P(Y=A) = P/P = 1
P(X=B|Y=B) = P(X=B,Y=B) / P(Y=B) = Qp/Q(p+q) = p/(p+q)
P(X=B|Y=C) = P(X=B,Y=C) / P(Y=C) = Qq/Q(p+q) = q/(p+q)
P(X=C|Y=B) = P(X=C,Y=B) / P(Y=B) = Qq/Q(p+q) = q/(p+q)
P(X=C|Y=C) = P(X=C,Y=C) / P(Y=C) = Qp/Q(p+q) = p/(p+q)
H(X|Y=A) = -P(X=A|Y=A) log P(X=A|Y=A) = -(1 log 1) = 0
Since p + q = 1, the conditional probabilities reduce to p and q, so
H(X|Y=B) = H(X|Y=C) = -p log p - q log q = a
H(X|Y) = P(Y=A)H(X|Y=A) + P(Y=B)H(X|Y=B) + P(Y=C)H(X|Y=C)
= P*0 + Q(p+q)*a + Q(p+q)*a
= 2Qa
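Numerically, the two expressions agree whenever p + q = 1. Below is a small check; the values of p, q and P are arbitrary examples, since the question leaves them symbolic:

```python
from math import log2

p, q = 0.9, 0.1            # example channel parameters with p + q = 1
P = 0.4
Q = (1 - P) / 2
a = -p * log2(p) - q * log2(q)

# H(X|Y=B) = H(X|Y=C), computed directly from the conditional probabilities
h = -(p/(p+q)) * log2(p/(p+q)) - (q/(p+q)) * log2(q/(p+q))
H_XY = P * 0 + Q * (p + q) * h + Q * (p + q) * h
print(H_XY, 2 * Q * a)     # equal, since p + q = 1 makes h = a
```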
(c) The mutual information is
I(X;Y) = H(X) - H(X|Y) = -P log P - 2Q log Q - 2Qa = -P log P - 2Q(log Q - p log p - q log q)
The capacity is the maximum of I(X;Y) over the input distribution. Using the constraint P + 2Q = 1, substitute Q = (1-P)/2, set dI/dP = 0, and solve for the maximizing P; the resulting value of I(X;Y) is the channel capacity.
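As a sketch of that procedure, a crude grid search over P (same example p, q as before) locates the maximizing input distribution; an exact answer would instead solve dI/dP = 0:

```python
from math import log2

p, q = 0.9, 0.1            # example channel parameters with p + q = 1
a = -p * log2(p) - q * log2(q)

def I(P):
    """Mutual information I(X;Y) as a function of the input probability P."""
    Q = (1 - P) / 2
    return -P * log2(P) - 2 * Q * log2(Q) - 2 * Q * a

# capacity = max of I(X;Y) over the input distribution, found here by grid search
C, P_best = max((I(P / 1000), P / 1000) for P in range(1, 1000))
print(C, P_best)
```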
3. Consider the (23,12,7) binary code. What will be the approximate word error rate if it is used over a binary symmetric channel (BSC) with probability of bit error p=0.001?
A: For the given code (n, k, d*) = (23, 12, 7): n = 23, k = 12, d* = 7.
d* >= 2t + 1
7 >= 2t + 1
t <= 3
This code can correct up to 3 errors. In other words, a decoded word is in error only when 4 or more of the 23 bits are corrupted, so
P(word error) is approximately sum_{i=4}^{23} C(23,i) p^i (1-p)^(23-i), which is dominated by the i = 4 term:
C(23,4) p^4 (1-p)^19 = 8855 * (0.001)^4 * (0.999)^19, i.e. about 8.7 * 10^-9.
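The same computation in a few lines; the sum is dominated by its first (i = 4) term:

```python
from math import comb

n, t, p = 23, 3, 0.001
P_word = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))
print(P_word)   # ~8.7e-9, essentially C(23,4) * p^4 * (1-p)^19
```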