Introduction to Information Theory


Information Theory - Linear Block Codes
Jalal Al Roumy

Hamming distance
The intuitive concept of "closeness" of two words is formalized through the Hamming distance d(x, y) of words x, y. For two words (or vectors) x, y: d(x, y) = the number of positions in which x and y differ.
Example: d(10101, 01100) = 3 (the words differ in the first, second and fifth positions).

Properties of Hamming distance
(1) d(x, y) = 0 iff x = y
(2) d(x, y) = d(y, x)
(3) d(x, z) ≤ d(x, y) + d(y, z)   (triangle inequality)
An important parameter of a code C is its minimal distance
d(C) = min {d(x, y) | x, y ∈ C, x ≠ y},
because it gives the smallest number of errors needed to change one codeword into another.

Theorem (basic error-correcting theorem)
(1) A code C can detect up to s errors if d(C) ≥ s + 1.
(2) A code C can correct up to t errors if d(C) ≥ 2t + 1.
Note: for binary linear codes, d(C) = the smallest weight w(C) of a non-zero codeword.

Some notation
An (n, M, d)-code C is a code such that
n is the length of the codewords,
M is the number of codewords,
d is the minimum distance of C.
Example:
C1 = {00, 01, 10, 11} is a (2,4,1)-code.
C2 = {000, 011, 101, 110} is a (3,4,2)-code.
C3 = {00000, 01101, 10110, 11011} is a (5,4,3)-code.
Comment: A good (n, M, d)-code has small n and large M and d.
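As a quick illustration (a minimal Python sketch of my own, not part of the slides), the minimum distance of the example code C3 can be computed directly from the definition, and the detection/correction capabilities then follow from the theorem above:

```python
from itertools import combinations

def hamming_distance(x, y):
    """Number of positions in which the words x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def minimum_distance(code):
    """d(C) = minimum distance over all pairs of distinct codewords."""
    return min(hamming_distance(x, y) for x, y in combinations(code, 2))

C3 = ["00000", "01101", "10110", "11011"]
d = minimum_distance(C3)                         # 3 for this (5,4,3)-code
print("d(C3) =", d)
print("detects up to", d - 1, "errors")          # d >= s + 1  ->  s = d - 1
print("corrects up to", (d - 1) // 2, "errors")  # d >= 2t + 1 ->  t = (d - 1) // 2
```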

Code Rate
For a q-ary (n, M, d)-code we define the code rate, or information rate, R, by
R = (log_q M) / n.
The code rate represents the ratio of the number of input data symbols to the number of transmitted code symbols. For a Hadamard code with M = 64 codewords of length n = 32, for example, R = log_2(64) / 32 = 6/32. This is an important parameter for real implementations, because it shows what fraction of the bandwidth is being used to transmit actual data. Recall that log_2(n) = ln(n) / ln(2).

Equivalence of codes
Definition: Two q-ary codes are called equivalent if one can be obtained from the other by a combination of operations of the following types:
(a) a permutation of the positions of the code;
(b) a permutation of the symbols appearing in a fixed position.
Let a code be displayed as an M × n matrix. To what do operations (a) and (b) correspond?
Distances between codewords are unchanged by operations (a) and (b). Consequently, equivalent codes have the same parameters (n, M, d) and correct the same number of errors.
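A short sketch of my own showing the rate computation R = log_q(M) / n for the Hadamard figures above and for two of the earlier example codes:

```python
import math

def code_rate(n, M, q=2):
    """R = log_q(M) / n : fraction of transmitted symbols carrying data."""
    return math.log(M, q) / n

print(code_rate(32, 64))   # Hadamard example: log2(64)/32 = 6/32 = 0.1875
print(code_rate(2, 4))     # C1 = (2,4,1)-code: rate 1.0
print(code_rate(5, 4))     # C3 = (5,4,3)-code: rate 0.4
```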

Examples of equivalent codes
[The slide displays two small code arrays, one binary and one ternary, as examples of equivalent codes.]

Lemma: Any q-ary (n, M, d)-code over an alphabet {0, 1, ..., q-1} is equivalent to an (n, M, d)-code which contains the all-zero codeword 00...0.

The main coding theory problem
A good (n, M, d)-code has small n, large M and large d. The main coding theory problem is to optimize one of the parameters n, M, d for given values of the other two.
Notation: A_q(n, d) is the largest M such that there is a q-ary (n, M, d)-code.
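Computing A_q(n, d) exactly is hard in general. As a rough illustration (my own sketch, not from the lecture), a greedy construction produces some (n, M, d)-code and therefore gives only a lower bound A_2(n, d) ≥ M:

```python
from itertools import product

def hamming_distance(x, y):
    return sum(a != b for a, b in zip(x, y))

def greedy_code(n, d):
    """Greedily pick binary words of length n that keep pairwise distance >= d.
    The size of the result is only a lower bound on A_2(n, d)."""
    code = []
    for word in product("01", repeat=n):
        w = "".join(word)
        if all(hamming_distance(w, c) >= d for c in code):
            code.append(w)
    return code

C = greedy_code(5, 3)
print(len(C), C)   # a (5, M, 3)-code, so A_2(5, 3) >= M
```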

Introduction to linear codes
A linear code over GF(q) [a Galois field], where q is a prime power, is a subspace of the vector space V(n, q) for some positive integer n. C is a subspace of V(n, q) iff
(1) u + v ∈ C for all u, v ∈ C
(2) a·u ∈ C for all u ∈ C, a ∈ GF(q)
A binary code is linear iff the sum of any two codewords is a codeword. If C is a k-dimensional subspace of V(n, q), then the linear code is called an [n, k]-code, or an [n, k, d]-code if the distance is added. A q-ary [n, k, d]-code is a q-ary (n, q^k, d)-code, but of course not every (n, q^k, d)-code is an [n, k, d]-code. The all-zero vector 0 automatically belongs to a linear code. The weight w(x) of a vector x in V(n, q) is defined to be the number of non-zero entries of x. The minimum distance of a linear code is equal to the smallest of the weights of the non-zero codewords.

Linear Block Codes
Information is divided into blocks of length k; r parity bits (or check bits) are added to each block (total length n = k + r). The code rate is R = k/n. The decoder looks for the codeword closest to the received vector (code vector + error vector). There are tradeoffs between efficiency, reliability, and encoding/decoding complexity.
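A small sketch of my own checking the "sum of any two codewords is a codeword" criterion for a binary code, and confirming that the minimum distance equals the minimum non-zero weight; the [4,2]-code used here is the one that reappears in the decoding slides below:

```python
def xor_words(x, y):
    """Componentwise sum over GF(2) of two binary words given as strings."""
    return "".join(str(int(a) ^ int(b)) for a, b in zip(x, y))

def is_binary_linear(code):
    """A binary code is linear iff the sum of any two codewords is a codeword."""
    codeset = set(code)
    return all(xor_words(x, y) in codeset for x in code for y in code)

def minimum_weight(code):
    """Smallest number of non-zero entries among the non-zero codewords."""
    return min(w.count("1") for w in code if any(ch != "0" for ch in w))

C = ["0000", "1011", "0101", "1110"]
print(is_binary_linear(C))   # True
print(minimum_weight(C))     # 2, so d(C) = 2 for this code
```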

Linear Block Codes
Message vector m × generator matrix G = code vector C; code vector C × parity-check matrix H^T = null vector 0.

Operation of the generator matrix and the parity-check matrix
The parity-check matrix H is used to detect errors in the received code by using the fact that c · H^T = 0 (the null vector). Let x = c ⊕ e be the received message, where c is the correct codeword and e is the error. Compute
S = x · H^T = (c ⊕ e) · H^T = c · H^T ⊕ e · H^T = e · H^T.
If S is 0 then the message is correct; otherwise there are errors in it, and from the commonly known error patterns the correct message can be decoded.
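A minimal sketch of the syndrome computation over GF(2) (my own illustration; the parity-check matrix is the small H used in the decoding example later in these notes), verifying that the syndrome of the received word depends only on the error pattern:

```python
import numpy as np

H = np.array([[1, 0, 1, 0],
              [1, 1, 0, 1]])   # parity-check matrix of the [4,2]-code used later

c = np.array([1, 0, 1, 1])     # a codeword: c . H^T = 0 (mod 2)
e = np.array([0, 1, 0, 0])     # an error pattern
x = (c + e) % 2                # received word x = c + e

S_x = (x @ H.T) % 2            # syndrome of the received word
S_e = (e @ H.T) % 2            # syndrome of the error alone
print(S_x, S_e)                # identical: [0 1] [0 1]
print((c @ H.T) % 2)           # [0 0] -- codewords have zero syndrome
```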

Linear Block Codes
The codeword block of a linear block code is C = m·G, where m is the information word of block length k and G is the generator matrix. G = [I_k | P] is a k × n matrix, where I_k is the identity (unit) matrix. The parity-check matrix is H = [P^T | I_{n-k}], where P^T is the transpose of the matrix P.

Forming the generator matrix
The generator matrix is formed from the list of codewords by ignoring the all-zero vector and the linear combinations; for example:

[The slide lists the codewords of the [7, 4, 3] code C and, from four linearly independent codewords, obtains the generator matrix G.]
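Since the codeword table above did not survive extraction, here is a hedged sketch of the same idea on the smaller [4,2]-code used in the decoding slides: scan the codewords, skip the all-zero word, and keep only those that are not linear combinations of the rows already chosen.

```python
import numpy as np

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] = (M[r] + M[rank]) % 2
        rank += 1
    return rank

def generator_from_codewords(codewords):
    """Pick linearly independent codewords (over GF(2)) as rows of a generator matrix."""
    rows = []
    for w in codewords:
        v = np.array([int(ch) for ch in w]) % 2
        if not v.any():
            continue                                   # ignore the all-zero codeword
        if gf2_rank(np.array(rows + [v])) == len(rows) + 1:
            rows.append(v)                             # v is not a combination of earlier rows
    return np.array(rows)

C = ["0000", "1011", "0101", "1110"]                   # the [4,2]-code from the decoding slides
print(generator_from_codewords(C))                     # rows 1011 and 0101 form a generator G
```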

Equivalent linear [n, k]-codes
Two k × n matrices generate equivalent linear codes over GF(q) if one matrix can be obtained from the other by a sequence of operations of the following types:

(R1) permutation of the rows;
(R2) multiplication of a row by a non-zero scalar;
(R3) addition of a scalar multiple of one row to another;
(C1) permutation of the columns;
(C2) multiplication of any column by a non-zero scalar.
The row operations (R) preserve the linear independence of the rows of the generator matrix and simply replace one basis by another of the same code. The column operations (C) convert the generator matrix to one for an equivalent code.

Transforming the generator matrix
Transforming to the form G = [I_k | A]:

[The slide reduces the generator matrix step by step using these row and column operations; therefore G = [I_k | A].]
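The intermediate matrices on this slide were lost in extraction, so here is a sketch of my own (binary case, full-rank G assumed) that reduces a generator matrix to the standard form [I_k | A] using the row operations R1 and R3 and, when a pivot is missing, the column permutation C1:

```python
import numpy as np

def standard_form(G):
    """Reduce a binary generator matrix of full rank k to [I_k | A]."""
    G = G.copy() % 2
    k, n = G.shape
    col_order = list(range(n))
    for i in range(k):
        # find a pivot in column i at or below row i; otherwise swap in a later column (C1)
        pivot = next((r for r in range(i, k) if G[r, i]), None)
        if pivot is None:
            j = next(j for j in range(i + 1, n) if any(G[r, j] for r in range(i, k)))
            G[:, [i, j]] = G[:, [j, i]]
            col_order[i], col_order[j] = col_order[j], col_order[i]
            pivot = next(r for r in range(i, k) if G[r, i])
        G[[i, pivot]] = G[[pivot, i]]          # R1: swap rows
        for r in range(k):
            if r != i and G[r, i]:
                G[r] = (G[r] + G[i]) % 2       # R3: add row i to row r
    return G, col_order

G = np.array([[1, 0, 1, 1],
              [1, 1, 1, 0]])                   # a hypothetical generator, made up for illustration
Gstd, order = standard_form(G)
print(Gstd)    # rows 1011 and 0101: the standard form [I_2 | A]
print(order)   # [0, 1, 2, 3] -- no column swaps were needed here
```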

Encoding with the generator
Codewords = message vector u × G. For example, where

G =
1 0 0 0 1 0 1
0 1 0 0 1 1 1
0 0 1 0 1 1 0
0 0 0 1 0 1 1

then
0 0 0 0 is encoded as 0 0 0 0 0 0 0,
1 0 0 0 is encoded as 1 0 0 0 1 0 1,
1 1 1 0 is encoded as 1 1 1 0 1 0 0.
Note that 1110 is found by adding rows R1, R2 and R3 of G.

Parity-check matrix
A parity-check matrix H for an [n, k]-code C is an (n - k) × n matrix such that x · H^T = 0 iff x ∈ C. A parity-check matrix for C is a generator matrix for the dual code C⊥. If G = [I_k | A] is the standard-form generator matrix for an [n, k]-code C, then a parity-check matrix for C is H = [-A^T | I_{n-k}]. A parity-check matrix of the form [B | I_{n-k}] is said to be in standard form.

[The slide writes out the standard forms G = [I_k | A] and H = [-A^T | I_{n-k}] entry by entry, with entries a_11, ..., a_{k,n-k}.]
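A brief sketch of my own (binary case, so -A^T = A^T) constructing the parity-check matrix from a standard-form generator and checking that every codeword has zero syndrome, using the G from the encoding example above:

```python
import numpy as np
from itertools import product

G = np.array([[1, 0, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 0, 1, 1]])          # G = [I_4 | A] from the slide above
k, n = G.shape
A = G[:, k:]

# H = [-A^T | I_{n-k}]; over GF(2) the minus sign can be dropped
H = np.hstack([A.T % 2, np.eye(n - k, dtype=int)])

print((G @ H.T) % 2)                           # all zeros: G H^T = 0

# encode every 4-bit message u as u G (mod 2) and check its syndrome
for u in product([0, 1], repeat=k):
    c = (np.array(u) @ G) % 2
    assert not ((c @ H.T) % 2).any()           # every codeword has zero syndrome
print("all", 2 ** k, "codewords check out")
```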

Decoding using the Slepian matrix
An elegant nearest-neighbour decoding scheme was devised by Slepian in 1960. Every vector in V(n, q) is in some coset of C; every coset contains exactly q^k vectors; two cosets are either disjoint or coincide.
Let
G =
1 0 1 1
0 1 0 1
giving C = {0000, 1011, 0101, 1110}.

The standard (Slepian) array for C:
0000 1011 0101 1110   (the codewords)
1000 0011 1101 0110
0100 1111 0001 1010
0010 1001 0111 1100
The first column contains the coset leaders. When y is received (e.g. 1111), its position in the array is found. The decoder decides that the error is the coset leader of the row containing y, here 0100, and decodes x = y - e = 1011.
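A small sketch of my own that builds a standard array for the [4,2]-code above by repeatedly taking a minimum-weight vector not yet covered as the next coset leader. Coset leaders of equal weight are not unique, so the rows may appear with different leaders and in a different order than in the table above:

```python
from itertools import product

def xor(x, y):
    return "".join(str(int(a) ^ int(b)) for a, b in zip(x, y))

def standard_array(code, n):
    """Rows are cosets of the code; each row starts with a minimum-weight leader."""
    rows = [list(code)]
    covered = set(code)
    remaining = ["".join(v) for v in product("01", repeat=n) if "".join(v) not in covered]
    remaining.sort(key=lambda v: v.count("1"))        # lightest vectors first
    while remaining:
        leader = remaining[0]
        coset = [xor(leader, c) for c in code]        # the coset leader + C
        rows.append(coset)
        covered.update(coset)
        remaining = [v for v in remaining if v not in covered]
    return rows

C = ["0000", "1011", "0101", "1110"]
for row in standard_array(C, 4):
    print(" ".join(row))
# the first entry of each row is the coset leader used for decoding
```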

Syndrome decoding
Suppose C is a q-ary [n, k]-code with parity-check matrix H. For any vector y ∈ V(n, q), the row vector S(y) = y·H^T is called the syndrome of y. Two vectors have the same syndrome iff they lie in the same coset. For our example,
G =
1 0 1 1
0 1 0 1
H =
1 0 1 0
1 1 0 1
The syndromes of the coset leaders from our example are S(0000) = 00, S(1000) = 11, S(0100) = 01, S(0010) = 10. This gives the syndrome look-up table:
syndrome x : coset leader f(x)
00 : 0000
11 : 1000
01 : 0100
10 : 0010

Decoding procedure
The rules:
Step 1: For a received vector y, calculate S(y) = y·H^T.
Step 2: Let x = S(y), and locate x in the first column of the look-up table.
Step 3: Decode y as y - f(x).
For example, if y = 1111, then S(y) = 01 and we decode y as 1111 - 0100 = 1011.
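Putting the three steps together in a sketch of my own for the [4,2]-code: build the syndrome table from the coset leaders, then decode a received vector.

```python
import numpy as np

H = np.array([[1, 0, 1, 0],
              [1, 1, 0, 1]])                      # parity-check matrix from above

def syndrome(y):
    return tuple((np.array(y) @ H.T) % 2)

# syndrome look-up table: syndrome -> coset leader
leaders = [(0, 0, 0, 0), (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)]
table = {syndrome(e): np.array(e) for e in leaders}

def decode(y):
    y = np.array(y)
    e = table[syndrome(y)]                        # most likely error pattern f(S(y))
    return (y - e) % 2                            # decode y as y - f(S(y))

print(decode([1, 1, 1, 1]))                       # [1 0 1 1], as in the slide
```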

Example
Let
G =
1 0 0 1
0 1 1 1
There are 2^2 = 4 codewords and 2^4 = 16 vectors in V(4, 2). The Slepian matrix is
0000 1001 0111 1110
1000 0001 1111 0110
0100 1101 0011 1010
0010 1011 0101 1100
so 0011 is decoded as 0111, and 1110 as 1110. The parity-check matrix is
H =
0 1 1 0
1 1 0 1
Syndrome look-up table:

With S(y) = y·H^T:
S(y) : coset leader y
00 : 0000
01 : 1000
11 : 0100
10 : 0010
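The same sketch applied to this last example (G = [1 0 0 1; 0 1 1 1], H = [0 1 1 0; 1 1 0 1]) reproduces the decodings given on the slide; this is my own check, not part of the lecture:

```python
import numpy as np

G = np.array([[1, 0, 0, 1],
              [0, 1, 1, 1]])
H = np.array([[0, 1, 1, 0],
              [1, 1, 0, 1]])
assert not ((G @ H.T) % 2).any()                  # H really is a parity-check matrix for G

leaders = [(0, 0, 0, 0), (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)]
table = {tuple((np.array(e) @ H.T) % 2): np.array(e) for e in leaders}

def decode(y):
    y = np.array(y)
    return (y - table[tuple((y @ H.T) % 2)]) % 2

print(decode([0, 0, 1, 1]))                       # [0 1 1 1] -> 0111, as on the slide
print(decode([1, 1, 1, 0]))                       # [1 1 1 0] -> already a codeword
```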
