by Steven Roman (Author)
This book is an introduction to coding theory and information theory for undergraduate students of mathematics and computer science. Among the topics it discusses are a review of probability theory, the efficiency of codes, the capacity of communication channels, coding and decoding in the presence of errors, the general theory of linear codes, and examples of specific codes used in ordinary communications as well as in cryptography.
Back Jacket
This book is an introduction to coding and information theory, with an emphasis on coding theory. It is suitable for undergraduates with a modest mathematical background. While some previous knowledge of elementary linear algebra is helpful, it is not essential; all of the needed elementary discrete probability is developed in a preliminary chapter. After the preliminary chapter comes an introductory chapter on variable-length codes that culminates in Kraft's Theorem. Two chapters on information theory follow: the first on Huffman encoding and the second on the entropy of an information source, culminating in a discussion of Shannon's Noiseless Coding Theorem. The remaining four chapters cover the theory of error-correcting block codes. The first of these covers communication channels, decision rules, nearest neighbor decoding, perfect codes, the main coding theory problem, the sphere-packing, Singleton, and Plotkin bounds, and a brief discussion of the Noisy Coding Theorem. There follows a chapter on linear codes that begins with a discussion of vector spaces over a finite field. The penultimate chapter is devoted to a study of the Hamming, Golay, and Reed-Muller families of codes, along with some decimal codes and some codes obtained from Latin squares. The final chapter contains a brief introduction to cyclic codes.
Number of Pages: 326
Dimensions: 0.87 x 9.55 x 7.21 in
Illustrated: Yes
Publication Date: November 26, 1996