Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy and mutual information, the AEP, entropy rates of stochastic processes, data compression, and the duality of data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information...

Format: Hardcover

Language: English

ISBN: 0471062596

ISBN13: 9780471062592

Release Date: August 1991

Publisher: John Wiley & Sons, Incorporated

Length: 576 Pages

Weight: 0.09 lbs.

Dimensions: 1.0" x 6.5" x 9.6"

5 ratings

Published by Thriftbooks.com User, 14 years ago

I am writing this review in response to some confusion and unfairness I see in other reviews. Cover and Thomas have written a unique and ambitious introduction to a fascinating and complex subject; their book must be judged fairly and not compared to other books that have entirely different goals.

Claude Shannon provided a working definition of "information" in his seminal 1948 paper, A Mathematical Theory of Communication. Shannon's interest in that and subsequent papers was the attainment of reliable communication in noisy channels. The definition of information that Shannon gave was perfectly fitted to this task; indeed, it is easily shown that in the context studied by Shannon, the only meaningful measure of information content that will apply to random variables with known distribution must be (up to a multiplicative constant) of the now-familiar form h(p) = log(1/p). However, Shannon freely admitted that his definition of information was limited in scope and was never envisioned as being universal. Shannon deliberately avoided the "murkier" aspects of human communication in framing his definitions; problematic themes such as knowledge, semantics, motivations and intentions of the sender and/or receiver, etc., were avoided altogether.

For several decades, Information Theory continued to exist as a subset of the theory of reliable communication. Some classical and highly regarded texts on the subject are Gallager, Ash, Viterbi and Omura, and McEliece. For those whose interest in Information Theory is motivated largely by questions from the field of digital communications, these texts remain unrivalled standards; Gallager, in particular, is so highly regarded by those who learned from it that it is still described as superior to many of its more recent, up-to-date successors. In recent decades, Information Theory has been applied to problems from across a wide array of academic disciplines.
Physicists have been forced to clarify the extent to which information is conserved in order to completely understand black hole dynamics; biologists have found extensive use of Information Theoretic concepts in understanding the human genome; computer scientists have applied Information Theory to complex issues in computational vs. descriptive complexity (the Mandelbrot set, which has been called the most complex set in all of mathematics, is actually extremely simple from the point of view of Kolmogorov complexity); and John von Neumann's brilliant creation, game theory, which has been called "a universal language for the unification of the behavioral sciences," is intimately coupled to Information Theory, perhaps in ways that have not yet been fully appreciated or explored.

Cover and Thomas' book "Elements of Information Theory" is written for the reader who is interested in these eclectic and exciting applications of Information Theory. This book does NOT treat Information Theory as a subset of reliable communication theory; ther
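The reviewer's remark about h(p) = log(1/p) is easy to make concrete. The sketch below is purely illustrative (the function names are mine, not anything from the book): it computes the information content of a single outcome and the entropy H(X) = Σ p·log2(1/p), i.e. the expected information content in bits.

```python
import math

def surprise(p):
    """Shannon information content of an outcome with probability p:
    h(p) = log2(1/p), measured in bits."""
    return math.log2(1.0 / p)

def entropy(dist):
    """Entropy H(X) = sum over outcomes of p * log2(1/p): the
    expected surprise of a draw from the distribution, in bits."""
    return sum(p * surprise(p) for p in dist if p > 0)

# A fair coin carries exactly one bit of information per flip,
# while a biased coin carries less: its outcomes are more predictable.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469
```

Rarer outcomes are more surprising (larger log(1/p)), and entropy is just the average of that surprise, which is why the uniform distribution maximizes it.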

Published by Thriftbooks.com User, 15 years ago

The preface of this book says, 'This is intended to be a simple and accessible book on information theory.' That's true, but it is aimed at the senior or early graduate level, where a theoretical background is needed for computer science, communications engineering, applied mathematics or similar fields. The mathematical nature of the book means that the student should at least have a background through calculus and a couple of upper-level courses in statistics/probability. After all, Information Theory is generally considered to be a branch of applied mathematics. On the whole, the writing style of the book (other than the equations) is rather light and entertaining. For instance, his discussion of the similarities between gambling and data compression brings a rather complex notion down to something we can identify with - that's before he gets into the math, of course. One complaint about the first edition of the book was that it didn't have enough problems for the student. This has been solved by the addition of a couple of hundred additional problems. There is also a dedicated web site for this book with more material, including solutions to selected problems.
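The gambling/compression parallel the reviewer mentions can be sketched in a few lines. Under the standard textbook setup (an m-horse race with uniform fair odds of m-for-1 and proportional betting), the best achievable doubling rate of wealth works out to log2(m) minus the entropy of the win distribution; the code below is my own illustration of that relationship, not material from the book.

```python
import math

def entropy(dist):
    """H(p) = sum p * log2(1/p), in bits."""
    return sum(p * math.log2(1.0 / p) for p in dist if p > 0)

def optimal_doubling_rate(dist):
    """Doubling rate for an m-horse race with uniform m-for-1 odds,
    betting proportionally (b = p): W* = log2(m) - H(p).
    Wealth grows fastest when the race is most predictable,
    just as predictable data compresses best."""
    m = len(dist)
    return math.log2(m) - entropy(dist)

# A perfectly unpredictable 4-horse race (uniform winner) yields no edge;
# a skewed one does.
print(optimal_doubling_rate([0.25, 0.25, 0.25, 0.25]))   # 0.0
print(optimal_doubling_rate([0.5, 0.25, 0.125, 0.125]))  # 0.25
```

Every bit of entropy removed from the race outcome is a bit of exponential growth rate added to the gambler's wealth, which is the duality the description above calls "data compression and the growth rate of wealth."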

Published by Thriftbooks.com User, 20 years ago

If you want a single book on information theory, you must have this one. Tom Cover's writing style is great. He wrote a clear, concise, and understandable book on a somewhat involved topic. The notation is neat and clean, something difficult to find in Information Theory books. The problems at the end of each chapter are very useful and a great aid to mastering the material. There is also a summary at the end of each chapter with the main definitions and results just studied. The intuition behind important Information Theory quantities such as entropy and mutual information is well explained through several toy examples and Venn diagrams. The channel coding theorem, for example, is thoroughly discussed. There is also a sketchy and intuitive proof of the capacity of the AWGN channel based on a sphere-packing argument, which is given before the more rigorous one comes along. This is the standard text on information theory, deservedly so.
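The sphere-packing argument the reviewer mentions leads to the well-known closed form for AWGN channel capacity, C = ½·log2(1 + SNR) bits per channel use. A one-line illustration (the function name is my own, hypothetical):

```python
import math

def awgn_capacity(snr):
    """Capacity of the additive white Gaussian noise channel,
    in bits per channel use: C = (1/2) * log2(1 + SNR),
    where SNR is the signal-to-noise power ratio P/N."""
    return 0.5 * math.log2(1.0 + snr)

# At SNR = 15 (about 11.8 dB) the channel supports 2 bits per use.
print(awgn_capacity(15.0))  # 2.0
```

The sphere-packing intuition: count how many noise spheres of radius sqrt(n·N) fit inside the signal sphere of radius sqrt(n·(P+N)); the log of that count per dimension gives the same ½·log2(1 + P/N).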

Published by Thriftbooks.com User, 21 years ago

Thomas Cover is a well-known researcher, both for his excellent and sometimes surprising work in information theory and for his reputation as a teacher. The result here is a very well-written and gentle "overview" of information theory that is designed as a comprehensive introduction to the subject.

One thing to note about this book is that it is by design both an introduction and a survey of information theory, as the title suggests. It starts off with the basic concepts of information theory such as entropy and mutual information, and continues on with brief and gentle reviews of more intermediate topics such as entropy rates in random processes and an introduction to coding, and finally with the channel coding theorem, rate-distortion theorem, network information theory, and other more advanced topics.

While I find his treatment of the intermediate and advanced topics to be excellent, there are a few weak aspects in this book's treatment of the introductory topics here and there. However, with just a little persistence the reader will be well rewarded by Cover's excellent writing. At each topic, the reader is presented with reason, motivation, intuition and example before delving into the rigorous treatment of the subject. Therefore even the most casual reader will be rewarded with good insights into the different topics in information theory.

That all said, I highly recommend this book to anybody armed with elementary probability who is interested in the general area of communication, signal processing and information theory. Readers who are allergic to math are recommended to start with J.R. Pierce's "Introduction to Information Theory", and readers looking for a casual introduction to the fundamental concepts in information theory are recommended to find a copy of A. Renyi's hard-to-find "A Diary on Information Theory".

Published by Thriftbooks.com User, 22 years ago

Cover and Thomas have written an excellent book on information theory. This book is suitable for an introductory-type course, but the entire book probably cannot be covered at once. All of the explanations and proofs are very understandable, although the section on types can be a tad confusing at times! I would not recommend this book to someone with a casual interest in information theory, but rather to someone who wants a more rigorous treatment of the underlying mathematics.