An Introduction to Information Theory: Symbols, Signals, and Noise (original 1961; edition 1980)

by John R. Pierce

Members: 633 · Reviews: 2 · Popularity: 36,938 · Average rating: 3.92 · Conversations: None
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. Beginning with the origins of this burgeoning field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. An Introduction to Information Theory continues to be the most impressive nontechnical account available and a fascinating introduction to the subject for lay listeners.… (more)
Member: AndrewMcBurney
Title: An Introduction to Information Theory: Symbols, Signals, and Noise
Authors: John R. Pierce
Info: Dover Publications (1980), paperback
Collections: Your library
Rating:
Tags: Mathematics, Information Theory

Work Information

An Introduction to Information Theory: Symbols, Signals, and Noise by John R. Pierce (1961)

None
Loading...

Sign up for LibraryThing to find out whether you'll like this book.

No current Talk conversations about this book.

Showing 2 of 2
(Original Review, 1980-12-05)

Final answer to the question, "How many joules to send a bit?"

The unit of information is determined by the choice of the arbitrary scale factor K in Shannon's entropy formula:

S = -K SUM_i p_i ln(p_i)

If K is made equal to 1/ln(2), then S is said to be measured in "bits" of information. A common thermodynamic choice for K is kN, where N is the number of molecules in the system considered and k is Boltzmann's constant, 1.38e-23 joules per kelvin. With that choice, the entropy of statistical mechanics is expressed in joules per kelvin. The simplest thermodynamic system to which we can apply Shannon's equation is a single molecule that has an equal probability of being in either of two states, for example, an elementary magnet. In this case p = 0.5 for both states, and thus S = k ln(2). The removal of that much uncertainty corresponds to one bit of information. Therefore, a bit is equal to k ln(2), or approximately 1e-23 joules per kelvin. This is an important figure: the smallest thermodynamic entropy change that can be associated with a measurement yielding one bit of information.
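
To make that arithmetic concrete, here is a minimal Python sketch (my own addition, not from the original review) that evaluates Shannon's formula for the two-state molecule under both choices of K; the function and variable names are mine:

```python
import math

def shannon_entropy(probs, K):
    # S = -K * sum(p * ln p), Shannon's entropy with scale factor K
    return -K * sum(p * math.log(p) for p in probs if p > 0)

k = 1.38e-23              # Boltzmann's constant, joules per kelvin
two_states = [0.5, 0.5]   # one molecule, equally likely to be in either state

# K = 1/ln(2): S comes out in bits -> exactly 1 bit for this system
print(shannon_entropy(two_states, K=1 / math.log(2)))  # 1.0

# K = k (N = 1 molecule): S comes out in joules per kelvin -> k ln(2)
print(shannon_entropy(two_states, K=k))                # ~9.57e-24 J/K
```

Both results match the figures above: exactly one bit, and roughly 1e-23 joules per kelvin.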

The amount of energy needed to transmit a bit of information when limited by thermal noise at temperature T is:

E = kT ln(2)  (joules per bit)

This is derived in lucid fashion from Shannon's original work (1) on the capacity of a communications channel by Pierce (2), although it is not obvious that he was the first to derive it. This limit is the same as the amount of energy needed to store or read a bit of information in a computer, which Landauer derived (3) from entropy considerations without the use of Shannon's theorems. Pierce's book is reasonably readable. On page 192 he derives the energy-per-bit formula (Eq. 10.6), and on page 200 he describes a Maxwell's Demon engine that generates kT ln(2) of energy from a single molecule, showing that the Demon had to use that same amount of energy to "read" the position of the molecule. Then on page 177 Pierce points out that one way of approaching this ideal signalling rate is to concentrate the signal power in a single, short, powerful pulse and send this pulse in one of many possible time positions, each of which represents a different symbol. This is essentially the concept behind the patent (4) that led me to ask the original question. My thanks to those who helped with their replies.
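
As a numeric sanity check (again my own addition, not the review's), the Python sketch below evaluates kT ln(2) at an assumed room temperature of 300 K and shows the energy per bit of an ideal Shannon channel, P/C with C = W log2(1 + P/(kTW)), falling toward that limit as bandwidth grows; the signal power is an arbitrary assumption:

```python
import math

k = 1.38e-23   # Boltzmann's constant, J/K
T = 300.0      # assumed room temperature, kelvin
P = 1e-12      # assumed signal power, watts; the limit is independent of P

print(f"kT ln(2) at 300 K: {k * T * math.log(2):.3e} J/bit")  # ~2.87e-21

# Ideal channel with thermal noise power kTW in bandwidth W:
#   C = W * log2(1 + P / (kTW)), capacity in bits per second.
# Energy per bit is P / C; it approaches kT ln(2) as W -> infinity.
for W in (1e6, 1e9, 1e12, 1e15):
    C = W * math.log2(1 + P / (k * T * W))
    print(f"W = {W:.0e} Hz: {P / C:.3e} J/bit")
```

At wide bandwidths the energy per bit settles at about 2.9e-21 joules, which is kT ln(2) at 300 K.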

REFERENCES

1. C. E. Shannon, "A Mathematical Theory of Communication," Bell System Tech. J., Vol. 27, No. 3, pp. 379-423 and No. 4, pp. 623-656 (1948); reprinted in C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, Illinois (1949).
2. J. R. Pierce, Symbols, Signals and Noise, Harper, New York (1961).
3. R. Landauer, "Irreversibility and Heat Generation in the Computing Process," IBM J. Res. & Dev., Vol. 5, p. 183 (1961).
4. R. L. Forward, "High Power Pulse Time Modulation Communication System with Explosive Power Amplifier Means," U.S. Patent 3,390,334 (25 June 1968).

[2018 EDIT: This review was written at a time when I was running my own personal BBS server. Much of the language of this and other reviews written in 1980 reflects a very particular register: what I now call, in retrospect, “BBS language”.]
  antao | Nov 6, 2018 |
Brilliant and inspiring book. Enjoyed it immensely. Much use of highlighter.
  jaygheiser | Jul 23, 2008 |

Author name | Role | Type of author | Work? | Status
John R. Pierce | | primary author | all editions | calculated
Dorland, Cees van | Translator | secondary author | some editions | confirmed
Newman, James R. | Editor | secondary author | some editions | confirmed
Dedication
To Claude and Betty Shannon
First words
In 1948 Claude E. Shannon published a paper called "A Mathematical Theory of Communication";[sic] it appeared in book form in 1949.

References to this work on external resources.

Wikipedia in English (1)




Rating

Average: 3.92
2 stars: 1 · 3 stars: 9 · 3.5 stars: 3 · 4 stars: 9 · 5 stars: 10 (all other levels: 0)
