
Mathematical Foundations of Information Theory (Dover Books on Mathematics) (original 1953; edition 1957)

by A. Ya. Khinchin (Author)

Members: 157 · Reviews: 2 · Popularity: 175,000 · Average rating: 3.67 · Mentions: 1
The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.… (more)
Member: guerriliteracy
Title: Mathematical Foundations of Information Theory (Dover Books on Mathematics)
Authors: A. Ya. Khinchin (Author)
Info: Dover Publications (1957), Edition: 1st Dover Edition, 128 pages
Collections: Your library
Tags: None

Work Information

Mathematical Foundations of Information Theory by A. Ya. Khinchin (1953)



This book develops information theory somewhat further than Claude Shannon did. Although Khinchin praises Shannon for originating the ideas of information theory on his own, he notes that the cases Shannon presented were deliberately limited in scope to simplify the solutions.

The book is divided into two major sections: the first is called "The Entropy Concept in Probability Theory" and the second "On the Fundamental Theorems of Information Theory." Both were originally papers published in Russian-language academic journals.
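The book itself contains no code, but the central quantity of the first essay, Shannon's entropy of a finite probability distribution, H = −Σ pᵢ log₂ pᵢ, is easy to illustrate. A minimal sketch (function name is my own):

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.

    Terms with p == 0 contribute nothing, by the usual convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))      # → 1.0
# A certain outcome carries none:
print(shannon_entropy([1.0, 0.0]))      # → 0.0
```

Entropy is maximized by the uniform distribution (log₂ n bits over n outcomes) and vanishes exactly when one outcome is certain, which is the starting point of Khinchin's axiomatic treatment.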

The book was interesting, and once again I picked a short one: only about 120 pages long.
  Floyd3345 | Jun 15, 2019 |

References to this work on external resources.

Wikipedia in English (1)


Rating

Average: 3.67 — distribution: 3 stars: 4 · 4 stars: 4 · 5 stars: 1
