Mathematical Foundations of Information Theory (1953)

by A. Ya. Khinchin

Members: 127 · Reviews: 1 · Popularity: 141,242 · Average rating: 3.63 · Mentions: 1


Contents:
"The Entropy Concept in Probability Theory"
  1. Entropy of Finite Schemes
  2. The Uniqueness Theorem
  3. Entropy of Markov chains
  4. Fundamental Theorems
  5. Application to Coding Theory
"On the Fundamental Theorems of Information Theory"
  Introduction
  Chapter I. Elementary Inequalities
    1. Two generalizations of Shannon's inequality
    2. Three inequalities of Feinstein
  Chapter II. Ergodic Sources
    3. Concept of a source. Stationarity. Entropy
    4. Ergodic Sources
    5. The E property. McMillan's theorem
    6. The martingale concept. Doob's theorem
    7. Auxiliary propositions
    8. Proof of McMillan's theorem
  Chapter III. Channels and the sources driving them
    9. Concept of channel. Noise. Stationarity. Anticipation and memory
    10. Connection of the channel to the source
    11. The ergodic case
  Chapter IV. Feinstein's Fundamental Lemma
    12. Formulation of the problem
    13. Proof of the lemma
  Chapter V. Shannon's Theorems
    14. Coding
    15. The first Shannon theorem
    16. The second Shannon theorem
"Conclusion"
"References"
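The opening sections treat the entropy of a "finite scheme" — a complete set of outcomes with their probabilities. As a minimal illustration (my own sketch, not Khinchin's notation), the quantity in question is H = -Σ pᵢ log₂ pᵢ:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i) of a finite scheme,
    with the usual convention 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per trial; a biased coin carries less,
# and a certain outcome carries none.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
print(entropy([1.0]))        # 0.0
```

The uniqueness theorem of section 2 shows that (up to a constant factor) this is the only continuous, symmetric function of the pᵢ with the natural grouping property.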


Information theory as a mathematical discipline. Doob, Feinstein, and Shannon.
  bnielsen | Jan 9, 2017 |
References to this work on external resources: Wikipedia in English (1).


Amazon.com Product Description (ISBN 0486604349, Paperback)

The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition. 

(retrieved from Amazon Thu, 12 Mar 2015 18:13:21 -0400)
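Among the topics the description lists, the entropy of Markov chains extends the finite-scheme notion to sources with memory: the entropy rate weights each state's row entropy by the stationary probability of that state. A hedged sketch (the transition matrix is an invented example, and the stationary distribution is found by plain power iteration rather than any method of the book):

```python
import math

def stationary(P, iters=1000):
    """Approximate the stationary distribution of transition matrix P
    by repeated application of P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Entropy per symbol of a stationary Markov source:
    sum_i pi_i * H(row_i), in bits."""
    pi = stationary(P)
    row_H = [-sum(p * math.log2(p) for p in row if p > 0) for row in P]
    return sum(w * h for w, h in zip(pi, row_H))

# A sticky two-state chain: mostly stays in state 0, so it is far
# more predictable than an i.i.d. fair coin (1 bit per symbol).
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(entropy_rate(P))  # about 0.5575 bits per symbol
```

Shannon's first theorem (section 15) then says this rate is the best achievable average code length per symbol for such a source.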



Rating distribution: 5 stars: 1 · 4 stars: 3 · 3 stars: 4 (average 3.63)
