Mathematical Foundations of Information Theory (1953)

by A. Ya. Khinchin

The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.
Contains:
- The Entropy Concept in Probability Theory
  - 1. Entropy of Finite Schemes
  - 2. The Uniqueness Theorem
  - 3. Entropy of Markov chains
  - 4. Fundamental Theorems
  - 5. Application to Coding Theory
- On the Fundamental Theorems of Information Theory
  - Introduction
  - Chapter I. Elementary Inequalities
    - 1. Two generalizations of Shannon's inequality
    - 2. Three inequalities of Feinstein
  - Chapter II. Ergodic Sources
    - 3. Concept of a source. Stationarity. Entropy
    - 4. Ergodic Sources
    - 5. The E property. McMillan's theorem
    - 6. The martingale concept. Doob's theorem
    - 7. Auxiliary propositions
    - 8. Proof of McMillan's theorem
  - Chapter III. Channels and the sources driving them
    - 9. Concept of channel. Noise. Stationarity. Anticipation and memory
    - 10. Connection of the channel to the source
    - 11. The ergodic case
  - Chapter IV. Feinstein's Fundamental Lemma
    - 12. Formulation of the problem
    - 13. Proof of the lemma
  - Chapter V. Shannon's Theorems
    - 14. Coding
    - 15. The first Shannon theorem
    - 16. The second Shannon theorem
- Conclusion
- References
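The book's opening topic, the entropy of a finite scheme (a finite probability distribution), is the quantity H = -Σ pᵢ log pᵢ. As a quick illustration only (this sketch is not from the book), it can be computed like so:

```python
import math

def entropy(probs, base=2.0):
    """Entropy H = -sum(p * log(p)) of a finite scheme, i.e. a finite
    probability distribution. Zero-probability outcomes contribute
    nothing, by the convention 0 * log(0) = 0."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty; a uniform choice
# among four outcomes carries two bits.
print(entropy([0.5, 0.5]))   # → 1.0
print(entropy([0.25] * 4))   # → 2.0
```

With base 2 the result is in bits; a degenerate scheme such as `[1.0]` has entropy 0, reflecting no uncertainty at all.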


Information theory as a mathematical discipline. Doob, Feinstein and Shannon.
  bnielsen | Jan 9, 2017