


Mathematical Foundations of Information Theory



Table of Contents

The Entropy Concept in Probability Theory
1. Entropy of Finite Schemes
2. The Uniqueness Theorem
3. Entropy of Markov Chains
4. Fundamental Theorems
5. Application to Coding Theory

On the Fundamental Theorems of Information Theory
Introduction
Chapter I. Elementary Inequalities
1. Two Generalizations of Shannon's Inequality
2. Three Inequalities of Feinstein
Chapter II. Ergodic Sources
3. Concept of a Source. Stationarity. Entropy
4. Ergodic Sources
5. The E Property. McMillan's Theorem
6. The Martingale Concept. Doob's Theorem
7. Auxiliary Propositions
8. Proof of McMillan's Theorem
Chapter III. Channels and the Sources Driving Them
9. Concept of Channel. Noise. Stationarity. Anticipation and Memory
10. Connection of the Channel to the Source
11. The Ergodic Case
Chapter IV. Feinstein's Fundamental Lemma
12. Formulation of the Problem
13. Proof of the Lemma
Chapter V. Shannon's Theorems
14. Coding
15. The First Shannon Theorem
16. The Second Shannon Theorem
Conclusion
References
