Elements of Information Theory, 2/e (Hardcover)
Tentative Chinese title: 資訊理論要素 (第二版)
Thomas M. Cover, Joy A. Thomas
- Publisher: Wiley
- Publication Date: 2006-06-01
- List Price: $2,280
- Member Price: $2,234 (2% off)
- Language: English
- Pages: 776
- Binding: Hardcover
- ISBN: 0471241954
- ISBN-13: 9780471241959
Related Categories:
Physics, Probability and Statistics

Related Translation:
信息論基礎 (Elements of Information Theory, 2/e) (Simplified Chinese edition)

Availability: ships immediately (stock: 1)
Customers who bought this item also bought:
- $1,176 Computer Organization and Design: The Hardware/Software Interface, 3/e (IE) (US edition ISBN: 1558606041)
- $2,831 Thinking in Java, 4/e (Paperback)
- $1,080 CMMI: Guidelines for Process Integration and Product Improvement, 2/e
- $1,550 Optimal Control, 3/e (Hardcover)
Description
The latest edition of this classic is updated with new problem sets and material.
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
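For readers who want a feel for two of these central quantities before diving in, here is a minimal Python sketch. It is our illustration of the standard definitions, not code from the book; the helper names entropy and bsc_capacity are ours. It computes the Shannon entropy H(X) = -Σ p(x) log₂ p(x) and the binary symmetric channel capacity C = 1 - H(p):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.47
# A noiseless channel (p = 0) has capacity 1 bit per use; at p = 0.5
# the output is independent of the input and the capacity is 0.
print(bsc_capacity(0.0))     # 1.0
print(bsc_capacity(0.5))     # 0.0
```

Entropy here is measured in bits because the logarithm is base 2, matching the book's convention.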
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Table of Contents
Preface to the Second Edition.
Preface to the First Edition.
Acknowledgments for the Second Edition.
Acknowledgments for the First Edition.
1. Introduction and Preview.
1.1 Preview of the Book.
2. Entropy, Relative Entropy, and Mutual Information.
2.1 Entropy.
2.2 Joint Entropy and Conditional Entropy.
2.3 Relative Entropy and Mutual Information.
2.4 Relationship Between Entropy and Mutual Information.
2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information.
2.6 Jensen’s Inequality and Its Consequences.
2.7 Log Sum Inequality and Its Applications.
2.8 Data-Processing Inequality.
2.9 Sufficient Statistics.
2.10 Fano’s Inequality.
Summary.
Problems.
Historical Notes.
3. Asymptotic Equipartition Property.
3.1 Asymptotic Equipartition Property Theorem.
3.2 Consequences of the AEP: Data Compression.
3.3 High-Probability Sets and the Typical Set.
Summary.
Problems.
Historical Notes.
4. Entropy Rates of a Stochastic Process.
4.1 Markov Chains.
4.2 Entropy Rate.
4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph.
4.4 Second Law of Thermodynamics.
4.5 Functions of Markov Chains.
Summary.
Problems.
Historical Notes.
5. Data Compression.
5.1 Examples of Codes.
5.2 Kraft Inequality.
5.3 Optimal Codes.
5.4 Bounds on the Optimal Code Length.
5.5 Kraft Inequality for Uniquely Decodable Codes.
5.6 Huffman Codes.
5.7 Some Comments on Huffman Codes.
5.8 Optimality of Huffman Codes.
5.9 Shannon–Fano–Elias Coding.
5.10 Competitive Optimality of the Shannon Code.
5.11 Generation of Discrete Distributions from Fair Coins.
Summary.
Problems.
Historical Notes.
6. Gambling and Data Compression.
6.1 The Horse Race.
6.2 Gambling and Side Information.
6.3 Dependent Horse Races and Entropy Rate.
6.4 The Entropy of English.
6.5 Data Compression and Gambling.
6.6 Gambling Estimate of the Entropy of English.
Summary.
Problems.
Historical Notes.
7. Channel Capacity.
7.1 Examples of Channel Capacity.
7.2 Symmetric Channels.
7.3 Properties of Channel Capacity.
7.4 Preview of the Channel Coding Theorem.
7.5 Definitions.
7.6 Jointly Typical Sequences.
7.7 Channel Coding Theorem.
7.8 Zero-Error Codes.
7.9 Fano’s Inequality and the Converse to the Coding Theorem.
7.10 Equality in the Converse to the Channel Coding Theorem.
7.11 Hamming Codes.
7.12 Feedback Capacity.
7.13 Source–Channel Separation Theorem.
Summary.
Problems.
Historical Notes.
8. Differential Entropy.
8.1 Definitions.
8.2 AEP for Continuous Random Variables.
8.3 Relation of Differential Entropy to Discrete Entropy.
8.4 Joint and Conditional Differential Entropy.
8.5 Relative Entropy and Mutual Information.
8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information.
Summary.
Problems.
Historical Notes.
9. Gaussian Channel.
9.1 Gaussian Channel: Definitions.
9.2 Converse to the Coding Theorem for Gaussian Channels.
9.3 Bandlimited Channels.
9.4 Parallel Gaussian Channels.
9.5 Channels with Colored Gaussian Noise.
9.6 Gaussian Channels with Feedback.
Summary.
Problems.
Historical Notes.
10. Rate Distortion Theory.
10.1 Quantization.
10.2 Definitions.
10.3 Calculation of the Rate Distortion Function.
10.4 Converse to the Rate Distortion Theorem.
10.5 Achievability of the Rate Distortion Function.
10.6 Strongly Typical Sequences and Rate Distortion.
10.7 Characterization of the Rate Distortion Function.
10.8 Computation of Channel Capacity and the Rate Distortion Function.
Summary.
Problems.
Historical Notes.
11. Information Theory and Statistics.
11.1 Method of Types.
11.2 Law of Large Numbers.
11.3 Universal Source Coding.
11.4 Large Deviation Theory.
11.5 Examples of Sanov’s Theorem.
11.6 Conditional Limit Theorem.
11.7 Hypothesis Testing.
11.8 Chernoff–Stein Lemma.
11.9 Chernoff Information.
11.10 Fisher Information and the Cramér–Rao Inequality.
Summary.
Problems.
Historical Notes.
12. Maximum Entropy.
12.1 Maximum Entropy Distributions.
12.2 Examples.
12.3 Anomalous Maximum Entropy Problem.
12.4 Spectrum Estimation.
12.5 Entropy Rates of a Gaussian Process.
12.6 Burg’s Maximum Entropy Theorem.
Summary.
Problems.
Historical Notes.
13. Universal Source Coding.
13.1 Universal Codes and Channel Capacity.
13.2 Universal Coding for Binary Sequences.
13.3 Arithmetic Coding.
13.4 Lempel–Ziv Coding.
13.5 Optimality of Lempel–Ziv Algorithms.
Summary.
Problems.
Historical Notes.
14. Kolmogorov Complexity.
14.1 Models of Computation.
14.2 Kolmogorov Complexity: Definitions and Examples.
14.3 Kolmogorov Complexity and Entropy.
14.4 Kolmogorov Complexity of Integers.
14.5 Algorithmically Random and Incompressible Sequences.
14.6 Universal Probability.
14.7 The Halting Problem and the Noncomputability of Kolmogorov Complexity.
14.8 Ω.
14.9 Universal Gambling.
14.10 Occam’s Razor.
14.11 Kolmogorov Complexity and Universal Probability.
14.12 Kolmogorov Sufficient Statistic.
14.13 Minimum Description Length Principle.
Summary.
Problems.
Historical Notes.
15. Network Information Theory.
15.1 Gaussian Multiple-User Channels.
15.2 Jointly Typical Sequences.
15.3 Multiple-Access Channel.
15.4 Encoding of Correlated Sources.
15.5 Duality Between Slepian–Wolf Encoding and Multiple-Access Channels.
15.6 Broadcast Channel.
15.7 Relay Channel.
15.8 Source Coding with Side Information.
15.9 Rate Distortion with Side Information.
15.10 General Multiterminal Networks.
Summary.
Problems.
Historical Notes.
16. Information Theory and Portfolio Theory.
16.1 The Stock Market: Some Definitions.
16.2 Kuhn–Tucker Characterization of the Log-Optimal Portfolio.
16.3 Asymptotic Optimality of the Log-Optimal Portfolio.
16.4 Side Information and the Growth Rate.
16.5 Investment in Stationary Markets.
16.6 Competitive Optimality of the Log-Optimal Portfolio.
16.7 Universal Portfolios.
16.8 Shannon–McMillan–Breiman Theorem (General AEP).
Summary.
Problems.
Historical Notes.
17. Inequalities in Information Theory.
17.1 Basic Inequalities of Information Theory.
17.2 Differential Entropy.
17.3 Bounds on Entropy and Relative Entropy.
17.4 Inequalities for Types.
17.5 Combinatorial Bounds on Entropy.
17.6 Entropy Rates of Subsets.
17.7 Entropy and Fisher Information.
17.8 Entropy Power Inequality and Brunn–Minkowski Inequality.
17.9 Inequalities for Determinants.
17.10 Inequalities for Ratios of Determinants.
Summary.
Problems.
Historical Notes.
Bibliography.
List of Symbols.
Index.