User:PartlePartle/Books/Introduction to Information Theory
Introduction to Information Theory
Compiled by Pavle Jeremic
- Initial Introductory Information and Reference
  - Stochastic process
- General Information Theory
  - History of information theory
  - Information theory
  - Bit
  - Random variable
  - Probability
  - Probability theory
  - Independence (probability theory)
  - Shannon's source coding theorem
  - Self-information
  - Quantities of information
  - Shannon–Hartley theorem
  - Mutual information
- Entropy
  - Entropy (information theory)
  - Entropy (statistical thermodynamics)
  - Joint entropy
  - Conditional entropy
  - Entropy rate
  - Boltzmann constant
- Coding Theory
  - Coding theory
  - Channel capacity
  - Channel code
  - Binary symmetric channel
  - Binary erasure channel
- Algorithmic Information Theory
  - Algorithmic information theory
  - Algorithmically random sequence
  - Algorithmic probability
  - Bayes' rule
  - Lebesgue measure
  - Kullback–Leibler divergence
  - Chaitin's constant
  - Kolmogorov complexity
- Markov Models
  - Markov property
  - Markov chain
  - Serial dependence
  - Markov process
  - Iverson bracket
  - Connected component (graph theory)
  - State diagram
  - Examples of Markov chains
  - Probability vector
  - Chapman–Kolmogorov equation
  - Marginal distribution
  - Ergodic theory
  - Invariant measure
  - Markov chain Monte Carlo
  - Eigenvalues and eigenvectors
  - Stochastic matrix
  - Detailed balance
  - Kolmogorov's criterion
  - Harris chain
  - Leslie matrix
  - Diffusion equation