Search results

Results 1 – 20 of 214
There is a page named "Talk:Joint entropy" on Wikipedia

  • This article was automatically assessed because at least one WikiProject had rated the article as start, and the rating on other projects was brought up...
    1 KB (146 words) - 21:54, 15 February 2024
  • would like to remove the statement in the introduction that states that entropy is only defined for messages with an independent and identically distributed...
    143 KB (23,679 words) - 12:34, 6 August 2024
  • understood as follows: In quantum mechanics, the entropy of the joint system can be less than the sum of the entropy of its components because the components...
    70 KB (10,819 words) - 22:20, 25 November 2015
  • than d Q {\displaystyle dQ} , because Q is not a state function while the entropy is." That seems really unclear to me. What is the significance of Q not...
    166 KB (27,271 words) - 00:03, 7 July 2017
  • Loop entropy, Free entropy, Residual entropy, Entropy and life, Entropy of fusion, Entropy of mixing, Entropy of vaporization, Tsallis entropy, Joint entropy...
    115 KB (17,528 words) - 09:47, 1 February 2024
  • clarity in our joint work here, may I suggest the following terms? The article has components title - Introduction to thermodynamic entropy lead - of about...
    260 KB (41,043 words) - 13:16, 28 November 2023
  • correction. We cannot speak of a "joint entropy" of two distributions that are not jointly observable. The "joint distribution" formed by considering...
    25 KB (3,785 words) - 00:29, 9 March 2024
  • How these relate could illustrate self-information, entropy, joint entropy, conditional entropy, and mutual information without resorting to non-integer-result...
    3 KB (492 words) - 10:48, 13 August 2023
  • quantum entropy is defined in terms of a single state (rho) with two subsystems (A,B), whereas the old notation was ambiguous, by not specifying the joint density...
    620 bytes (82 words) - 22:37, 30 January 2024
  • the abstract measure over sets which forms the analogy with joint entropy, conditional entropy, and mutual information. The other is the measures over which...
    14 KB (2,157 words) - 14:34, 15 February 2024
  • and joint entropy' the description of the terms describes H(X) and H(Y) as 'marginal entropies' ("where H(X) and H(Y) are the marginal entropies"). But...
    27 KB (4,267 words) - 14:26, 6 February 2024
  • surprise", since "expected surprise" is a familiar phrase, being equal to entropy. — Preceding unsigned comment added by 38.105.200.57 (talk) 21:23, 25 April...
    72 KB (10,715 words) - 21:24, 25 April 2024
  • paragraph. If this article is to be about entropy in general — including the popular concept and information entropy — then it's inappropriate to lead off...
    93 KB (11,130 words) - 16:57, 8 June 2024
    81 bytes (0 words) - 20:32, 30 January 2024
  • the calculation for bookkeeping To calculate the conditional entropy term we need the joint distribution: p(y|x) = kronecker_delta_{y,1-x} p(x,y) = p(y|x)...
    7 KB (1,083 words) - 03:55, 14 June 2024
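The snippet above hints at a conditional-entropy calculation via the joint distribution. A minimal sketch of that setup, assuming (as the snippet's Kronecker delta suggests) that X is a fair bit and Y is deterministically 1 - X, so p(x, y) = p(y|x) p(x):

```python
import math

# Assumed setup from the snippet: p(y|x) = kronecker_delta(y, 1 - x),
# with X uniform on {0, 1}; the joint is p(x, y) = p(y|x) * p(x).
p_x = {0: 0.5, 1: 0.5}
p_xy = {(x, y): (1.0 if y == 1 - x else 0.0) * p_x[x]
        for x in (0, 1) for y in (0, 1)}

def H(dist):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

H_joint = H(p_xy)            # joint entropy H(X, Y)
H_x = H(p_x)                 # marginal entropy H(X)
H_y_given_x = H_joint - H_x  # chain rule: H(Y|X) = H(X, Y) - H(X)

print(H_joint, H_x, H_y_given_x)  # 1.0 1.0 0.0
```

Because Y is a deterministic function of X, the conditional entropy H(Y|X) comes out to zero and the joint entropy equals H(X).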
  • relation to entropy. He does not define it as a unit of entropy, as far as I can tell, and he makes absolutely no case for it as a unit of entropy. He is quite...
    15 KB (2,367 words) - 10:46, 12 January 2024
  • specific quantities used in quantum information theory (say, the joint quantum entropy). I don't see that this is any different. If we have an infinite...
    4 KB (604 words) - 16:37, 30 January 2024
  • to edit or move information to information entropy, joint entropy, mutual information, conditional entropy, etc., or create new articles, even one on...
    103 KB (16,726 words) - 21:35, 12 May 2007
  • 18:33, 18 August 2010 (UTC) To be clear, are we talking about the entropy of the joint probability distribution of the probability vector at time zero with...
    29 KB (4,114 words) - 16:21, 11 January 2024
  • divergence, differential entropy. We shouldn't be scared to have a little math in the article, but regular, joint, and conditional entropy can be defined together...
    103 KB (17,148 words) - 05:35, 9 December 2023