Uncorrelatedness (probability theory)
From Wikipedia, the free encyclopedia
In probability theory and statistics, two real-valued random variables, $X$, $Y$, are said to be uncorrelated if their covariance, $\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$, is zero. If two variables are uncorrelated, there is no linear relationship between them.
Uncorrelated random variables have a Pearson correlation coefficient, when it exists, of zero, except in the trivial case when either variable has zero variance (is a constant); in that case the correlation is undefined.
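This can be checked numerically. The following sketch (using NumPy; the variables and sample size are illustrative, not from the article) estimates the covariance and Pearson correlation of two independently generated samples, both of which come out near zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent (hence uncorrelated) samples.
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)

# Sample covariance: E[XY] - E[X]E[Y], estimated from the data.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)

# Pearson correlation: covariance normalized by the standard deviations.
corr_xy = cov_xy / (np.std(x) * np.std(y))

print(f"sample covariance  = {cov_xy:.4f}")   # near 0
print(f"sample correlation = {corr_xy:.4f}")  # near 0
```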
In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and $X$ and $Y$ are uncorrelated if and only if $\operatorname{E}[XY] = 0$.
If $X$ and $Y$ are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.[1]: p. 155
Definition for two real random variables

Two random variables $X, Y$ are called uncorrelated if their covariance $\operatorname{cov}[X,Y] = \operatorname{E}[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])]$ is zero.[1]: p. 153 [2]: p. 121 Formally:

$$X, Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X]\operatorname{E}[Y]$$
Definition for two complex random variables

Two complex random variables $Z_1, Z_2$ are called uncorrelated if both their covariance $\operatorname{K}_{Z_1 Z_2} = \operatorname{E}\left[(Z_1 - \operatorname{E}[Z_1])\overline{(Z_2 - \operatorname{E}[Z_2])}\right]$ and their pseudo-covariance $\operatorname{J}_{Z_1 Z_2} = \operatorname{E}\left[(Z_1 - \operatorname{E}[Z_1])(Z_2 - \operatorname{E}[Z_2])\right]$ are zero, i.e.

$$Z_1, Z_2 \text{ uncorrelated} \quad \iff \quad \operatorname{E}\left[Z_1 \overline{Z_2}\right] = \operatorname{E}[Z_1]\operatorname{E}\left[\overline{Z_2}\right] \text{ and } \operatorname{E}[Z_1 Z_2] = \operatorname{E}[Z_1]\operatorname{E}[Z_2]$$
Definition for more than two random variables

A set of two or more random variables $X_1, \ldots, X_n$ is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the non-diagonal elements of the autocovariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{X}}$ of the random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\mathrm{T}}$ are all zero. The autocovariance matrix is defined as:

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X}, \mathbf{X}] = \operatorname{E}\left[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\mathrm{T}}\right] = \operatorname{E}[\mathbf{X}\mathbf{X}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm{T}}$$
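As a numerical sketch (NumPy; dimensions and sample size are illustrative), the sample autocovariance matrix of mutually independent components has near-zero off-diagonal entries:

```python
import numpy as np

rng = np.random.default_rng(1)

# Three mutually independent (hence pairwise uncorrelated) components,
# one variable per row: shape (n_vars, n_samples).
X = rng.normal(size=(3, 200_000))

# np.cov estimates the autocovariance matrix K_XX from the samples.
K = np.cov(X)

print(np.round(K, 3))
# Off-diagonal entries are near 0; the diagonal holds the variances (near 1).
```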
Examples of dependence without correlation

Example 1

Let $X$ be a random variable that takes the value 0 with probability 1/2 and the value 1 with probability 1/2, and let $Z$ be a random variable, independent of $X$, that takes the value $-1$ with probability 1/2 and the value 1 with probability 1/2. Set $U = XZ$. The claim is that $U$ and $X$ have zero covariance (and thus are uncorrelated), but are not independent.
Proof:
Taking into account that

$$\operatorname{E}[U] = \operatorname{E}[XZ] = \operatorname{E}[X]\operatorname{E}[Z] = \operatorname{E}[X] \cdot 0 = 0,$$

where the second equality holds because $X$ and $Z$ are independent, one gets

$$\operatorname{cov}[U,X] = \operatorname{E}\left[(U - \operatorname{E}[U])(X - \operatorname{E}[X])\right] = \operatorname{E}\left[U\left(X - \tfrac{1}{2}\right)\right] = \operatorname{E}\left[X^2 Z - \tfrac{1}{2}XZ\right] = \operatorname{E}\left[\left(X^2 - \tfrac{1}{2}X\right)Z\right] = \operatorname{E}\left[X^2 - \tfrac{1}{2}X\right]\operatorname{E}[Z] = 0.$$

Therefore, $U$ and $X$ are uncorrelated.
Independence of $U$ and $X$ means that for all $a$ and $b$, $\Pr(U = a \mid X = b) = \Pr(U = a)$. This is not true, in particular, for $a = 1$ and $b = 0$:

$$\Pr(U = 1 \mid X = 0) = \Pr(XZ = 1 \mid X = 0) = 0,$$

$$\Pr(U = 1) = \Pr(XZ = 1) = \Pr(X = 1, Z = 1) = \tfrac{1}{4}.$$

Thus $\Pr(U = 1 \mid X = 0) \neq \Pr(U = 1)$, so $U$ and $X$ are not independent.
Q.E.D.
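A quick simulation confirms both halves of the claim (a sketch, using NumPy; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# X is 0 or 1 with equal probability; Z is -1 or +1, independent of X.
x = rng.integers(0, 2, size=n)
z = rng.choice([-1, 1], size=n)
u = x * z

# The sample covariance of U and X is near zero: uncorrelated.
print("cov[U,X] =", np.cov(u, x)[0, 1])

# But U and X are not independent:
# P(U = 1 | X = 0) = 0, while P(U = 1) = 1/4.
print("P(U=1 | X=0) =", np.mean(u[x == 0] == 1))
print("P(U=1)       =", np.mean(u == 1))
```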
Example 2

If $X$ is a continuous random variable uniformly distributed on $[-1, 1]$ and $Y = X^2$, then $X$ and $Y$ are uncorrelated even though $X$ determines $Y$ and a particular value of $Y$ can be produced by only one or two values of $X$. The two marginal densities are

$$f_X(t) = \tfrac{1}{2}\,\mathbf{1}_{|t| \le 1}, \qquad f_Y(t) = \tfrac{1}{2\sqrt{t}}\,\mathbf{1}_{0 < t \le 1};$$

on the other hand, the joint density $f_{X,Y}$ is 0 on the triangle defined by $0 < x < y < 1$ although $f_X(x)\,f_Y(y)$ is not null on this domain. Therefore $f_{X,Y}(x,y) \neq f_X(x)\,f_Y(y)$ and the variables are not independent.

The covariance, however, vanishes:

$$\operatorname{E}[X] = \int_{-1}^{1} \frac{x}{2}\,dx = 0, \qquad \operatorname{E}[Y] = \int_{-1}^{1} \frac{x^2}{2}\,dx = \tfrac{1}{3},$$

$$\operatorname{cov}[X,Y] = \operatorname{E}\left[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])\right] = \operatorname{E}\left[X\left(X^2 - \tfrac{1}{3}\right)\right] = \operatorname{E}[X^3] - \tfrac{1}{3}\operatorname{E}[X] = 0.$$

Therefore the variables are uncorrelated.
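The same example can be simulated (a sketch in NumPy; the conditioning threshold 0.9 is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# X uniform on [-1, 1]; Y = X**2 is completely determined by X.
x = rng.uniform(-1, 1, size=1_000_000)
y = x**2

# The sample covariance is near zero: X and Y are uncorrelated...
print("cov[X,Y] =", np.cov(x, y)[0, 1])

# ...yet clearly dependent: conditioning on X changes the distribution of Y.
# E[Y | |X| > 0.9] is near 0.90, well above the unconditional E[Y] = 1/3.
print("E[Y | |X|>0.9] =", np.mean(y[np.abs(x) > 0.9]))
print("E[Y]           =", np.mean(y))
```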
When uncorrelatedness implies independence

There are cases in which uncorrelatedness does imply independence. One of these cases is the one in which both random variables are two-valued (so each can be linearly transformed to have a Bernoulli distribution).[3] Further, two jointly normally distributed random variables are independent if they are uncorrelated,[4] although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
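One standard construction behind that caveat (sketched here in NumPy; the tail threshold 2 is an arbitrary illustration): if $X$ is standard normal and $S$ is an independent random sign, then $Y = SX$ is also standard normal and uncorrelated with $X$, yet the pair is not jointly normal and not independent:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# X standard normal; S an independent random sign; Y = S*X.
x = rng.normal(size=n)
s = rng.choice([-1, 1], size=n)
y = s * x

# cov[X,Y] = E[S]E[X^2] = 0, so X and Y are uncorrelated.
print("cov[X,Y] =", np.cov(x, y)[0, 1])

# But X and Y are not independent: |Y| always equals |X|, so joint
# tail events are far more likely than independence would allow.
p_joint = np.mean((np.abs(x) > 2) & (np.abs(y) > 2))
p_prod = np.mean(np.abs(x) > 2) * np.mean(np.abs(y) > 2)
print("P(|X|>2, |Y|>2) =", p_joint, "vs product =", p_prod)
```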
Uncorrelated random vectors

Two random vectors $\mathbf{X} = (X_1, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^{\mathrm{T}}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.$$

They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.[5]: p. 337
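As a sketch (NumPy; dimensions are illustrative), the sample cross-covariance matrix of two independent random vectors is near zero in every entry:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Independent random vectors X (3-dim) and Y (2-dim), one sample per column.
X = rng.normal(size=(3, n))
Y = rng.normal(size=(2, n))

# Sample cross-covariance matrix K_XY = E[(X - E[X])(Y - E[Y])^T].
Xc = X - X.mean(axis=1, keepdims=True)
Yc = Y - Y.mean(axis=1, keepdims=True)
K_XY = Xc @ Yc.T / n

print(np.round(K_XY, 3))  # all entries near 0: the vectors are uncorrelated
```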
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if both their cross-covariance matrix $\operatorname{K}_{\mathbf{Z}\mathbf{W}}$ and their pseudo-cross-covariance matrix $\operatorname{J}_{\mathbf{Z}\mathbf{W}}$ are zero, i.e. if

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0,$$

where

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}\left[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{H}}\right]$$

and

$$\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}\left[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{T}}\right].$$
Uncorrelated stochastic processes

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called uncorrelated if their cross-covariance $\operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1, t_2) = \operatorname{E}\left[(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))\right]$ is zero for all times $t_1, t_2$.[2]: p. 142 Formally:

$$\{X_t\}, \{Y_t\} \text{ uncorrelated} \quad \iff \quad \forall t_1, t_2 \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1, t_2) = 0.$$
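A sketch of checking this empirically (NumPy; the number of realizations and time steps are illustrative): averaging over many independent realizations of two independent white-noise processes gives a cross-covariance near zero at every pair of times:

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, n_times = 100_000, 50

# Two independent white-noise processes, one realization per row.
X = rng.normal(size=(n_paths, n_times))
Y = rng.normal(size=(n_paths, n_times))

# Empirical cross-covariance K_XY(t1, t2), averaged over realizations.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / n_paths  # shape (n_times, n_times)

print("max |K_XY(t1,t2)| =", np.abs(K_XY).max())  # near 0 for all t1, t2
```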