Wikipedia Article of the Day
Randomly selected articles from my personal browsing history
In probability and statistics, given two stochastic processes $\{X_t\}$ and $\{Y_t\}$, the cross-covariance is a function that gives the covariance of one process with the other at pairs of time points. With the usual notation $\operatorname{E}$ for the expectation operator, if the processes have the mean functions $\mu_X(t) = \operatorname{E}[X_t]$ and $\mu_Y(t) = \operatorname{E}[Y_t]$, then the cross-covariance is given by

$$\operatorname{K}_{XY}(t_1, t_2) = \operatorname{cov}(X_{t_1}, Y_{t_2}) = \operatorname{E}\!\left[(X_{t_1} - \mu_X(t_1))(Y_{t_2} - \mu_Y(t_2))\right] = \operatorname{E}[X_{t_1} Y_{t_2}] - \mu_X(t_1)\,\mu_Y(t_2).$$

Cross-covariance is related to the more commonly used cross-correlation of the processes in question.

In the case of two random vectors $\mathbf{X} = (X_1, X_2, \ldots, X_p)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, Y_2, \ldots, Y_q)^{\mathrm{T}}$, the cross-covariance would be a $p \times q$ matrix $\operatorname{K}_{XY}$ (often denoted $\operatorname{cov}(\mathbf{X}, \mathbf{Y})$) with entries

$$\operatorname{K}_{XY}(j, k) = \operatorname{cov}(X_j, Y_k).$$

Thus the term cross-covariance is used in order to distinguish this concept from the covariance of a random vector $\mathbf{X}$, which is understood to be the matrix of covariances between the scalar components of $\mathbf{X}$ itself.

In signal processing, the cross-covariance is often called cross-correlation and is a measure of similarity of two signals, commonly used to find features in an unknown signal by comparing it to a known one. It is a function of the relative time between the signals, is sometimes called the sliding dot product, and has applications in pattern recognition and cryptanalysis.
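To make the random-vector definition concrete, here is a minimal NumPy sketch (the data, dimensions, and variable names are illustrative assumptions, not from the article) that estimates the $p \times q$ cross-covariance matrix $\operatorname{K}_{XY}$ from samples and cross-checks it against the corresponding off-diagonal block of the full covariance matrix returned by `np.cov`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n samples of a 3-dimensional X and a 2-dimensional Y,
# with Y partly driven by X so the cross-covariance is non-trivial.
n, p, q = 10_000, 3, 2
X = rng.normal(size=(n, p))
Y = 0.5 * X[:, :q] + rng.normal(size=(n, q))

# Sample cross-covariance matrix K_XY(j, k) = cov(X_j, Y_k), shape (p, q):
# subtract the sample means, then average the products over samples.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / (n - 1)

print(K_XY)  # roughly [[0.5, 0], [0, 0.5], [0, 0]] for this construction

# Cross-check: np.cov on the stacked variables gives the (p+q) x (p+q)
# covariance matrix; its upper-right p x q block is exactly K_XY.
full = np.cov(np.hstack([X, Y]).T)
assert np.allclose(K_XY, full[:p, p:])
```

The manual computation and the `np.cov` block agree because both use the same mean-centering and the same $n-1$ denominator; only the cross terms between components of $\mathbf{X}$ and $\mathbf{Y}$ land in the upper-right block.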
History
Dec 26
Undertow (water waves)
Dec 25
F-distribution
Dec 24
Cumulative distribution function
Dec 23
Probability mass function
Dec 22
Book cipher
Dec 21
Poisson point process
Dec 20
Generic top-level domain
Dec 19
Beale ciphers
Dec 18
Heavyweight (podcast)
Dec 17
MurmurHash
Dec 16
Attempted assassination of Ronald Reagan
Dec 15
Mnemonic major system
Dec 14
Peter M. Lenkov
Dec 13
Lagrange polynomial
Dec 12
Polynomial interpolation
Dec 11
Newton polynomial
Dec 10
Quantile function
Dec 9
Static site generator
Dec 8
Flag Day (United States)
Dec 7
Seven-segment display character representations
Dec 6
Tori Kelly
Dec 5
Lynn Conway
Dec 4
G7
Dec 3
Nostr
Dec 2
Negative binomial distribution
Dec 1
Toledo War
Nov 30
Laurent series
Nov 29
Interface control document
Nov 28
ANT (network)
Nov 27
Functional analysis