How does the normalisation in corrcoef(x) influence the actual correlation?
Hi all,
I'm trying to find the degree of correlation between 4 vectors (namely MIMO channels). Using the cov(x) function seems to match the textbook definition, but it doesn't return a matrix with 1s on its diagonal. So my first question is: why not? (I assume it has to do with the level of fading, i.e. the variance of each specific channel.) The function corrcoef(x) is cov(x) normalised in the following way:
"if C = COV(X), then CORRCOEF(X) is normalised as C(i,j)/SQRT(C(i,i)*C(j,j))".
Can someone help me understand how the normalisation influences the result (apart from the obvious division factor) and why it is normalised in that particular way?
It seems intuitively better, since it returns coefficients equal to 1 for fully correlated variables (and along the diagonal).
Thanks!
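Edit: as a sanity check, the quoted relation can be verified numerically. This is only a sketch: X here is assumed to be an N-by-4 matrix with one MIMO channel per column, and the random data is a placeholder for actual channel samples.

```matlab
% Sketch: verify that corrcoef(X) equals cov(X) normalised elementwise
% by sqrt(C(i,i)*C(j,j)). X is assumed N-by-4, one channel per column.
X = randn(1000, 4);            % placeholder data; substitute real channel samples

C = cov(X);                    % diagonal C(i,i) holds each channel's variance
d = sqrt(diag(C));             % per-channel standard deviations
R_manual = C ./ (d * d');      % C(i,j) / sqrt(C(i,i)*C(j,j))

R = corrcoef(X);
max(abs(R(:) - R_manual(:)))   % difference should be at floating-point level
```

Dividing each entry by the two channels' standard deviations is what forces the diagonal to 1, since C(i,i)/sqrt(C(i,i)*C(i,i)) = 1 regardless of the variance (fading level) of that channel.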
Answers (0)