Probability or Statistics – TimeSeries covariance in version 10.0-10.3

The following works in v11.2-12.0:

ts1 = TimeSeries[{{0, 0}, {1, 1}, {2, 0}, {3, 1}}];
ts2 = TimeSeries[{{0, 1}, {1, 0}, {2, 1}, {3, 0}}];
Covariance[ts1, ts2]
(* -1/3 *)

In versions 10.0-10.3 it fails with the message:

Covariance::vctmat: The arguments for covariance are not a pair of vectors or a pair of matrices of equal length.

The same problem occurs for TemporalData; however, Variance and Mean both work in all these versions:

Variance[ts1]
(* 1/3 *)

Mean[ts1]
(* 1/2 *)

Is this a bug? Is there a workaround?
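A possible workaround sketch for the older versions (assuming the `"Values"` property of TimeSeries behaves the same in v10 as in current versions, which I have not verified) is to extract the plain value lists and apply Covariance to those directly:

```mathematica
(* Workaround sketch: bypass the TimeSeries dispatch by passing
   plain value lists to Covariance *)
Covariance[ts1["Values"], ts2["Values"]]
(* -1/3 *)
```

This should agree with the v11.2+ result as long as the two series share the same time stamps in the same order.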

Probability Distributions – Arguing that two random variables are not independent, yet their covariance is zero

Consider two independent random variables $X \sim \mathrm{Unif}(-1, 1)$ and $Z \sim \mathrm{Unif}(0, \frac{1}{10})$, and let $Y = X^2 + Z$.

The first part of the question was to show that the conditional density of $Y$ given $X$ is $\mathrm{Uniform}(x^2, x^2 + \frac{1}{10})$, which I did. So
$$ f_{Y|X}(y \mid x) = 10\,I\!\left(x^2 \leq y \leq x^2 + \tfrac{1}{10}\right). $$

The next part asked for the joint density of $X$ and $Y$, which I calculated as
$$ f_{X,Y}(x, y) = 5\,I(-1 < x < 1)\,I\!\left(x^2 \leq y \leq x^2 + \tfrac{1}{10}\right). $$

Now I have to argue that $X$ and $Y$ are not independent, but I cannot see how the joint density helps to prove this. On top of that, I have to show that the covariance of $X$ and $Y$ is zero. I have found a way to do it with iterated expectation, but we did not learn that in class, so I think there must be another way.
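For what it's worth, here is a sketch of one direct argument that avoids iterated expectation (whether this is the intended classroom method is an assumption on my part), using bilinearity of covariance:

$$ \mathrm{Cov}(X, Y) = \mathrm{Cov}(X, X^2 + Z) = \mathrm{Cov}(X, X^2) + \mathrm{Cov}(X, Z) = \left(E[X^3] - E[X]\,E[X^2]\right) + 0 = 0, $$

since $X$ and $Z$ are independent and $X$ is symmetric about $0$, so $E[X] = E[X^3] = 0$. For non-independence, note that the support of $Y$ given $X = x$ depends on $x$, so the joint density cannot factor as $f_X(x)\,f_Y(y)$.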

Functions – MultinormalDistribution error with a covariance matrix parameter only

According to the documentation, it should be possible to call MultinormalDistribution with only a covariance matrix argument, implicitly centering the distribution at the origin. But trying

MultinormalDistribution[{{1/4, 0, -1/4}, {0, 0, 0}, {-1/4, 0, 1/4}}]

leads to the error

MultinormalDistribution called with 1 argument; 2 arguments are expected.

despite the fact that the covariance matrix is valid: symmetric and positive-semidefinite.
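One workaround sketch is to pass an explicit zero mean vector, which should be equivalent to the implicitly centered form. As an aside, note that this particular matrix is singular, so I am also assuming your version accepts a positive-semidefinite (rather than strictly positive-definite) covariance matrix; if it does not, the matrix itself may be the real obstacle:

```mathematica
(* Workaround sketch: supply the zero mean vector explicitly
   instead of relying on the single-argument form *)
cov = {{1/4, 0, -1/4}, {0, 0, 0}, {-1/4, 0, 1/4}};
MultinormalDistribution[ConstantArray[0, Length[cov]], cov]
```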

Matrices – Negative values for the OLS covariance

I am currently writing code that performs a regression, and I noticed that when calculating the covariance of $\hat{\beta}$ I sometimes get negative values for some data sets.

The covariance is given as $c'(X'X)^{-1}c \, \frac{e'e}{n-p}$, where $c$ is a vector representing the contrast to be tested, $X$ is the design matrix, $e$ is the vector of residuals, $n$ is the number of data points, and $p$ is the number of parameters.

I want to compute a t-statistic, which requires taking the square root of this covariance, and that fails in some cases when the covariance is negative. My question is: is there a mathematical/statistical reason why I have this problem? (Or should I ask on StackOverflow instead?)
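For what it's worth, in exact arithmetic this quantity cannot be negative. A sketch of the argument: writing $v = (X'X)^{-1}c$ and using the symmetry of $X'X$,

$$ c'(X'X)^{-1}c = v'(X'X)\,v = (Xv)'(Xv) = \lVert Xv \rVert^2 \geq 0, $$

and $e'e \geq 0$ with $n > p$, so the product is nonnegative. A negative computed value therefore suggests a numerical problem (e.g. an ill-conditioned $X'X$) or an implementation bug rather than a statistical one.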

Probability or Statistics – How to derive the conditional covariance matrix of random variables (X1, X2, …, Xn)

To my surprise, Mathematica has no built-in function to derive a conditional covariance matrix (CCM) for several random variables. Since such a matrix is very useful in statistics, I wonder whether anyone can develop code (a Mathematica function automating the derivation for given probability distributions) to derive a CCM for $n$ random variables following a joint Gaussian PDF.

In mathematical terms, I would like to compute the following equation:

$$ \mathrm{Cov}(X, Y \mid S) = E[XY \mid S] - E[X \mid S]\,E[Y \mid S], $$

where $X = (X_1, X_2, X_3)$, $Y = (Y_1, Y_2, Y_3)$, and $S = (S_1, S_2)$; the vectors $X$ and $Y$ each contain 3 RVs and $S$ is another vector with 2 RVs.

Any code I could write myself would be very primitive, so I did not even try to develop one, although I can derive the individual terms in the definition of the CCM above.
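Since the question restricts attention to a joint Gaussian PDF, one route that avoids symbolic expectation altogether is the standard Gaussian conditioning (Schur complement) formula, under which Cov(X, Y | S) does not depend on the observed value of S. A minimal sketch (function and variable names are my own choices, not built-ins; sigma is the full covariance matrix of the stacked vector (X, Y, S), and ix, iy, is are the index positions of the three blocks):

```mathematica
(* Sketch: conditional cross-covariance block for jointly Gaussian
   variables, via the Schur complement
   Cov[X, Y | S] = Sigma_XY - Sigma_XS . Inverse[Sigma_SS] . Sigma_SY *)
conditionalCov[sigma_, ix_List, iy_List, is_List] :=
  sigma[[ix, iy]] -
   sigma[[ix, is]] . Inverse[sigma[[is, is]]] . sigma[[is, iy]]
```

With the eight variables ordered as (X1, X2, X3, Y1, Y2, Y3, S1, S2), the CCM would be conditionalCov[sigma, Range[3], Range[4, 6], Range[7, 8]]. This assumes Sigma_SS is invertible; for non-Gaussian distributions a symbolic approach via Expectation would be needed instead.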