On the independence of multivariate Gaussians


Hello, I am somewhat confused about the independence of multivariate Gaussian distributions. Before getting there, I have studied the following results:

1) Let $X_1, \dots, X_m$ be $m$ random variables taking values in $\mathbb{R}^{d_1}, \dots, \mathbb{R}^{d_m}$ respectively. They are called independent if for every $A_1 \subset \mathbb{R}^{d_1}, \dots, A_m \subset \mathbb{R}^{d_m}$ we have

$$P(X_1 \in A_1, \dots, X_m \in A_m) = P(X_1 \in A_1) \cdots P(X_m \in A_m)$$

Also, if $\phi_1 : \mathbb{R}^{d_1} \rightarrow \mathbb{R}, \dots, \phi_m : \mathbb{R}^{d_m} \rightarrow \mathbb{R}$ satisfy some regularity conditions (measurability), then $\phi_1(X_1), \dots, \phi_m(X_m)$ are independent.
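To convince myself of this second statement I ran a quick simulation sketch (NumPy; the functions $\phi_1(x) = x^2$, $\phi_2(x) = \sin x$, the sets $A_1 = (1, \infty)$, $A_2 = (1/2, \infty)$, and the sample size are just my own illustrative choices). Note that each $\phi_i$ is applied to its *own* independent variable $X_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent standard normal samples
X1 = rng.standard_normal(n)
X2 = rng.standard_normal(n)

# Measurable functions applied to *distinct* independent inputs
phi1 = X1 ** 2
phi2 = np.sin(X2)

# Empirically compare P(phi1 in A1, phi2 in A2) with the product
# P(phi1 in A1) * P(phi2 in A2); they should agree up to Monte Carlo error
joint = np.mean((phi1 > 1.0) & (phi2 > 0.5))
prod = np.mean(phi1 > 1.0) * np.mean(phi2 > 0.5)
print(joint, prod)
```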

Now, let $X_1, X_2, X_3$ be independent $\mathcal{N}(0,1)$ random variables and consider

$$U = 2X_1 - X_2 - X_3, \qquad V = X_1 + X_2 + X_3, \qquad W = X_1 - 3X_2 + 2X_3$$

What can I say about the independence of the random variables $U, V, W$?

The result just stated would lead me to think that the random variables $U, V, W$ are all independent (since the $\phi_i$ here are simple linear transformations).

Of course $X = (X_1, X_2, X_3) \sim \mathcal{N}(0, I)$, where $I$ is the $3 \times 3$ identity matrix. Then $U, V, W$ are jointly Gaussian, being linear transformations of a jointly Gaussian vector, so to check their independence it suffices to show they are uncorrelated. But it turns out that $\mathrm{Cov}(U, W) = 2 \cdot 1 + (-1)(-3) + (-1) \cdot 2 = 3 \neq 0$, so $U$ and $W$ do not seem to be independent. Why does the initial statement fail here? Thanks
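As a sanity check on that covariance computation, here is a minimal simulation sketch (NumPy; the variable names and sample size are my own choices). It writes $(U, V, W) = A X$ for the coefficient matrix $A$ read off from the definitions above, and compares the empirical covariance matrix with the theoretical value $A A^T$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Rows of X are X1, X2, X3: independent standard normals
X = rng.standard_normal((3, n))

# Coefficient matrix mapping (X1, X2, X3) to (U, V, W)
A = np.array([[2, -1, -1],
              [1,  1,  1],
              [1, -3,  2]])
UVW = A @ X

print(np.cov(UVW))  # empirical covariance of (U, V, W)
print(A @ A.T)      # theoretical covariance: Cov(U,W) entry equals 3
```

The off-diagonal entries confirm the hand computation: $\mathrm{Cov}(U,V) = \mathrm{Cov}(V,W) = 0$ but $\mathrm{Cov}(U,W) = 3$.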