# linear algebra – Why is the product of a covariance matrix with the inverse of its sum with the identity matrix symmetric?

I have an empirical result (meaning it holds in every simple simulation, e.g. in R) which I cannot prove to myself:

Let $$A$$ be an $$n \times n$$ covariance matrix (i.e. it is symmetric PSD), let $$I_n$$ be the identity matrix, and let $$\theta_1$$ and $$\theta_2$$ be some scalars (in my case they are always positive, but it does not matter). Let:

$$V = (\theta_1 A + \theta_2 I_n)^{-1} A$$

It seems that $$V$$ is always symmetric! Can we prove it?
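For what it's worth, here is a sketch of why this should hold. $$\theta_1 A + \theta_2 I_n$$ is a polynomial in $$A$$, so it commutes with $$A$$, and hence so does its inverse:

$$(\theta_1 A + \theta_2 I_n)^{-1} A = A (\theta_1 A + \theta_2 I_n)^{-1}$$

Since $$A$$ is symmetric and the inverse of a symmetric matrix is symmetric,

$$V^\top = A^\top \left( (\theta_1 A + \theta_2 I_n)^{-1} \right)^\top = A (\theta_1 A + \theta_2 I_n)^{-1} = (\theta_1 A + \theta_2 I_n)^{-1} A = V$$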

E.g. in R:

```
A <- cov(rbind(c(1,2.1,3), c(3,4,5.3), c(3,4.2,0)))
isSymmetric(solve(2 * A + 3 * diag(3)) %*% A)
```
```
[1] TRUE
```
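The same check can be reproduced in Python with NumPy (a minimal sketch; the matrix mirrors the R data above, and `rowvar=False` makes `np.cov` treat rows as observations, like R's `cov`):

```python
import numpy as np

# Rows are observations, columns are variables (mirrors R's cov()).
X = np.array([[1.0, 2.1, 3.0],
              [3.0, 4.0, 5.3],
              [3.0, 4.2, 0.0]])
A = np.cov(X, rowvar=False)  # 3x3 covariance matrix (symmetric PSD)

# V = (theta_1 * A + theta_2 * I)^{-1} A with theta_1 = 2, theta_2 = 3
V = np.linalg.solve(2 * A + 3 * np.eye(3), A)

print(np.allclose(V, V.T))  # True up to floating-point tolerance
```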

To anyone interested: this is important to me mainly because it means I have two symmetric matrices $$A, B$$ whose product $$AB$$ is also symmetric, in which case the eigenvalues of $$AB$$ are products of the eigenvalues of $$A$$ and $$B$$ according to this, which also simplifies its trace.
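A sketch of the trace simplification this buys: writing $$A = Q \Lambda Q^\top$$, the eigenvalues of $$V$$ are $$\lambda_i / (\theta_1 \lambda_i + \theta_2)$$ for the eigenvalues $$\lambda_i$$ of $$A$$, so the trace depends only on $$A$$'s spectrum. A hypothetical random PSD matrix and $$\theta_1 = 2, \theta_2 = 3$$ are assumed for illustration:

```python
import numpy as np

# Hypothetical symmetric PSD matrix, for illustration only.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))
A = M.T @ M                      # 3x3 symmetric PSD
t1, t2 = 2.0, 3.0

V = np.linalg.solve(t1 * A + t2 * np.eye(3), A)

# Eigenvalues of V are lam / (t1*lam + t2) for each eigenvalue lam of A,
# so trace(V) collapses to a sum over A's spectrum alone.
lam = np.linalg.eigvalsh(A)
print(np.isclose(np.trace(V), np.sum(lam / (t1 * lam + t2))))  # True
```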