## matrices – Transformation that rotates a matrix

Is there a transformation (multiplication, or any other combination of operations) that can convert this matrix:
$$\begin{bmatrix}1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 9\end{bmatrix}$$
to this:
$$\begin{bmatrix}3 & 6 & 9\\ 2 & 5 & 8\\ 1 & 4 & 7\end{bmatrix}$$

I have simply ‘rotated’ the matrix 90 degrees counterclockwise.
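For a concrete check, the rotation can be written as a matrix product: transposing and then reversing the rows is the same as left-multiplying the transpose by the exchange matrix $$J$$ (the identity with its columns reversed). A minimal NumPy sketch:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Exchange matrix J (anti-diagonal identity): left-multiplying by J
# reverses the order of the rows.
J = np.fliplr(np.eye(3, dtype=int))

# A 90-degree counterclockwise rotation is "transpose, then reverse rows",
# which as a single matrix product is J @ A.T.
rotated = J @ A.T
print(rotated)

# NumPy's built-in rotation agrees.
assert np.array_equal(rotated, np.rot90(A))
```

So the answer is yes: the target matrix equals $$JA^T$$, a transpose followed by one multiplication.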

## algorithms – Given \$n\$ sets of matrices, find \$n\$ matrices that have the least number of LCDs among their entries

Let’s say I have $$n$$ sets of matrices

$$A = \left\{\begin{pmatrix} 2 & 4 & 17\\ 5 & 6 & 9 \end{pmatrix}, \begin{pmatrix} 2 & 4 & 18\\ 5 & 6 & 9 \end{pmatrix}\right\}$$

$$B = \left\{\begin{pmatrix} 13 & 20\\ 3 & 16 \end{pmatrix}, \begin{pmatrix} 14 & 20\\ 3 & 16 \end{pmatrix}, \begin{pmatrix} 13 & 21\\ 3 & 16 \end{pmatrix}, \begin{pmatrix} 14 & 21\\ 3 & 16 \end{pmatrix}, \begin{pmatrix} 13 & 20\\ 3 & 17 \end{pmatrix}, \begin{pmatrix} 14 & 20\\ 3 & 17 \end{pmatrix}, \dots\right\}$$

Let’s define $$T$$ as a vector that contains the lowest common denominators shared among the entries of all the $$n$$ matrices.

I need to find $$n$$ matrices, picking one from each set, that will minimize the length of $$T$$; in this case, for example,

$$A_{1}=\begin{pmatrix} 2 & 4 & 18\\ 5 & 6 & 9 \end{pmatrix}$$

$$B_{1}=\begin{pmatrix} 14 & 20\\ 3 & 16 \end{pmatrix}$$

The resulting $$T$$ would be

$$T=\begin{bmatrix} 2, 3, 4, 5, 9 \end{bmatrix}$$

I know I can brute-force every possible combination, but is there a more efficient way?
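For reference, the brute force itself is short. The sketch below uses a stand-in rule for computing $$T$$ (divisors greater than 1 shared by at least two entries), since the exact "lowest common denominator" rule is not pinned down above; `shared_divisors` is the piece to swap for the intended definition.

```python
from itertools import product

def shared_divisors(entries):
    # Stand-in rule (an assumption, not from the question): collect every
    # divisor > 1 that divides at least two of the pooled entries.
    out = set()
    for d in range(2, max(entries) + 1):
        if sum(1 for x in entries if x % d == 0) >= 2:
            out.add(d)
    return out

def T_of(choice):
    # Pool the entries of all chosen matrices (tuples of row-tuples).
    entries = [x for M in choice for row in M for x in row]
    return shared_divisors(entries)

def best_choice(sets):
    # Brute force: try one matrix from each set and keep the combination
    # with the shortest T.  Cost grows as the product of the set sizes.
    return min(product(*sets), key=lambda c: len(T_of(c)))
```

With the sets above this enumerates $$|A|\cdot|B|\cdots$$ combinations; the open question is whether the structure of the sets (the matrices in each set appear to differ in only a few entries) can be exploited to prune the search.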

## matrices – Reworking matrix to have last row as identity

Given a 3×3 matrix, what is the name, and the working, of the procedure that modifies the final row to match that of an identity matrix?

i.e.
with [ [ a, b, c ], [ d, e, f ], [ h, i, j ] ]

what transform can be performed to get [ [ k, l, m ], [ n, o, p ], [ 0, 0, 1 ] ]

such that, when it is applied to a point vector [ x, y, 1 ], I get the same output point.

For context, I have a matrix formed from various operations and wish to use it in an SVG matrix operation, which only takes the values of the first two rows, expecting the third to be [ 0, 0, 1 ].
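If the last row is $$[0, 0, j]$$ with $$j \neq 0$$, the procedure is just normalization: divide the whole matrix by $$j$$, which leaves the mapped point unchanged because homogeneous coordinates are only defined up to a scalar. If $$h$$ or $$i$$ is nonzero, the map is perspective and no affine (SVG-compatible) matrix reproduces it on all points. A NumPy sketch under those assumptions:

```python
import numpy as np

def to_affine(M, tol=1e-12):
    """Rescale a 3x3 homogeneous matrix so its last row is [0, 0, 1].

    Only valid when the last row is [0, 0, j] with j != 0; a nonzero
    h or i makes the map projective, with no affine equivalent.
    """
    h, i, j = M[2]
    if abs(h) > tol or abs(i) > tol:
        raise ValueError("perspective component: no affine equivalent")
    if abs(j) < tol:
        raise ValueError("degenerate last row")
    # Dividing the whole matrix by j maps [x, y, 1] to the same point,
    # since homogeneous outputs [X, Y, W] and [X/j, Y/j, W/j] coincide.
    return M / j
```

For example, `to_affine` turns `[[2, 0, 4], [0, 2, 6], [0, 0, 2]]` into `[[1, 0, 2], [0, 1, 3], [0, 0, 1]]`, whose first two rows can go straight into the SVG `matrix(…)` operation.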

## linear algebra – set of invertible diagonal matrices

Let $$\mathcal{T}$$ be the set of invertible diagonal matrices. Show that for any invertible matrix $$B$$ such that $$B\mathcal{T}B^{-1} = \mathcal{T}$$, $$B=PT$$ for some permutation matrix $$P$$ and invertible diagonal matrix $$T$$. Here, $$AB := \{ab: a\in A, b\in B\}$$ when $$A$$ and $$B$$ are sets.

I’m not sure how to show this result, though I’m pretty sure I need to consider eigenvalues, diagonalizable matrices, and change-of-basis matrices. Using the definition alone, I get stuck quite easily; I only know that for any invertible diagonal matrix $$T$$, $$BTB^{-1}$$ is an invertible diagonal matrix, and that for any invertible diagonal matrix $$T'$$, $$T' = BT''B^{-1}$$ for some invertible diagonal matrix $$T''$$. I know how to show that if an $$n\times n$$ matrix has $$n$$ distinct eigenvalues, then it has $$n$$ distinct eigenvectors (an eigenvector can correspond to only one eigenvalue) and thus these $$n$$ eigenvectors are linearly independent, so the matrix is diagonalizable, but I’m not sure if this is useful.
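Not a proof, but the easy direction is cheap to check numerically: if $$B = PT$$, then $$BDB^{-1} = PTDT^{-1}P^{-1} = PDP^{-1}$$ for any diagonal $$D$$ (diagonal matrices commute), and conjugating by a permutation matrix just permutes the diagonal entries. A quick NumPy sanity check:

```python
import numpy as np

rng = np.random.default_rng(0)

# B = P T for a random permutation matrix P and invertible diagonal T.
P = np.eye(4)[rng.permutation(4)]          # permutation matrix
T = np.diag(rng.uniform(1, 2, size=4))     # invertible diagonal
B = P @ T

# Conjugating an arbitrary invertible diagonal D by B stays diagonal.
D = np.diag(rng.uniform(1, 2, size=4))
C = B @ D @ np.linalg.inv(B)

off_diagonal = C - np.diag(np.diag(C))
print(np.max(np.abs(off_diagonal)))        # ~ 0 up to rounding
```

The exercise is the converse: any $$B$$ with $$B\mathcal{T}B^{-1} = \mathcal{T}$$ must have this permutation-times-diagonal shape.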

## functions – Can we determine higher powers of a matrix in terms of lower powers?

Consider an $$n\times n$$ square matrix $$A$$. Using the Cayley-Hamilton theorem, I can represent the matrix $$A^n$$ as a matrix polynomial $$P(A)$$ of degree $$n-1$$.

Further, any power $$A^k$$ where $$k>n$$ can also be represented as follows:

$$A^k= a_{k,n-1} A^{n-1} + a_{k,n-2} A^{n-2} + a_{k,n-3} A^{n-3} + \dots + a_{k,2} A^{2} + a_{k,1} A + a_{k,0}I$$

What I want to know is: is there any way to determine the coefficients as functions of $$k$$?
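One route: the coefficients are exactly the coefficients of the remainder of $$x^k$$ modulo the characteristic polynomial, and when the eigenvalues $$\lambda_1,\dots,\lambda_n$$ are distinct they can be written as functions of $$k$$ by solving the Vandermonde system $$r(\lambda_i)=\lambda_i^k$$. A NumPy sketch of the polynomial-reduction computation (for a given numeric $$k$$):

```python
import numpy as np

def power_coeffs(A, k):
    """Coefficients a_{k,0}, ..., a_{k,n-1} with A^k = sum_j a_{k,j} A^j,
    found by reducing x^k modulo the characteristic polynomial
    (Cayley-Hamilton).  Returned lowest degree first."""
    n = A.shape[0]
    p = np.poly(A)                       # charpoly, highest degree first
    x_k = np.zeros(k + 1)
    x_k[0] = 1                           # represents x^k, highest first
    _, r = np.polydiv(x_k, p)            # remainder has degree < n
    r = np.concatenate([np.zeros(n - len(r)), r])  # pad to length n
    return r[::-1]                       # lowest degree first

A = np.array([[1.0, 1.0], [0.0, 2.0]])
c = power_coeffs(A, 5)
# Rebuild A^5 from lower powers and compare.
recon = sum(cj * np.linalg.matrix_power(A, j) for j, cj in enumerate(c))
print(np.allclose(recon, np.linalg.matrix_power(A, 5)))  # True
```

For the example above (eigenvalues 1 and 2), solving $$r(1)=1$$, $$r(2)=2^k$$ gives the closed forms $$a_{k,1} = 2^k - 1$$ and $$a_{k,0} = 2 - 2^k$$, illustrating how the coefficients become explicit functions of $$k$$ in the distinct-eigenvalue case.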

## matrices – Matrix derivative w.r.t. a general inverse form: \$(A^TA)^{-1/2}D(A^TA)^{-1/2}\$

I want to find the derivative of the matrix $$(A^TA)^{-1/2}D(A^TA)^{-1/2}$$ with respect to $$A_{ij}$$, where $$D$$ is a diagonal matrix. Alternatively, it would also be fine to have

$$\frac{\partial}{\partial A_{ij}} a^T(A^TA)^{-1/2}D(A^TA)^{-1/2}b$$

Is there any reference for such a problem? I have The Matrix Cookbook, which gives results when $$D=I$$. But what does this general form evaluate to?

To give more information, the empirical distribution of the diagonal entries of $$D$$ converges to some known distribution.
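No closed form to offer here, but whatever expression one derives by hand can be checked against finite differences. The sketch below (NumPy only; the inverse square root is taken via an eigendecomposition, assuming $$A^TA$$ is well conditioned) evaluates the scalar form's derivative numerically:

```python
import numpy as np

def inv_sqrt(M):
    # M = A^T A is symmetric positive definite, so eigh applies.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** -0.5) @ V.T

def f(A, D, a, b):
    # The scalar form a^T (A^T A)^{-1/2} D (A^T A)^{-1/2} b.
    S = inv_sqrt(A.T @ A)
    return a @ S @ D @ S @ b

def grad_fd(A, D, a, b, eps=1e-6):
    # Central finite differences w.r.t. each A_ij: a brute-force check
    # for any closed form derived by hand.
    G = np.zeros_like(A)
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            Ap, Am = A.copy(), A.copy()
            Ap[i, j] += eps
            Am[i, j] -= eps
            G[i, j] = (f(Ap, D, a, b) - f(Am, D, a, b)) / (2 * eps)
    return G
```

As a consistency check, for $$D=I$$ the scalar reduces to $$a^T(A^TA)^{-1}b$$, whose gradient $$-A(A^TA)^{-1}(ab^T + ba^T)(A^TA)^{-1}$$ matches the finite-difference output.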

## reference request – rank of a linear combination of matrices

Let $$A_1,\dots, A_s \in M_n(\mathbb{R})$$ be symmetric matrices and suppose they are linearly independent over $$\mathbb{R}$$. This means that
$$m = \min_{(c_1, \dots, c_s) \in \mathbb{R}^s \setminus \{0\}} \operatorname{rank}\left( \sum_{i=1}^s c_i A_i \right) > 0.$$
I am interested in the question of how large $$m$$ can be.
I am not sure where to start looking; if someone could point me to a good reference, it would be very appreciated! Any comments are appreciated, too!

## If two matrices are row equivalent, will their transposes be row equivalent?

Let’s suppose I have two matrices $$A$$ and $$B$$ of the same dimensions, and suppose $$A$$ and $$B$$ are row equivalent. Will $$A^T$$ be row equivalent to $$B^T$$? If yes, how can it be proved?
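One way to probe the claim (a sketch assuming SymPy is available): two same-shape matrices are row equivalent exactly when they have the same reduced row echelon form, so a simple row-swap example is easy to test.

```python
import sympy as sp

def row_equivalent(A, B):
    # Same shape assumed; row equivalence <=> identical RREF.
    return A.rref()[0] == B.rref()[0]

A = sp.Matrix([[1, 0], [0, 0]])
B = sp.Matrix([[0, 0], [1, 0]])    # A with its rows swapped

print(row_equivalent(A, B))        # B is obtained from A by a row swap
print(row_equivalent(A.T, B.T))    # the transposes need not follow
```

Here $$A$$ and $$B$$ are row equivalent, but $$A^T$$ and $$B^T$$ have row spaces spanned by $$(1,0)$$ and $$(0,1)$$ respectively, so the answer to the question is no in general.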

## matrices – Derivative of a Matrix w.r.t. its Matrix Square, \$\frac{\partial \text{vec}X}{\partial\text{vec}(XX')}\$

Let $$X$$ be a nonsingular square matrix.

What is
$$\frac{\partial \text{vec}X}{\partial\text{vec}(XX')},$$
where the vec operator stacks all columns of a matrix in a single column vector?

It is easy to derive that
$$\frac{\partial\text{vec}(XX')}{\partial \text{vec}X} = (I + K)(X \otimes I),$$
where $$K$$ is the commutation matrix that is defined by
$$\text{vec}(X) = K\text{vec}(X').$$

Now $$(I + K)(X \otimes I)$$ is a singular matrix, so that the intuitive solution
$$\frac{\partial \text{vec}X}{\partial\text{vec}(XX')} = \left( \frac{\partial\text{vec}(XX')}{\partial \text{vec}X} \right)^{-1}$$
does not work.

Is the solution simply the Moore-Penrose inverse of $$(I + K)(X \otimes I)$$, or is it more complicated?
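The closed form above, and its singularity, are easy to confirm numerically; since $$XX'$$ is quadratic in $$X$$, central differences are exact up to rounding. A NumPy sketch, with $$K$$ built directly from the definition $$\text{vec}(X)=K\text{vec}(X')$$:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
X = rng.normal(size=(n, n))

def vec(M):
    return M.reshape(-1, order="F")    # stack columns

# Commutation matrix K: vec(X) = K vec(X^T) for every X, so
# K[i + n*j, j + n*i] = 1.
K = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        K[i + n * j, j + n * i] = 1

# Finite-difference Jacobian of vec(X X^T) w.r.t. vec(X).
eps = 1e-6
Jac = np.zeros((n * n, n * n))
for k in range(n * n):
    E = np.zeros(n * n)
    E[k] = eps
    Xp = X + E.reshape(n, n, order="F")
    Xm = X - E.reshape(n, n, order="F")
    Jac[:, k] = (vec(Xp @ Xp.T) - vec(Xm @ Xm.T)) / (2 * eps)

# Compare with the closed form (I + K)(X kron I).
closed = (np.eye(n * n) + K) @ np.kron(X, np.eye(n))
print(np.max(np.abs(Jac - closed)))    # ~ 0
print(np.linalg.matrix_rank(closed))   # n(n+1)/2 = 6: singular, as stated
```

The rank $$n(n+1)/2$$ reflects that $$\text{vec}(XX')$$ lives in the subspace of symmetric matrices, so any answer to the original question can at best be an inverse restricted to that subspace, which is what a Moore-Penrose-style argument has to contend with.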

## matrices – Matrices whose elements are matrices

I’ve worked with matrices whose elements are objects in a field, such as real numbers, complex numbers, or even functions in a function space. Today I was talking to a friend, and he asked me about something he saw during his PhD in computer science: “matrices with matrices in their entries.” I know that we can arrange the blocks in the entries to form an ordinary matrix in an $$n\times n$$ space, for some $$n$$, but what use does a matrix with this characteristic have? Is there any example of how it is useful?
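A minimal illustration with NumPy: blocks are assembled into an ordinary matrix with `np.block`, and when the block shapes conform, block-wise multiplication follows the same rule as scalar entries. That is one practical payoff of the block point of view (partitioned solvers, Schur complements, sparse block storage).

```python
import numpy as np

# A 2x2 "matrix of matrices": each entry is itself a 2x2 block.
A = np.array([[1, 2], [3, 4]])
Z = np.zeros((2, 2), dtype=int)    # zero block
I = np.eye(2, dtype=int)           # identity block

M = np.block([[A, Z],
              [Z, I]])             # flatten the blocks into a 4x4 matrix

# Block multiplication mimics the scalar rule: the (1,1) block of M @ M
# is A @ A + Z @ Z = A @ A, and so on.
assert np.array_equal(M @ M, np.block([[A @ A, Z],
                                       [Z, I]]))
print(M)
```

Treating the blocks as single symbols lets you multiply, invert, and reason about large structured matrices one block at a time instead of entry by entry.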