linear algebra – Prove that the Rayleigh quotient of a self-adjoint positive-semidefinite operator with continuous inverse has a positive lower bound.

Let $V$ be a Hilbert space. Let $A \colon v \in V \mapsto Av \in V$ be a self-adjoint, continuous, bijective linear operator with continuous inverse such that
\begin{equation}
(Av, v) \geq 0, \quad \forall v \in V.
\end{equation}

How can I prove that there is a constant $\alpha > 0$ such that $(Av, v) \geq \alpha \|v\|^2$? Can someone help me?
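In finite dimensions the natural candidate for the constant is $\alpha = 1/\|A^{-1}\|$, which for a symmetric positive-definite matrix equals its smallest eigenvalue. The snippet below is only a finite-dimensional numpy sanity check of that candidate (an illustration, not a proof; the matrix is an arbitrary example):

```python
import numpy as np

# Finite-dimensional sanity check: for a symmetric positive-definite A,
# the best constant alpha with (Av, v) >= alpha ||v||^2 is
# alpha = lambda_min(A) = 1 / ||A^{-1}||_2.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B @ B.T + 0.1 * np.eye(n)            # symmetric positive definite

lam_min = np.linalg.eigvalsh(A)[0]       # smallest eigenvalue (ascending order)
alpha = 1.0 / np.linalg.norm(np.linalg.inv(A), 2)
assert np.isclose(lam_min, alpha)

# (Av, v) >= alpha * ||v||^2 on random test vectors
for _ in range(1000):
    v = rng.standard_normal(n)
    assert v @ A @ v >= alpha * (v @ v) - 1e-9
```

One standard route in the Hilbert-space setting gives the same constant: by the spectral theorem the spectrum of $A$ lies in $[\alpha, \infty)$ with $\alpha = 1/\|A^{-1}\| > 0$, since $A$ is self-adjoint with nonnegative quadratic form and $0$ does not belong to the (closed) spectrum.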

linear algebra – projection onto the set of permuted lower triangular matrices

Let $G$ be the group of all $n \times n$ permutation matrices and let $V$ be the vector space of all $n \times n$ lower triangular matrices. I define the set
$$
X = \{ P \cdot L \cdot P^T \mid P \in G, \; L \in V \},
$$

where "$\cdot$" is matrix multiplication. That is, $X$ is the set of all lower triangular matrices whose rows and columns have been permuted simultaneously by the same permutation.

I am interested in the orthogonal projection of a general matrix $M \in \mathbb{R}^{n \times n}$ onto the set $X \subset \mathbb{R}^{n \times n}$. Does anyone know a more efficient method than projecting onto each subspace
$$
V_P = \{ P \cdot L \cdot P^T \mid L \in V \}
$$

for each $P \in G$ and then taking the nearest point?

Thank you in advance!
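For a fixed $P$, the Frobenius-norm projection onto $V_P$ has a closed form: conjugate $M$ by the permutation, zero out the strictly upper-triangular part, and conjugate back. Here is a brute-force numpy sketch of the "try every $P$" baseline (illustration only; the function name is mine, and the cost is $O(n! \cdot n^3)$, which is exactly what a better method would need to beat):

```python
import itertools
import numpy as np

def project_onto_X(M):
    """Brute-force Frobenius projection of M onto
    X = { P L P^T : P a permutation matrix, L lower triangular }.

    For fixed P, min ||M - P L P^T||_F = min ||P^T M P - L||_F, so the
    optimal L is the lower-triangular part of P^T M P.  We then take the
    best P over all n! permutations.
    """
    n = M.shape[0]
    best, best_dist = None, np.inf
    for perm in itertools.permutations(range(n)):
        P = np.eye(n)[list(perm)]          # permutation matrix for this ordering
        L = np.tril(P.T @ M @ P)           # nearest lower-triangular matrix in permuted coordinates
        cand = P @ L @ P.T
        dist = np.linalg.norm(M - cand)    # Frobenius distance
        if dist < best_dist:
            best, best_dist = cand, dist
    return best, best_dist

# Sanity check on a matrix that is already in X (identity permutation):
M = np.array([[1., 0., 0.],
              [2., 3., 0.],
              [4., 5., 6.]])
proj, dist = project_onto_X(M)
assert dist < 1e-12 and np.allclose(proj, M)
```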

abstract algebra – is the ideal $I = (X + Y, X - Y)$ in the polynomial ring $\mathbb{C}[X,Y]$ a prime ideal?

Is the ideal $I = (X + Y, X - Y)$ in the polynomial ring $\mathbb{C}[X,Y]$ a prime ideal? Justify your answer.

I hesitate a bit to ask this. The question is not "How do I solve this problem?" but rather "What do I have to learn first to solve these kinds of problems?".

I have little to no grounding in the subject of rings and fields, let alone prime ideals.

What does $\mathbb{C}[X,Y]$ mean here? Polynomials in two variables $X$ and $Y$ whose coefficients are chosen from the complex numbers? A Google search shows that prime ideals behave much like prime numbers. Intuitively, an ideal is a special subset of a ring that is closed under multiplication by elements of the ring (on the left or right; if the ring is commutative, the distinction does not matter). So how does this apply to $I = (X + Y, X - Y)$?

It would also be very helpful if someone could tell me how to build up my skills in this particular area of mathematics (I understand group theory moderately well, though not deeply).

This question is admittedly vague, rather opinion-based, and perhaps unsuitable for this site; I'm confused about that myself. Any help is appreciated.
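One concrete way in: since $2$ is invertible, $X = \frac{1}{2}\big((X+Y)+(X-Y)\big) \in I$ and $Y = \frac{1}{2}\big((X+Y)-(X-Y)\big) \in I$, so $I = (X, Y)$; then $\mathbb{C}[X,Y]/I \cong \mathbb{C}$ is a field, so $I$ is maximal, and in particular prime. A small sympy check of the first step (a sketch; sympy computes over $\mathbb{Q}$ here, which is enough since the generators have rational coefficients):

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# A reduced Groebner basis of the ideal (x + y, x - y):
G = groebner([x + y, x - y], x, y, order='lex')
assert set(G.exprs) == {x, y}   # so I = (x, y)

# Hence C[x, y]/I is isomorphic to C, a field: I is maximal, hence prime.
```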

linear algebra – If $P(x)$ is the characteristic polynomial of $A$, is it true that $P(A) = 0$?

I am a student and have just read about the characteristic polynomial on Wikipedia.
I have a feeling that:

If $P(x)$ is the characteristic polynomial of $A$, then $P(A) = 0$.

Using a matrix calculator, I tested this for the matrix from the Wikipedia example and for another matrix, and it was true for both. The matrices were $\begin{bmatrix} 2 & 1 \\ -1 & 0 \end{bmatrix}$ and $\begin{bmatrix} -1 & -1 \\ 1 & 1 \end{bmatrix}$.

I thought about it: when the eigenvectors of the matrix span the space, we can write every vector in the space as a linear combination of eigenvectors. So for every $v$ in the space, $P(A)v = P(A)(\alpha_1 e_1 + \alpha_2 e_2 + \dots)$, and it is easy to see that $P(A)e_i = 0$ for every $e_i$.

BUT the problem is that matrices like $\begin{bmatrix} -1 & -1 \\ 1 & 1 \end{bmatrix}$ do not have $n$ distinct eigenvectors spanning the space. I have no idea how to handle this case.

Any help would be appreciated.
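Your feeling is the Cayley–Hamilton theorem: $P(A) = 0$ holds for every square matrix, with no assumption that the eigenvectors span the space. One standard way past the defective case uses the adjugate identity $\operatorname{adj}(tI - A)(tI - A) = P(t)\,I$; over $\mathbb{C}$ one can also argue that diagonalizable matrices are dense and pass to the limit. A quick numpy check on both of your matrices (for a $2 \times 2$ matrix, $P(t) = t^2 - \operatorname{tr}(A)\,t + \det(A)$):

```python
import numpy as np

def char_poly_at(A):
    """Evaluate the characteristic polynomial of a 2x2 matrix at the matrix
    itself: P(t) = t^2 - tr(A) t + det(A), so P(A) = A^2 - tr(A) A + det(A) I."""
    return A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)

A1 = np.array([[2., 1.], [-1., 0.]])    # the Wikipedia example
A2 = np.array([[-1., -1.], [1., 1.]])   # defective: eigenvalue 0 twice, one eigenvector

assert np.allclose(char_poly_at(A1), 0)
assert np.allclose(char_poly_at(A2), 0)
```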

commutative algebra – special case of the isomorphism $S \otimes_R \mathrm{Hom}_R(M, N) \simeq \mathrm{Hom}_S(S \otimes_R M, S \otimes_R N)$

So we have the $S$-module isomorphism $$S \otimes_R \mathrm{Hom}_R(M, N) \simeq \mathrm{Hom}_S(S \otimes_R M, S \otimes_R N)$$ given that $S$ is flat over $R$ and $M$ is finitely presented. Even if $S$ is not flat or $M$ is not finitely presented, there is at least an $S$-homomorphism between the two modules.

What I am considering is a restriction of this isomorphism. Let $\varphi \in \mathrm{End}_R(M)$, and let $1 \otimes_R \varphi$ denote the induced endomorphism of $S \otimes_R M$. Does $$S \otimes_R R[\varphi] \simeq S[1 \otimes_R \varphi]$$
hold without the condition that $S$ is flat over $R$ or that $M$ is finitely presented?

Given that $S$ is flat over $R$ and $M$ is finitely presented, $S \otimes_R R[\varphi]$ is a submodule of $S \otimes_R \mathrm{End}_R(M)$, and the isomorphism in question follows easily.

It is obvious that the induced homomorphism is surjective, but I cannot quite verify whether it is injective.

  1. Does the isomorphism hold without flatness or finite presentation?
  2. If it does hold, what is the structural difference between $R[\varphi]$ and the whole of $\mathrm{End}_R(M)$ that makes the isomorphism possible for one side and not the other?
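For reference, if I am reading the setup correctly, the map in question is the canonical $S$-algebra homomorphism
$$
\mu \colon S \otimes_R R[\varphi] \longrightarrow \mathrm{End}_S(S \otimes_R M), \qquad s \otimes p(\varphi) \longmapsto s \cdot p(1 \otimes_R \varphi),
$$
whose image is $S[1 \otimes_R \varphi]$ by definition. Surjectivity onto $S[1 \otimes_R \varphi]$ is thus automatic, and the entire question reduces to whether $\mu$ can have a nonzero kernel when flatness or finite presentation fails.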

dg.differential geometry – metric structures that make cohomology a module over a Lie algebra


algebra precalculus – $x_1 x_2 x_3 x_4 + x_2 x_3 x_4 x_5 + \dots + x_n x_1 x_2 x_3 = 0$: what is $n$?

Can someone please help me understand the following problem? (The full problem statement was attached as an image, which is not reproduced here.)

It's really hard for me to understand this problem. What I understood is that I have to find a natural number $n$ for which the equation holds whatever the values of the $x_i$. That's impossible, so it seems some conditions on the $x_i$ are necessary.

linear algebra – matrix decomposition into a certain form

Can we prove that any real-valued $d \times d$ matrix $A$ can be decomposed into a finite product of matrices of the form

$A = \prod_{i=1}^n (I + R_i)$

where $I$ is the identity matrix and $\mathrm{rank}(R_i) = 1$?

As far as I know, if we have an $LDU$ or $LU$ decomposition of $A$, then we can easily find all the $R_i$.
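To make the $LDU$ remark concrete, here is a hand-built $2 \times 2$ numpy illustration (a sketch of the easy case only: each unit-triangular Gauss factor and each single-entry diagonal scaling is $I$ plus a rank-one matrix; singular matrices and pivoting need extra care):

```python
import numpy as np

# Hand-built example of A = prod_i (I + R_i) with rank(R_i) = 1, read off
# from the LDU factorization of A = [[2, 1], [1, 3]]:
#   L = [[1, 0], [1/2, 1]],  D = diag(2, 5/2),  U = [[1, 1/2], [0, 1]],
# where D itself splits into two single-entry scalings.
I = np.eye(2)
E = lambda i, j: np.outer(I[i], I[j])      # rank-one "matrix unit" e_i e_j^T

factors = [I + 0.5 * E(1, 0),   # unit lower-triangular Gauss factor L
           I + 1.0 * E(0, 0),   # scale row 0 by 2    (2   = 1 + 1)
           I + 1.5 * E(1, 1),   # scale row 1 by 5/2  (5/2 = 1 + 3/2)
           I + 0.5 * E(0, 1)]   # unit upper-triangular factor U

for F in factors:
    assert np.linalg.matrix_rank(F - I) == 1   # each correction R_i is rank one

A = np.linalg.multi_dot(factors)
assert np.allclose(A, [[2., 1.], [1., 3.]])
```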

abstract algebra – show that $\mathbb{C}[x,y]/(xy-1)$ is a principal ideal domain

I have to show that $A := \mathbb{C}[x,y]/(xy-1)$ is a principal ideal domain.

I know that $A$ is isomorphic to $\mathbb{C}[t, t^{-1}]$ and that this is a subring
of $\mathbb{C}(t)$.

But I am supposed to work with the canonical inclusion map $i \colon \mathbb{C}[x] \hookrightarrow A$ and first show that $I \cap \mathrm{Im}(i) \neq \{0\}$ for every ideal $I \neq \{0\}$. I have no idea why I need this.
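In case it helps, here is a sketch of why that intersection step is useful (my reading of the intended exercise, not necessarily the official solution): under the isomorphism $A \simeq \mathbb{C}[t, t^{-1}]$ with $i(x) = t$, every nonzero $f \in I$ is a Laurent polynomial, so
$$
t^N f \in \big(I \cap \mathrm{Im}(i)\big) \setminus \{0\} \quad \text{for } N \text{ large enough},
$$
since multiplying by the unit $t^N$ stays inside the ideal and clears the negative powers. Then $i^{-1}(I \cap \mathrm{Im}(i))$ is a nonzero ideal of the PID $\mathbb{C}[x]$, hence of the form $(h)$, and one checks that $h$ generates $I$ in $A$.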