Numerical integration – loss of precision for higher-order orthogonal polynomials using Orthogonalize

Context

To understand the growth of structure in the universe,
I am interested in characterizing the curvature of random fields like this one:

(image: example of a random field)

For this purpose I start from the PDF of the eigenvalues of the second derivative of the field (a measure of the local curvature). For a Gaussian random field this PDF reads

Pdf1 = 2 Sqrt[2/\[Pi]] (x1 - x2) Exp[1/2 (-x1 (3 x1 - x2) - x2 (3 x2 - x1))]

and looks like this:

rg = {{x1, -Infinity, Infinity}, {x2, -Infinity, x1}};
ContourPlot[Pdf1, Sequence @@ rgn // Evaluate, ImageSize -> Small]

(image: contour plot of Pdf1)

However, if the field is not Gaussian (as above), the PDF can be very different:

(image: contour plot of a non-Gaussian PDF)

My purpose is to use orthogonal polynomials to represent this non-Gaussian PDF.

Attempt

I have defined a scalar product

Clear[int]; int[a_, b_] :=
 Integrate[a b Pdf1, Sequence @@ rg // Evaluate]

and a numerical integration version thereof

Clear[nint]; nint[a_, b_] :=
 NIntegrate[a b Pdf1, Sequence @@ rg // Evaluate]
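Since everything downstream depends on how accurately this scalar product is evaluated, one possible mitigation is a higher-precision variant of it. A sketch (nintHP is a made-up name, and the WorkingPrecision/PrecisionGoal values are illustrative, not tuned):

```mathematica
(* sketch: same inner product, but evaluated at higher working precision;
   Pdf1 is exact, so NIntegrate can track the extra digits *)
Clear[nintHP];
nintHP[a_, b_] :=
 NIntegrate[a b Pdf1, Sequence @@ rg // Evaluate,
  WorkingPrecision -> 30, PrecisionGoal -> 15]
```

This is slower, but it delays the point at which rounding errors swamp the Gram-Schmidt recursion.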

I define my basis of 2D polynomials

p = 2; pol0 =
 Table[Table[x1^i x2^(p1 - i), {i, 0, p1}] // Flatten // Union,
   {p1, 0, p}] // Flatten

{1, x1, x2, x1^2, x1 x2, x2^2}

pol = Orthogonalize[pol0, int[#1, #2] &]

They look like this:

Map[ContourPlot[# Pdf1, Sequence @@ rgn // Evaluate,
   PlotPoints -> 15, PlotRange -> All] &, pol]

(image: contour plots of the orthogonal polynomials weighted by Pdf1)

If I do the same numerically

pol = Orthogonalize[pol0, nint[#1, #2] &, Method -> "Reorthogonalization"];

I get the same answer.
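As a quick sanity check on this (a sketch using the numerical scalar product defined above, here called nint), the Gram matrix of the orthogonalized set should come out as the identity:

```mathematica
(* sketch: pairwise scalar products of the orthogonalized polynomials;
   the level spec 1 makes Outer treat each polynomial as an atom *)
Outer[nint[#1, #2] &, pol, pol, 1] // Chop // MatrixForm
```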

But

if I try to obtain the higher-order polynomials numerically

p = 6; pol0 =
  Table[Table[x1^i x2^(p1 - i), {i, 0, p1}] // Flatten // Union,
    {p1, 0, p}] // Flatten;
pol = Orthogonalize[pol0, nint[#1, #2] &, Method -> "Reorthogonalization"];

I get

(* {1., 1.81473 x1 - 0.804133, -1.53835 x1 + 2.37903 x2 + 1.73585,
   2.33148 x1^2 - 2.23663 x1 + 0.170394 x2 - 0.0991482,
   -2.84065 x1^2 + 4.36636 x1 x2 + 4.97902 x1 - 2.4656 x2 + 1.90466,
   1.84722 x1^2 - 5.47403 x1 x2 - 4.89841 x1 + 4.11307 x2^2 + 6.90647 x2 + 2.25076,
   0, 3.15107 x1^2 x2 + 1.83935 x1^2 - 3.44839 x1 x2 - 3.23326 x1 + 0.212753 x2^2 + 2.121619 x2 + 0.676981,
   0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0} *)

In other words, the loss of accuracy in the orthogonalization produces
identically zero polynomials at higher order.


I also tried Gram-Schmidt by hand (from somewhere on SE)

gs[vecs_, ip___] := Module[{ovecs = vecs},
  Do[ovecs[[i]] -= Projection[ovecs[[i]], ovecs[[j]], ip],
   {i, 2, Length[vecs]}, {j, 1, i - 1}];
  ovecs];
pol1 = gs[pol0, Function[
    NIntegrate[## Pdf1, Sequence @@ rg // Evaluate]]];

but it gives the same loss of accuracy.


I also tried eigenvectors of the matrix of scalar products

mat = ParallelTable[int[i, j], {i, pol0}, {j, pol0}];
eigs = Eigensystem[mat // N];

However, the resulting orthogonal polynomials do not come out in order of increasing degree:

Map[ContourPlot[# Pdf1, Sequence @@ rgn // Evaluate,
   PlotPoints -> 15, PlotRange -> All] &,
 pol = eigs[[2]].pol0/Sqrt[eigs[[1]]] // Chop]

(image: contour plots of the eigenvector-based polynomials weighted by Pdf1)

Note that while these polynomials are orthogonal, they are not orthogonalized in the same order as the Gram-Schmidt ones.
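One way to keep the degree ordering while still working from the matrix of scalar products (mat above) is to use a Cholesky factorization instead of Eigensystem. This is a sketch of a substitute technique, not something from the original attempt: because the inverse of a triangular factor is itself triangular, each resulting polynomial only mixes basis elements up to its own position in pol0.

```mathematica
(* sketch: mat = lt.Transpose[lt] with lt lower triangular;
   Inverse[lt].pol0 is then orthonormal w.r.t. the scalar product
   and keeps the ordering of pol0 *)
lt = Transpose[CholeskyDecomposition[mat // N]];  (* lower-triangular factor *)
polChol = Expand[Inverse[lt].pol0] // Chop;
```

This is essentially Gram-Schmidt done in matrix form, so it still needs an accurate Gram matrix, but it separates the (possibly exact) moment computation from the linear algebra.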

Question

How can I compute the higher-order orthogonal polynomials accurately?

Additional thoughts

One option would be to evaluate the scalar products symbolically,
but for higher-order polynomials this seems to take forever.

Is there a way to tell Orthogonalize not to normalize the polynomials, given that the normalization is what takes longest symbolically?
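Orthogonalize always returns a normalized basis, but a hand-rolled Gram-Schmidt without the normalization step avoids the square roots entirely, since only ratios of scalar products appear. A sketch (gsNoNorm is a made-up name; int is the symbolic scalar product defined earlier):

```mathematica
(* sketch: Gram-Schmidt without normalization; no Sqrt is introduced,
   so the symbolic coefficients stay rational throughout *)
gsNoNorm[vecs_, ip_] := Module[{ovecs = vecs},
  Do[ovecs[[i]] -=
    (ip[ovecs[[i]], ovecs[[j]]]/ip[ovecs[[j]], ovecs[[j]]]) ovecs[[j]],
   {i, 2, Length[vecs]}, {j, 1, i - 1}];
  Expand[ovecs]]

polUnnormalized = gsNoNorm[pol0, int];
```

The resulting polynomials are mutually orthogonal but not unit-norm; if needed, they can be normalized numerically at the very end, where the precision of a single square root is no longer critical.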