## Estimation of the convolution of two multiplicative functions

Let $$f, g: \mathbb{N} \to \mathbb{C}$$ be two multiplicative arithmetic functions.
Suppose we know the asymptotic behavior of $$f$$ and of $$g$$. Is there a general result for the asymptotic behavior of the Dirichlet convolution $$f * g$$?
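The question is about asymptotics rather than computation, but for concreteness, here is a small Python sketch (my own illustration, not from the question) of the Dirichlet convolution in question, together with a numerical check of the standard fact that the convolution of two multiplicative functions is again multiplicative:

```python
from math import gcd

def dirichlet_convolution(f, g, n):
    """(f*g)(n) = sum over divisors d of n of f(d) * g(n // d)."""
    return sum(f(d) * g(n // d) for d in range(1, n + 1) if n % d == 0)

# Two classical multiplicative functions: Id(n) = n and 1(n) = 1.
f = lambda n: n
g = lambda n: 1
sigma = lambda n: dirichlet_convolution(f, g, n)  # Id * 1 = sigma (sum of divisors)

# The convolution of multiplicative functions is multiplicative:
# sigma(m*k) = sigma(m) * sigma(k) whenever gcd(m, k) = 1.
for m, k in [(3, 4), (5, 8), (7, 9)]:
    assert gcd(m, k) == 1 and sigma(m * k) == sigma(m) * sigma(k)
print(sigma(12))  # 1 + 2 + 3 + 4 + 6 + 12 = 28
```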

## Machine learning – Sample complexity of mean estimation with the empirical estimator and the median-of-means estimator

For a random variable $$X$$ with unknown mean $$\mu$$ and variance $$\sigma^2$$, we want to produce an estimate $$\hat{\mu}$$ based on $$n$$ i.i.d. samples of $$X$$ such that $$\lvert \hat{\mu} - \mu \rvert \leq \epsilon \sigma$$ with probability at least $$1 - \delta$$.

Empirical estimator: Why are $$O(\epsilon^{-2} \cdot \delta^{-1})$$ samples sufficient? Why are $$\Omega(\epsilon^{-2} \cdot \delta^{-1})$$ samples necessary?

Median-of-means estimator: Why do $$O(\epsilon^{-2} \cdot \log \frac{1}{\delta})$$ samples suffice?
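For intuition on the $$O(\epsilon^{-2} \cdot \log \frac{1}{\delta})$$ bound, here is a short Python sketch (my own, with illustrative constants) of the median-of-means construction: split the samples into $$k \approx 8 \ln \frac{1}{\delta}$$ groups, average each group, and report the median of the group means. Chebyshev makes each group mean accurate with constant probability, and a Chernoff bound boosts the median to confidence $$1 - \delta$$:

```python
import math
import random
import statistics

def median_of_means(samples, delta):
    # Number of groups ~ 8 * ln(1/delta); each group gets n/k samples.
    k = max(1, min(len(samples), math.ceil(8 * math.log(1 / delta))))
    group = len(samples) // k
    means = [statistics.fmean(samples[i * group:(i + 1) * group]) for i in range(k)]
    return statistics.median(means)

random.seed(0)
# Skewed data where the raw empirical mean is fragile at small n.
data = [random.expovariate(1.0) for _ in range(2000)]  # true mean = 1
est = median_of_means(data, delta=0.01)
print(abs(est - 1.0) < 0.3)
```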

## nonlinear – Parameter estimation for the Gaussian function in Gaussian noise

The mathematical model can be given as follows:

$$x(n) = a \exp\left(-(n - k)^2 / s^2\right) + w(n)$$

Here $$w(n)$$ is a zero-mean Gaussian noise process with variance $$3 \cdot 10^{-5}$$. The problem is to estimate the parameters $$a$$, $$k$$ and $$s$$ as accurately as possible. The data samples arrive sequentially, so ideally the algorithm should be able to produce estimates on the fly, with the estimation quality improving as the total number of samples grows.

I started with the case where there is no noise at all.

I have used the following code on the data below, and it works quite well. The problem arises when I add noise to the data to simulate the mathematical model given above.

nlm = NonlinearModelFit[mydata[[1 ;; 300]], a*Exp[-(x - k)^2/s^2], {a, k, s}, x]


for example:

mydata2 = mydata + RandomVariate[NormalDistribution[0, Sqrt[3*10^-5]], 400];

nlm = NonlinearModelFit[mydata2[[1 ;; 300]], a*Exp[-(x - k)^2/s^2], {a, k, s}, x]


or

nlm = NonlinearModelFit[mydata2[[1 ;; 400]], a*Exp[-(x - k)^2/s^2], {a, k, s}, x]


neither yields meaningful results.

What do I have to do to make this work with NonlinearModelFit? Or, more generally, with maximum likelihood estimation (MLE)? I have tried MLE, but I cannot obtain anything in closed form except for the amplitude $$a$$.
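For what it's worth, here is a minimal sketch in Python rather than Mathematica (with synthetic data whose peak lies inside the observation window; all names and values are my own illustration): standard nonlinear least squares usually recovers $$a$$, $$k$$ and $$s$$ from noisy samples once the optimizer is given data-driven starting values, which suggests also passing explicit starting values to NonlinearModelFit:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(n, a, k, s):
    # Model from the question: x(n) = a * exp(-(n - k)^2 / s^2)
    return a * np.exp(-((n - k) ** 2) / s ** 2)

rng = np.random.default_rng(1)
n = np.arange(1, 401, dtype=float)
a_true, k_true, s_true = 0.5, 200.0, 60.0
x = gaussian(n, a_true, k_true, s_true) + rng.normal(0.0, np.sqrt(3e-5), n.size)

# Data-driven starting values: peak height, peak location, rough width.
p0 = [x.max(), n[np.argmax(x)], (n[-1] - n[0]) / 4.0]
popt, _ = curve_fit(gaussian, n, x, p0=p0)
a_hat, k_hat, s_hat = popt
print(a_hat, k_hat, abs(s_hat))
```

With noise variance $$3 \cdot 10^{-5}$$ the recovered parameters land very close to the true values here; note that only $$|s|$$ is identifiable, since $$s$$ enters the model squared.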

mydata={{1, 0.0004696016896143079}, {2,
0.0004754998605167475}, {3, 0.00048146795030926896}, {4,
0.0004875067327344009}, {5, 0.000493616989397984}, {6,
0.0004997995098406223}, {7, 0.0005060550916096482}, {8,
0.0005123845403316832}, {9, 0.0005187886697856992}, {10,
0.0005252683019766862}, {11, 0.0005318242672098824}, {12,
0.0005384574041655253}, {13, 0.0005451685599742198}, {14,
0.0005519585902928752}, {15, 0.0005588283593811737}, {16,
0.0005657787401786761}, {17, 0.0005728106143824625}, {18,
0.0005799248725254053}, {19, 0.0005871224140549774}, {20,
0.0005944041474126807}, {21, 0.0006017709901140973}, {22,
0.0006092238688294729}, {23, 0.0006167637194649574}, {24,
0.0006243914872444208}, {25, 0.0006321081267919121}, {26,
0.0006399146022146664}, {27, 0.0006478118871867838}, {28,
0.0006558009650335242}, {29, 0.0006638828288161673}, {30,
0.0006720584814175511}, {31, 0.0006803289356282413}, {32,
0.0006886952142332748}, {33, 0.0006971583500995931}, {34,
0.0007057193862640794}, {35, 0.0007143793760222815}, {36,
0.0007231393830176956}, {37, 0.0007320004813317731}, {38,
0.0007409637555745676}, {39, 0.0007500303009759804}, {40,
0.0007592012234777229}, {41, 0.0007684776398259019}, {42,
0.0007778606776643128}, {43, 0.0007873514756283212}, {44,
0.0007969511834394849}, {45, 0.0008066609620008387}, {46,
0.0008164819834928088}, {47, 0.0008264154314698429}, {48,
0.0008364625009577514}, {49, 0.0008466243985516465}, {50,
0.0008569023425146643}, {51, 0.0008672975628773075}, {52,
0.0008778113015375558}, {53, 0.0008884448123615773}, {54,
0.0008991993612852293}, {55, 0.0009100762264162467}, {56,
0.000921076698137077}, {57, 0.000932202079208501}, {58,
0.0009434536848739328}, {59, 0.0009548328429644679}, {60,
0.000966340894004572}, {61, 0.0009779791913185954}, {62,
0.0009897491011379713}, {63, 0.0010016520027091057}, {64,
0.0010136892884020524}, {65, 0.001025862363819943}, {66,
0.0010381726479090703}, {67, 0.0010506215730697898}, {68,
0.0010632105852681447}, {69, 0.001075941144148263}, {70,
0.0010888147231454353}, {71, 0.0011018328096000214}, {72,
0.001114996904872117}, {73, 0.0011283085244569028}, {74,
0.001141769198100848}, {75, 0.0011553804699186276}, {76,
0.0011691438985108552}, {77, 0.0011830610570825238}, {78,
0.0011971335335622672}, {79, 0.0012113629307224428}, {80,
0.0012257508662998923}, {81, 0.0012402989731175613}, {82,
0.0012550088992069328}, {83, 0.0012698823079311588}, {84,
0.0012849208781090693}, {85, 0.001300126304139942}, {86,
0.0013155002961291102}, {87, 0.0013310445800143}, {88,
0.0013467608976928378}, {89, 0.0013626510071496933}, {90,
0.0013787166825862166}, {91, 0.0013949597145498226}, {92,
0.0014113819100643969}, {93, 0.0014279850927616303}, {94,
0.0014447711030130214}, {95, 0.0014617417980628403}, {96,
0.0014788990521619123}, {97, 0.0014962447567021162}, {98,
0.0015137808203518409}, {99, 0.0015315091691922803}, {100,
0.0015494317468544429}, {101, 0.0015675505146571435}, {102,
0.0015858674517457525}, {103, 0.0016043845552318901}, {104,
0.0016231038403338208}, {105, 0.0016420273405178289}, {106,
0.0016611571076404561}, {107, 0.0016804952120914654}, {108,
0.0017000437429378116}, {109, 0.0017198048080683862}, {110,
0.0017397805343397233}, {111, 0.001759973067722423}, {112,
0.001780384573448571}, {113, 0.0018010172361600572}, {114,
0.0018218732600576001}, {115, 0.0018429548690508228}, {116,
0.0018642643069091724}, {117, 0.0018858038374136323}, {118,
0.0019075757445094173}, {119, 0.0019295823324595143}, {120,
0.0019518259259991753}, {121, 0.001974308870491162}, {122,
0.001997033532082018}, {123, 0.0020200022978592415}, {124,
0.00204321757600922}, {125, 0.0020666817959762093}, {126,
0.0020903974086221343}, {127, 0.002114366886387389}, {128,
0.0021385927234523765}, {129, 0.0021630774359001114}, {130,
0.0021878235618797226}, {131, 0.002212833661770739}, {132,
0.0022381103183484164}, {133, 0.0022636561369500053}, {134,
0.0022894737456417603}, {135, 0.0023155657953870647}, {136,
0.0023419349602153542}, {137, 0.0023685839373920855}, {138,
0.0023955154475894373}, {139, 0.002422732235058113}, {140,
0.0024502370678000586}, {141, 0.0024780327377419684}, {142,
0.0025061220609098843}, {143, 0.002534507877604617}, {144,
0.00256319305257824}, {145, 0.0025921804752112995}, {146,
0.0026214730596911488}, {147, 0.0026510737451912337}, {148,
0.0026809854960511046}, {149, 0.002711211301957591}, {150,
0.002741754178126894}, {151, 0.002772617165487437}, {152,
0.002803803330863877}, {153, 0.00283531576716194}, {154,
0.002867157593554324}, {155, 0.002899331955667316}, {156,
0.0029318420257686026}, {157, 0.002964691002955967}, {158,
0.002997882113346801}, {159, 0.0030314186102687224}, {160,
0.003065303774451082}, {161, 0.0030995409142175016}, {162,
0.00313413336567916}, {163, 0.0031690844929292495}, {164,
0.003204397688238352}, {165, 0.00324007637225061}, {166,
0.0032761239941810146}, {167, 0.0033125440320136814}, {168,
0.0033493399927008493}, {169, 0.0033865154123630826}, {170,
0.003424073856490291}, {171, 0.0034620189201438464}, {172,
0.003500354228159393}, {173, 0.0035390834353508653}, {174,
0.0035782102267154447}, {175, 0.0036177383176392553}, {176,
0.0036576714541042455}, {177, 0.003698013412895921}, {178,
0.0037387680018121402}, {179, 0.0037799390598726064}, {180,
0.003821530457529599}, {181, 0.0038635460968796066}, {182,
0.003905989911875701}, {183, 0.003948865868541081}, {184,
0.003992177965183619}, {185, 0.004035930232611049}, {186,
0.004080126734347439}, {187, 0.004124771566850392}, {188,
0.00416986885972943}, {189, 0.004215422775964974}, {190,
0.004261437512128586}, {191, 0.004307917298604109}, {192,
0.00435486639980953}, {193, 0.004402289114420054}, {194,
0.004450189775591967}, {195, 0.00449857275118758}, {196,
0.004547442444000855}, {197, 0.004596803291984207}, {198,
0.00464665976847628}, {199, 0.004697016382430332}, {200,
0.004747877678643866}, {201, 0.004799248237989198}, {202,
0.004851132677644647}, {203, 0.004903535651326949}, {204,
0.004956461849524496}, {205, 0.005009915999731568}, {206,
0.0050639028666832585}, {207, 0.00511842725259155}, {208,
0.005173493997382378}, {209, 0.005229107978933174}, {210,
0.005285274113311835}, {211, 0.0053419973550162415}, {212,
0.005399282697215}, {213, 0.005457135171988643}, {214,
0.005515559850572118}, {215, 0.005574561843598148}, {216,
0.005634146301341157}, {217, 0.005694318413962399}, {218,
0.005755083411756057}, {219, 0.005816446565395737}, {220,
0.005878413186182366}, {221, 0.005940988626292675}, {222,
0.006004178279028781}, {223, 0.006067987579068253}, {224,
0.006132422002715424}, {225, 0.006197487068153544}, {226,
0.0062631883356974215}, {227, 0.0063295314080473175}, {228,
0.006396521930543493}, {229, 0.006464165591421823}, {230,
0.0065324681220697645}, {231, 0.006601435297283679}, {232,
0.0066710729355268766}, {233, 0.006741386899188109}, {234,
0.006812383094841357}, {235, 0.006884067473506359}, {236,
0.006956446030909557}, {237, 0.0070295248077463105}, {238,
0.007103309889943699}, {239, 0.007177807408924284}, {240,
0.007253023541870322}, {241, 0.00732896451198904}, {242,
0.007405636588778868}, {243, 0.007483046088295842}, {244,
0.007561199373421404}, {245, 0.007640102854130552}, {246,
0.00771976298776116}, {247, 0.007800186279283448}, {248,
0.007881379281570702}, {249, 0.00796334859567074}, {250,
0.008046100871077617}, {251, 0.008129642806004526}, {252,
0.008213981147657484}, {253, 0.008299122692509127}, {254,
0.008385074286573856}, {255, 0.00847184282568334}, {256,
0.008559435255762958}, {257, 0.008647858573108479}, {258,
0.00873711982466387}, {259, 0.008827226108299748}, {260,
0.008918184573092054}, {261, 0.009010002419601842}, {262,
0.009102686900155536}, {263, 0.009196245319125947}, {264,
0.009290685033213592}, {265, 0.009386013451728938}, {266,
0.009482238036875396}, {267, 0.009579366304032336}, {268,
0.009677405822039234}, {269, 0.009776364213480477}, {270,
0.00987624915497003}, {271, 0.009977068377437481}, {272,
0.010078829666414185}, {273, 0.01018154086232032}, {274,
0.010285209860751873}, {275, 0.01038984461276873}, {276,
0.010495453125183327}, {277, 0.010602043460849177}, {278,
0.01070962373895057}, {279, 0.010818202135292367}, {280,
0.010927786882590718}, {281, 0.011038386270763546}, {282,
0.011150008647222046}, {283, 0.011262662417162849}, {284,
0.011376356043859661}, {285, 0.011491098048956299}, {286,
0.011606897012759966}, {287, 0.011723761574534348}, {288,
0.011841700432793744}, {289, 0.01196072234559729}, {290,
0.012080836130843974}, {291, 0.012202050666567049}, {292,
0.012324374891229834}, {293, 0.012447817804021589}, {294,
0.012572388465153093}, {295, 0.012698095996153359}, {296,
0.012824949580166162}, {297, 0.012952958462247309}, {298,
0.013082131949661404}, {299, 0.013212479412179602}, {300,
0.013344010282377671}, {301, 0.013476734055933404}, {302,
0.013610660291925196}, {303, 0.013745798613130704}, {304,
0.013882158706325}, {305, 0.014019750322579735}, {306,
0.014158583277561974}, {307, 0.014298667451833773}, {308,
0.014440012791150826}, {309, 0.014582629306762357}, {310,
0.014726527075710786}, {311, 0.014871716241130982}, {312,
0.015018207012550263}, {313, 0.015166009666188132}, {314,
0.015315134545256497}, {315, 0.015465592060258998}, {316,
0.015617392689291267}, {317, 0.01577054697834125}, {318,
0.015925065541588505}, {319, 0.016080959061704408}, {320,
0.0162382382901525}, {321, 0.016396914047487634}, {322,
0.016556997223656153}, {323, 0.01671849877829549}, {324,
0.016881429741034147}, {325, 0.01704580121179054}, {326,
0.0172116243610727}, {327, 0.017378910430277787}, {328,
0.017547670731990553}, {329, 0.017717916650282543}, {330,
0.017889659641010587}, {331, 0.0180629112321157}, {332,
0.018237683023920606}, {333, 0.018413986689427973}, {334,
0.018591833974618544}, {335, 0.018771236698747706}, {336,
0.018952206754643035}, {337, 0.019134756109001325}, {338,
0.019318896802684327}, {339, 0.019504640951015263}, {340,
0.019692000744074168}, {341, 0.019880988446993795}, {342,
0.02007161640025356}, {343, 0.020263897019974386}, {344,
0.020457842798212975}, {345, 0.02065346630325453}, {346,
0.020850780179906252}, {347, 0.02104979714978936}, {348,
0.02125053001163171}, {349, 0.02145299164155823}, {350,
0.021657194993382013}, {351, 0.021863153098895007}, {352,
0.022070879068156565}, {353, 0.02228038608978288}, {354,
0.022491687431235516}, {355, 0.022704796439108162}, {356,
0.02291972653941392}, {357, 0.023136491237871225}, {358,
0.023355104120189686}, {359, 0.023575578852353802}, {360,
0.023797929180907142}, {361, 0.024022168933235784}, {362,
0.024248312017849494}, {363, 0.02447637242466357}, {364,
0.02470636422527892}, {365, 0.024938301573262146}, {366,
0.025172198704423065}, {367, 0.025408069937092777}, {368,
0.02564592967240078}, {369, 0.025885792394549478}, {370,
0.026127672671089334}, {371, 0.026371585153192822}, {372,
0.026617544575925807}, {373, 0.026865565758519682}, {374,
0.027115663604641082}, {375, 0.027367853102661762}, {376,
0.027622149325925435}, {377, 0.02787856743301484}, {378,
0.028137122668018034}, {379, 0.028397830360791194}, {380,
0.028660705927222665}, {381, 0.028925764869493876}, {382,
0.029193022776340737}, {383, 0.029462495323311303}, {384,
0.02973419827302401}, {385, 0.030008147475424414}, {386,
0.030284358868038947}, {387, 0.030562848476228626}, {388,
0.030843632413441655}, {389, 0.031126726881462462}, {390,
0.03141214817066134}, {391, 0.031699912660241324}, {392,
0.03199003681848457}, {393, 0.03228253720299529}, {394,
0.03257743046094255}, {395, 0.03287473332930184}, {396,
0.033174462635092716}, {397, 0.03347663529561699}, {398,
0.03378126831869377}, {399, 0.03408837880289416}, {400,
0.03439798393777186}}


## SQL Server – Cardinality Estimation outside the histogram

I'm using the 2010 version of the Stack Overflow database on SQL Server 2017 with the new CE (compatibility level 140), and I created this stored procedure:

USE StackOverflow2010;
GO

CREATE OR ALTER PROCEDURE #sp_PostsByCommentCount
@CommentCount int
AS
BEGIN
SELECT *
FROM dbo.Posts p
WHERE
p.CommentCount = @CommentCount
OPTION (RECOMPILE);
END;
GO


There are no non-clustered indexes or statistics on the dbo.Posts table (there is a clustered index on Id).

When I ask for an estimated plan, the estimated number of rows coming out of dbo.Posts is 1,934.99:

EXEC #sp_PostsByCommentCount @CommentCount = 51;


The following statistics object was created automatically when I asked for the estimated plan:

DBCC SHOW_STATISTICS('dbo.Posts', _WA_Sys_00000006_0519C6AF);


The highlights are:

• The statistics were sampled at a fairly low rate of 1.81% (67,796 / 3,744,192)
• Only 31 histogram steps were used
• The "All density" value is 0.03030303 (33 distinct values were sampled)
• The last RANGE_HI_KEY in the histogram is 50, with EQ_ROWS of 1

If I pass a value above 50 (up to and including 2,147,483,647), the row estimate is 1,934.99. What calculation or value is used to produce this estimate? Incidentally, the legacy cardinality estimator produces an estimate of 1 row.

Here are some theories I had, things I tried, and additional information I was able to dig up along the way.

### Density vector

At first I thought it was the density vector, as if I had used OPTION (OPTIMIZE FOR UNKNOWN). However, the density vector for this statistics object would give 3,744,192 × 0.03030303 ≈ 113,460 rows, so that's not it.

### Extended Events

I ran an Extended Events session capturing the query_optimizer_estimate_cardinality event (which I learned about from Paul White's blog post Cardinality Estimation: Combining Density Statistics), and its output contained the following interesting tidbit:

So it seems the CSelCalcAscendingKeyFilter calculator was used (the other one reported that it failed, whatever that means). This column is not a key, not unique, and not necessarily ascending, but whatever.

When I googled that term, I found some blog posts:

These posts indicate that, in the new CE, these out-of-histogram estimates are based on a combination of the density vector and the statistics modification counter. Unfortunately, I have already ruled out the density vector (I think!), and the modification counter is zero (per sys.dm_db_stats_properties, anyway).

### Trace flags

Plan for computation:

CSelCalcAscendingKeyFilter(avg. freq., QCOL: (p).CommentCount)

Selectivity: 0.000516798


This is a breakthrough (thanks, Forrest!): that 0.000516798 number (which appears to be the unrounded version of the Selectivity="0.001" attribute in the XE output above), multiplied by the number of rows in the table, is exactly the estimate I was looking for (1,934.99).

I'm probably missing something obvious, but I have not been able to reverse-engineer how this selectivity value is produced inside the CSelCalcAscendingKeyFilter calculator.
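A quick arithmetic check of the numbers quoted above (my own sanity check): the trace-flag selectivity times the table cardinality reproduces the 1,934.99 estimate, while the density-vector path would have produced roughly 113,460 rows.

```python
rows = 3_744_192                   # rows in dbo.Posts
selectivity = 0.000516798          # from the trace-flag output
density = 0.03030303               # "All density" from DBCC SHOW_STATISTICS

print(round(rows * selectivity, 2))  # 1934.99 -> matches the observed estimate
print(round(rows * density))         # 113460  -> the ruled-out density-vector estimate
```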

## Reference request – modulus estimation for intersecting circular annuli (quasi-additivity)

In general, for an annulus $$A \subset \mathbb{C}$$: if $$A_{1}, A_{2}, \dots \subset A$$ are pairwise disjoint annuli, then we have
$$\operatorname{mod}(A) = \frac{1}{2\pi} \iint_{A} \frac{1}{|z|^{2}}\,dx\,dy > \frac{1}{2\pi} \iint_{\cup A_{i}} \frac{1}{|z|^{2}}\,dx\,dy \geq \sum \operatorname{mod}(A_{i}).$$
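As a sanity check on the normalization (my computation, not part of the original question): for the round annulus $$A = \{ r < |z| < R \}$$, switching to polar coordinates gives

$$\frac{1}{2\pi} \iint_{A} \frac{dx\,dy}{|z|^{2}} = \frac{1}{2\pi} \int_{0}^{2\pi} \int_{r}^{R} \frac{\rho\,d\rho\,d\theta}{\rho^{2}} = \log\frac{R}{r},$$

which matches the displayed formula under the convention $$\operatorname{mod}(\{ r < |z| < R \}) = \log\frac{R}{r}$$ (some authors include an extra factor of $$\frac{1}{2\pi}$$ in the modulus).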

In my case I have

• shrinking annuli $$A_{i}$$
• that may intersect in some places: $$Q_{i,j} := A_{i} \cap A_{j} \neq \varnothing$$
• with (hopefully) controlled diameters $$\operatorname{diam}(Q_{i,j}) \leq d_{i,j}$$ tending to zero as the annuli shrink
• and a sum $$\sum_{i=1}^{N} \operatorname{mod}(A_{i}) > c_{0} N$$ that grows linearly.

Let $$U_{N}$$ be an annulus containing $$\bigcup_{i=1}^{N} A_{i}$$.

Q: I am hoping for a growth estimate for $$\operatorname{mod}(U_{N})$$, e.g. $$\operatorname{mod}(U_{N}) \geq c_{1} \sum_{i=1}^{N} \operatorname{mod}(A_{i}).$$

Thanks for all suggestions.

An interesting paper in this direction is "The Quasi-Additive Law in Conformal Geometry". I will state its result for the interested reader in full generality.

Let $$S$$ be a compact Riemann surface with boundary, and for a compact subset $$K \subset S$$ let $$W(S, K) = \frac{1}{L(S, K)}$$ be the extremal width (the reciprocal of the extremal length) of the family of paths connecting $$\partial S$$ to $$K$$ inside $$S \setminus K$$.

Consider open sets $$A_{i} \subset S$$ for $$i = 1, \dots, N$$ whose closures are finite Riemann surfaces (not necessarily connected) with smooth boundary.

Let $$X := W(S, \cup_{i=1}^{N} A_{i})$$, $$Y := \sum_{i=1}^{N} W(S, A_{i})$$ and $$Z := \sum_{i=1}^{N} W(S \setminus \cup_{j \neq i} A_{j}, A_{i})$$.

If $$Y < \xi Z$$ for some $$\xi \geq 1$$, then there is a constant $$K$$, depending on $$\xi$$ and on the Betti numbers of $$S \setminus \cup_{j \in M} A_{j}$$ for $$M \subset \{1, \dots, N\}$$, such that
$$Y \geq K \Rightarrow Y \leq 2 \xi X.$$

## Recurrence relation – complexity estimation and induction proof

I tried to prove by induction that
$$T(n) = \begin{cases} 1 & \quad \text{if } n \leq 1 \\ T\left(\lfloor \frac{n}{2} \rfloor\right) + n & \quad \text{if } n > 1 \end{cases}$$
is $$\Omega(n)$$, i.e. that $$\exists c > 0, \exists m \geq 0 \,\,|\,\, T(n) \geq cn \,\,\forall n \geq m.$$

Base case: $$T(1) \geq c \cdot 1 \implies c \leq 1$$

We now assume that $$T(k) = \Omega(k) \implies T(k) \geq ck$$ for all $$k < n$$, and prove that $$T(n) = \Omega(n)$$:
$$T(n) = T\left(\lfloor \tfrac{n}{2} \rfloor\right) + n \geq c \lfloor \tfrac{n}{2} \rfloor + n \geq c \tfrac{n}{2} - 1 + n = n\left(\tfrac{c}{2} - \tfrac{1}{n} + 1\right) \stackrel{?}{\geq} cn \implies c \leq 2 - \tfrac{2}{n}$$
So we have proved that $$T(n) \geq cn$$ in:

1) the base case, for $$c \leq 1$$

2) the inductive step, for $$c \leq 2 - \frac{2}{n}$$

However, we must find one value of $$c$$ that satisfies both for all $$n \geq 1$$. The book suggests the value $$c = 1$$, which does not seem right to me:

$$1 \leq 1 \\ 1 \leq 2 - \frac{2}{n} \implies 1 \leq 0 \quad \text{for } n = 1$$

My guess would be a $$c$$ arbitrarily close to $$0$$, but $$0$$ itself is not an acceptable value. Or should we just say it is $$\Omega(n)$$ only for $$n > 1$$? How can we handle this?
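A numeric sanity check (my own, not from the book): computing $$T(n)$$ directly confirms that $$T(n) \geq 1 \cdot n$$ holds for every $$n \geq 2$$, i.e. the conflict at $$n = 1$$ disappears once we take $$m = 2$$ in the definition of $$\Omega(n)$$, which the definition explicitly allows.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # The recurrence from the question: T(n) = 1 for n <= 1, else T(n//2) + n.
    return 1 if n <= 1 else T(n // 2) + n

# c = 1 works for all n >= 2, even though it fails at n = 1.
assert all(T(n) >= n for n in range(2, 100_000))
print(T(1), T(2), T(1024))  # 1 3 2047
```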
