Probability – The Riddle "Knight on an Infinite Chessboard"

I have a question:

Suppose a knight takes a "random walk" on an infinite chessboard. Specifically, in each round the knight moves according to the standard chess rules, choosing each of its eight available squares with probability 1/8.

What is the probability that the knight returns to its starting square? How many moves are to be expected?

Here is a simulation: http://varianceexplained.org/r/knight-chess/ It shows that the knight only ever returns after an even number of moves, never an odd one (each move changes the color of the knight's square, so any return takes an even number of moves).

It is just a simple random walk on an infinite graph, so it is recurrent. Let $X_n$ denote the position of the knight after $n$ steps. The question is
$$\mathbb{P}_0(\tau_0 < \infty) = ? \quad \text{and} \quad \mathbb{E}_0[\tau_0] = ?$$
where $\tau_0 := \inf\{n \geq 1 : X_n = 0\}$ and $0$ is the starting square.

Is there an exact way to obtain the result?
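For intuition, here is a minimal Monte Carlo sketch in Python (rather than the R of the linked post). It estimates the probability of returning to the origin within a finite horizon, which only lower-bounds $\mathbb{P}_0(\tau_0 < \infty)$; the trial count and horizon are arbitrary choices of this sketch, not part of the question.

```python
import random

# The eight knight moves; on an infinite board all eight are always available.
MOVES = [(1, 2), (2, 1), (2, -1), (1, -2),
         (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def returns_within(horizon: int) -> bool:
    """Run one walk from the origin; report whether it revisits (0, 0)."""
    x = y = 0
    for _ in range(horizon):
        dx, dy = random.choice(MOVES)
        x, y = x + dx, y + dy
        if x == 0 and y == 0:
            return True
    return False

trials, horizon = 5_000, 2_000
hits = sum(returns_within(horizon) for _ in range(trials))
print(f"P(return within {horizon} moves) ~ {hits / trials:.3f}")
```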

Probability – MGF from a CDF

This problem seemed strange to me because the CDF is not the usual continuous type but piecewise constant, so I am not sure whether the procedure is the same.

Given the CDF

$$F_X(x) = \begin{cases} 0 & x < 0 \\ \frac{1}{3} & 0 \leq x < 2 \\ 1 & x \geq 2 \end{cases}$$

find the MGF of $X$, and find the expectation and variance of $X$ by differentiating the MGF.

My attempt:

$$f(x) = \begin{cases} \frac{1}{3} & x = 0 \\ \frac{2}{3} & x = 2 \end{cases}$$

So the MGF is $$\phi_X(t) = E(e^{tX}) = \sum_x e^{tx} f(x) = \frac{1}{3}e^{t \cdot 0} + \frac{2}{3}e^{2t} = \frac{1}{3} + \frac{2}{3}e^{2t}.$$

$$\phi_X'(0) = \frac{2 \cdot 2}{3}e^{2 \cdot 0} = \frac{4}{3}$$

$$\phi_X''(0) - \left(\frac{4}{3}\right)^2 = \frac{8}{3}e^{2 \cdot 0} - \frac{16}{9} = \frac{8}{3} - \frac{16}{9} = \frac{24 - 16}{9} = \frac{8}{9}$$
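As a sanity check, here is a short symbolic verification of these derivatives (a sketch assuming SymPy is available; the two-point distribution is the one read off the CDF above):

```python
import sympy as sp

t = sp.symbols('t')
# MGF of the two-point distribution P(X = 0) = 1/3, P(X = 2) = 2/3.
phi = sp.Rational(1, 3) + sp.Rational(2, 3) * sp.exp(2 * t)

mean = sp.diff(phi, t).subs(t, 0)              # phi'(0)  = E[X]
second_moment = sp.diff(phi, t, 2).subs(t, 0)  # phi''(0) = E[X^2]
variance = second_moment - mean**2

print(mean, variance)  # prints 4/3 8/9
```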

Probability and marbles – Mathematics Stack Exchange

My brother brings a certain number of his marbles to my room to play. Every marble is distinct. He has 8 marbles in total, each either red or blue. One day I discovered two red marbles in my room. If two of the marbles he brings to my room are selected at random, the probability that both are red is 1/2. How many marbles does he bring to my room?

I tried to do this:

Let $x$ = number of red marbles.

Then $x/8$ = chance of picking a red marble,

and $(x-1)/(8-1)$ = chance that the second marble picked is also red.

So $(x/8) \cdot (x-1)/7 = 1/2$, but solving this gives a non-integer $x$, which is impossible.
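If the two marbles are drawn from the $n$ marbles actually in the room (which is how I read the problem) rather than from all 8, the denominators should involve $n$. Here is a small brute-force sketch over both the number brought $n$ and the number of reds $r$ among them (the search bounds are assumptions of the sketch):

```python
from math import comb

# n = marbles brought to the room, r = reds among them; at most 8 marbles exist.
for n in range(2, 9):
    for r in range(2, n + 1):
        # P(both drawn marbles are red) = C(r, 2) / C(n, 2); test whether it is 1/2.
        if 2 * comb(r, 2) == comb(n, 2):
            print(f"n = {n} marbles brought, r = {r} of them red")
```

Under this reading the search reports only $n = 4$, $r = 3$: indeed $\binom{3}{2}/\binom{4}{2} = 3/6 = 1/2$.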

Probability – independence in a random orthogonal matrix

Let $U \in \mathbb{R}^{n \times n}$ be a uniformly random orthogonal matrix, and let $L \in \mathbb{R}^{m \times m}$ be the submatrix in the upper-left corner of $U$. Write
$$L = (a_1, a_2, \ldots, a_m) = \begin{pmatrix} b_1^T \\ \vdots \\ b_m^T \end{pmatrix},$$

where the $a_j$ are the column vectors and the $b_i^T$ are the row vectors of $L$. Define
$$A = \begin{pmatrix} \|a_1\|^2 \\ \vdots \\ \|a_m\|^2 \end{pmatrix}, \quad B = \begin{pmatrix} \|b_1\|^2 \\ \vdots \\ \|b_m\|^2 \end{pmatrix}, \quad C = \begin{pmatrix} \frac{b_1^T}{\|b_1\|} \\ \vdots \\ \frac{b_m^T}{\|b_m\|} \end{pmatrix},$$

where $\|\cdot\|$ is the Euclidean norm. It can be shown that $\frac{b_i}{\|b_i\|}$ is independent of $\|b_i\|$, and it seems obvious that there are more such independence relationships. The questions are:

Are $A$ and $C$ independent? Are $B$ and $C$ independent?
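Not an answer, but here is a small numerical probe (a sketch; the QR-based Haar sampler and the choice $n = 8$, $m = 3$ are assumptions of the sketch). It estimates the correlation between one coordinate of $B$ and one coordinate of $C$; near-zero correlation is consistent with, but of course does not prove, independence:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_orthogonal(n: int) -> np.ndarray:
    """Haar-distributed orthogonal matrix via QR of a Gaussian matrix."""
    Z = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(Z)
    return Q * np.sign(np.diag(R))  # column sign fix so Q is Haar-distributed

n, m, trials = 8, 3, 5_000
b_norm_sq, c_entry = np.empty(trials), np.empty(trials)
for k in range(trials):
    L = haar_orthogonal(n)[:m, :m]
    norms = np.linalg.norm(L, axis=1)   # row norms ||b_i||
    b_norm_sq[k] = norms[0] ** 2        # one coordinate of B
    c_entry[k] = L[0, 0] / norms[0]     # one coordinate of C

print(np.corrcoef(b_norm_sq, c_entry)[0, 1])
```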

Probability Theory – Almost sure convergence of a sequence

Let $X_1, X_2, \ldots$ be a sequence of random variables on a probability space such that $E(X_n) < 2^{-n}$. Show that $X_n \rightarrow 0$ almost surely as $n \to \infty$.

This is a problem I am not so sure about. I know many convergence theorems, but I am not sure which one to use here (if any). I thought of Markov's inequality, since we are given a bound on the expectation, but that alone would not help me show the almost sure part.

I am currently practicing for my exams, so I would really appreciate it if someone could help me solve the problem. Unfortunately, no solution is provided for this exercise.
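For what it's worth, one standard route (assuming, as such exercises usually do, that the $X_n$ are nonnegative) combines exactly the Markov idea above with the Borel–Cantelli lemma: for any $\epsilon > 0$,
$$\sum_{n=1}^{\infty} P(X_n > \epsilon) \;\le\; \sum_{n=1}^{\infty} \frac{E(X_n)}{\epsilon} \;<\; \frac{1}{\epsilon}\sum_{n=1}^{\infty} 2^{-n} \;=\; \frac{1}{\epsilon} \;<\; \infty,$$
so by Borel–Cantelli, almost surely $X_n > \epsilon$ happens only finitely often; applying this along $\epsilon = 1/k$ for $k \in \mathbb{N}$ gives $X_n \to 0$ almost surely.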

Probability – Explain $E = \frac{1}{2}\left(E + \frac{2}{3}\right)$ – Mathematics Stack Exchange

Question: Roll a fair die until the game ends. The game ends when you roll a 4, 5, or 6. For every 1, 2, or 3, your score increases by 1. If the game ends with a 4 or 5, you receive the accumulated score. If the game ends with a 6, you receive nothing. What is the expected payout of this game?

I came across this question in this post.
The answer is $\frac{2}{3}$.

While reading André Nicolas's solution in that post, there is one step I do not understand.

Let $E$ be the expected score. We condition on the result of the first throw. If the throw is a $4$, $5$, or $6$, then the amount we receive, and with it the conditional expectation, is $0$.

Given that the first throw is a $1$, $2$, or $3$, our conditional expectation is not $E + 1$, because that extra $1$ is not a sure thing: with probability $\frac{1}{3}$ it will never be credited to you, since the game ends with a $6$. It follows that
$$E = \frac{1}{2}\left(E + \frac{2}{3}\right).$$
That gives $E = \frac{2}{3}$.

I do not understand why we should get that $\frac{2}{3}$ in the equation
$$E = \frac{1}{2}\left(E + \frac{2}{3}\right).$$

Is it the law of total expectation?
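As a quick empirical cross-check of the claimed value (a simulation sketch; the trial count is arbitrary):

```python
import random

def play() -> int:
    """One game: count 1-3 rolls; payout on 4/5 is the score, on 6 nothing."""
    score = 0
    while True:
        roll = random.randint(1, 6)
        if roll <= 3:
            score += 1          # 1, 2, 3: score increases, keep rolling
        elif roll <= 5:
            return score        # 4 or 5: game ends, score is paid
        else:
            return 0            # 6: game ends, nothing is paid

n = 200_000
print(sum(play() for _ in range(n)) / n)  # should be close to 2/3
```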

Probability – distinguishing between urn probability models

I have a question about the urn model. Suppose I have an urn:

Model A: The urn has 100 balls in it, of which 70 are black and 30 are white.

Model B: The urn has 100 balls in it which are indeterminate in color. When a ball is removed, it turns black with probability 0.7 and white with probability 0.3.

I'm only allowed to remove one ball from the urn at a time, and I have to replace each ball after writing down its color.

Is there a statistical test that lets me know what kind of urn it is?

Is it true that I could replace the models with:

Model C: The urn has 1 ball in it which is indeterminate in color. When removed, it turns black with probability 0.7 and white with probability 0.3.

Model D: The urn has 10 balls in it of which 7 are black and 3 are white.

Does this mean that there are an infinite number of different models that would be indistinguishable from statistical sampling?
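Here is a small sketch of why sampling with replacement cannot separate these models: under each of them, every draw is an independent Bernoulli(0.7) "black" event, so any observed sequence has the same distribution. The code below only illustrates the matching frequencies; it is not a proof:

```python
import random
from collections import Counter

def draw_a() -> str:    # Model A: 100 balls, 70 black, 30 white
    return 'B' if random.randrange(100) < 70 else 'W'

def draw_bc() -> str:   # Models B and C: color decided at removal time
    return 'B' if random.random() < 0.7 else 'W'

def draw_d() -> str:    # Model D: 10 balls, 7 black, 3 white
    return 'B' if random.randrange(10) < 7 else 'W'

n = 100_000
for name, draw in [('A', draw_a), ('B/C', draw_bc), ('D', draw_d)]:
    freq = Counter(draw() for _ in range(n))
    print(name, freq['B'] / n)   # every model gives frequency ~ 0.7
```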

Probability Theory – What does "conditional distribution in a stochastic process" mean?

Let $(\Omega, \mathscr{F}, P)$ be a probability space.

Let $X, Y$ be random variables on $\Omega$.

Then we say $Z \sim X \mid Y$ iff (i) $\int_{Y^{-1}(A)} X \, dP = \int_{Y^{-1}(A)} Z \, dP$ for every measurable set $A$, and (ii) $Z$ is $\sigma(Y)$-measurable.

Now let $S : \mathbb{R} \times \Omega \rightarrow \mathbb{R}$ be a stochastic process.

What does $X \mid S$ mean in this setting?

There are numerous papers that say things like "... because $X \mid S \sim S$, $P(X \in A \mid S) = S(A)$ ...".

I think this is NOT really a conditional expectation, but just one way of describing de Finetti's theorem. Isn't it?

So what exactly does $X \mid S$ mean?

Probability – How to control the Wasserstein distance in terms of the characteristic function

Let $\mathcal{P}(\Omega)$ be the set of probability measures supported on a compact subset $\Omega \subset \mathbb{R}^d$. For $\mu \in \mathcal{P}(\Omega)$, denote by $F_\mu$ its characteristic function, i.e.

$$F_\mu(x) \;:=\; \int_{\mathbb{R}^d} e^{i \langle x, y \rangle} \, \mu(dy) \;=\; \int_\Omega e^{i \langle x, y \rangle} \, \mu(dy), \quad \forall x \in \mathbb{R}^d.$$

Let $W(\cdot, \cdot)$ be the Wasserstein distance of order $1$. My question is whether there is a continuous function $c : \mathbb{R}_+ \to \mathbb{R}_+$ with $c(0) = 0$ such that for all $\mu, \nu \in \mathcal{P}(\Omega)$:

$$W(\mu, \nu) \;\le\; c\big(\|F_\mu - F_\nu\|\big), \quad \text{with } \|F_\mu - F_\nu\| := \max_{x \in \mathbb{R}^d} |F_\mu(x) - F_\nu(x)|.$$

Any answers, comments, and references are highly appreciated!
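Not an answer, but to make the two quantities concrete, here is a small numerical sketch in $d = 1$ (the sample measures, the finite grid standing in for the supremum over $\mathbb{R}^d$, and the use of SciPy's `wasserstein_distance` are all choices of this sketch):

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two empirical measures on [0, 1] (so d = 1 and Omega = [0, 1] is compact).
rng = np.random.default_rng(1)
xs, ys = rng.uniform(size=50), rng.uniform(size=50)

w1 = wasserstein_distance(xs, ys)  # order-1 Wasserstein distance

def cf(points: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Empirical characteristic function evaluated at frequencies t."""
    return np.exp(1j * np.outer(t, points)).mean(axis=1)

t = np.linspace(-200.0, 200.0, 4001)   # finite grid standing in for sup over R^d
cf_gap = np.abs(cf(xs, t) - cf(ys, t)).max()

print(f"W1 = {w1:.4f},  sup|F_mu - F_nu| on grid = {cf_gap:.4f}")
```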