Poisson Distribution and Conditional Probability

I’m stuck on a problem for my statistics class. The problem says that patients arrive at a hospital at a constant rate of 2 patients per hour, and a doctor works 12 hours, from 6 am to 6 pm. If the doctor has treated 6 patients by 8 am, what is the probability that he has treated 9 patients by 10 am?

So, I was planning to solve this problem using the Poisson distribution and conditional probability. I identified two events: A, that after 2 hours the doctor has treated 6 patients, and B, that after 4 hours the doctor has treated 9 patients.

For A, m = lambda × hours = 2 × 2 = 4, and P(X=6) = (e^-4 · 4^6)/6! ≈ 0.104

For B, m = lambda × hours = 2 × 4 = 8, and P(X=9) = (e^-8 · 8^9)/9! ≈ 0.124

So, given that events are independent in a Poisson process, I tried to calculate P(B|A) as P(B)/P(A), but I get a number greater than 1.

Also, I tried working out the answer over the time interval from 8 am to 10 am, where the number of patients increases by 9 − 6 = 3.

In that scenario, m = lambda × hours = 2 × 2 = 4, and P(X=3) = (e^-4 · 4^3)/3! ≈ 0.195, but I think this must be wrong too, since it does not take event A into account.
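As a sanity check of the arithmetic, here is a quick pmf computation (a sketch using scipy, which is just my choice of library; any Poisson pmf implementation works):

```python
# Quick numeric check of the Poisson pmf values above.
from scipy.stats import poisson

print(poisson.pmf(6, 4))  # P(X=6) with m=4 -> ~0.104
print(poisson.pmf(9, 8))  # P(X=9) with m=8 -> ~0.124
print(poisson.pmf(3, 4))  # P(X=3) with m=4 -> ~0.195
```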

Could someone help me with this?
Thanks!

probability – If $X$ is symmetric at $a$, then $Y = X - a$ is symmetric at $0$

Suppose $X$ has a density function $f$ that is symmetric about $a$. Let $Y = X - a$. Show that the density function $g$ of $Y$ is symmetric about $0$.

Setting $f(x) = g(x-a)$ gives the result through basic algebra, but I’m having an inordinate amount of difficulty justifying that equality to myself. Could someone explain very clearly why (and whether) the equality is allowed?
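In case it helps to see the change of variables written out (with $Y = X - a$, so $g(y) = f(y+a)$, which is the same equality as $f(x) = g(x-a)$):

$$g(-y) = f(a - y) = f(a + y) = g(y),$$

where the middle step is exactly the symmetry of $f$ about $a$, namely $f(a-t) = f(a+t)$ for all $t$.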

probability – Limiting behaviour of shifted random vectors

Let $X_n=(X_{n,1}, \ldots, X_{n,n})$ be a sequence of random vectors of increasing dimension, where $X_{n,1}, \ldots, X_{n,n}$ are independent and identically distributed with absolutely continuous distribution $\mu_n$. Let $Y_n$ be a sequence of random variables depending on $X_n$, which also have absolutely continuous distributions.

Assume that:

(A.1) for a given bounded set $B$ and a constant $c \in B$, $P(Y_n \in B)=1$ and $Y_n \to c$ in probability;

(A.2) for $i=1, \ldots, n$, $X_{n,i} \overset{d}{\to} Z$, where $Z$ is a nondegenerate tight random variable.

Finally, define $\overline{Y}_n = \sum_{j=1}^{k_n} y_{n,j}\,\mathbf{1}(Y_n \in B_{n,j})$, where $(B_{n,j})_{j=1}^{k_n}$ are nested partitions of $B$ that become finer and finer as $n \to \infty$, i.e. $k_n \to \infty$, and $y_{n,j} \in B_{n,j}$, $j=1,\ldots,k_n$.

Question: Let $E_n$ be a sequence of measurable sets in $\mathbb{R}^n$ and, for each $c \in \mathbb{R}$, write $X_n - c = (X_{n,1}-c, \ldots, X_{n,n}-c)$. Is it true that
$$
P(X_n-Y_n \in E_n)-P(X_n-\overline{Y}_n \in E_n) \to 0, \quad \textbf{(E.1)}
$$

as $n \to \infty$?

Comment: In the simplest case where $X_n$ is a sequence of random variables, Slutsky’s lemma would give $X_n - Y_n \overset{d}{\to} Z - c$ and $X_n - \overline{Y}_n \overset{d}{\to} Z - c$; thus I would say that in such a case the result in $\textbf{(E.1)}$ holds true.

How about the increasing-dimension case described above? Would some sort of infinite-dimensional version of Slutsky’s lemma do the job? Would it be true that $X_n \overset{d}{\to} (Z_i)_{i=1}^{\infty}$, where $(Z_i)_{i=1}^{\infty}$ is a countable collection of independent replicates of $Z$ (i.e. $Z_i \overset{d}{=} Z$)? Does Slutsky’s lemma still apply?
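Not an answer, but here is a toy Monte Carlo check of $\textbf{(E.1)}$ under one concrete (and entirely hypothetical) choice of $\mu_n$, $Y_n$, the partitions and $E_n$: standard normal coordinates, $B=[0,1]$, $c=0.5$, $Y_n = c + O_p(n^{-1/2})$ noise clipped to $B$, $\overline{Y}_n$ obtained by rounding $Y_n$ to a grid of $k_n = n$ points, and $E_n = \{x : \#\{i : x_i \le 0\}/n \le 0.7\}$:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate(n, discretize, reps=20_000):
    X = rng.standard_normal((reps, n))          # X_n: iid N(0,1) coordinates
    # Y_n in B=[0,1], converging in probability to c = 0.5
    Y = np.clip(0.5 + rng.standard_normal(reps) / np.sqrt(n), 0.0, 1.0)
    if discretize:
        Y = np.round(Y * n) / n                 # Ybar_n: Y_n rounded to a grid of k_n = n points
    frac = (X - Y[:, None] <= 0.0).mean(axis=1) # fraction of nonpositive coordinates of X_n - Y_n
    return (frac <= 0.7).mean()                 # Monte Carlo estimate of P(X_n - Y_n in E_n)

for n in (10, 100, 1000):
    print(n, estimate(n, False), estimate(n, True))
```

The difference between the two printed probabilities shrinks as $n$ grows in this example, but of course it is one very special case, not a proof.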

Probability of picking items from a list when each item has a probability that indicates “try the next item instead”

In a computer game, there is a list of items and each item has an associated probability in (0, 1) of being chosen. The game looks at the first item and rolls a PRNG value in (0, 1): if the roll is below the item’s probability, that item is selected (the rest are ignored); otherwise the game looks at the second item and repeats the process.

How can I calculate the overall probability for each item of the list given this selection process?

I know that for the first item it’s always the associated probability ($p_1$). For the second, I suppose, it is $(1 - p_1) \cdot p_2$, i.e., the probability that the first item was not chosen times the probability that the second item is chosen. But then, what would be the formula for the third (and subsequent) items?
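If that pattern is right, the general rule is $P(\text{item } i) = p_i \prod_{j<i} (1 - p_j)$. A minimal sketch (the function name is mine):

```python
# P(item i) = p_i * prod_{j<i} (1 - p_j); leftover mass means no item was selected.
def selection_probabilities(ps):
    probs, still_looking = [], 1.0
    for p in ps:
        probs.append(still_looking * p)   # reached item i AND its roll succeeded
        still_looking *= 1.0 - p          # item i failed; move on to the next one
    return probs, still_looking           # per-item probabilities, P(nothing chosen)

probs, none = selection_probabilities([0.5, 0.3, 0.8])
print(probs, none)  # [0.5, 0.15, 0.28] and 0.07 (everything sums to 1)
```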

What is the probability of having forks in PoW given different network size (miners count)?

Given a network of k miners, what is the impact of k on the probability of having forks in Bitcoin or any PoW-based blockchain?

It seems clear to me that a network with fewer miners has a lower probability of forking than a network with a higher number of miners.

Is there any mathematical formula that shows this relation?
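One common toy model (not a canonical formula; all parameters below are hypothetical): blocks arrive as a Poisson process with rate $\lambda = 1/T$, where $T$ is the target block interval, hash power is split evenly over the $k$ miners, and a fork happens when some other miner finds a competing block within the propagation delay $\tau$ after the first block is found. Then $P(\text{fork}) \approx 1 - e^{-\lambda \tau (k-1)/k}$, which increases with $k$ but saturates at $1 - e^{-\tau/T}$:

```python
import math

def fork_probability(k, tau=10.0, block_interval=600.0):
    """Toy model: P(a competing block is found within propagation delay tau),
    assuming hash power is split evenly over k miners."""
    lam = 1.0 / block_interval          # network-wide block rate (blocks/second)
    competing_rate = lam * (k - 1) / k  # combined rate of everyone except the finder
    return 1.0 - math.exp(-competing_rate * tau)

for k in (1, 2, 10, 100, 10_000):
    print(k, fork_probability(k))       # 0 for k=1, rising toward 1 - exp(-tau/T) ~ 0.0165
```

This matches the intuition in the question: a single miner can never fork against itself, and adding miners raises the fork probability, with diminishing effect once hash power is already widely spread.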

probability – Expected number of throws in a custom dice game: difference between manual computation and simulation

There is a difference between the expected number of throws I computed by hand and the one I got by simulation. The difference is about 1/4; my question is, which one is wrong?

Clarification of the rules of this game:
We have a $6$-sided die with colours Red, Green, Blue, Cyan, Magenta and Yellow. Based on the throw and the eligible paths, you move a pawn.
The starting state of the game is position $0$. When you throw the right colour you move your pawn to the next state; otherwise the pawn remains in the same position. The eligible paths are (with the numbers denoting the column):

  • 0 -> 1R, 1B
  • 1B -> 2C, 2G, 2Y
  • 1R -> 2G, 2M, 2Y
  • 2C -> 3B, 3M
  • 2G -> 3B, 3R
  • 2M -> 3R
  • 2Y -> 3B, 3M
  • 3B -> 4C, 4G, 4M, 4Y
  • 3R -> 4M, 4R, 4Y
  • 3M -> 4B, 4C, 4G, 4Y

My interest:
$E(X_i)$, with $X_i$ the number of throws needed to reach column $i$. $E(X_0) = 0$, since it is the starting state.

My computation by hand:
Let $Z_j$ be the number of throws needed to move from state $j$ to one of its eligible next states; then
$$
\begin{align*}
E(X_1) &= E(Z_0) + E(X_0) = 3, \text{ since the probability of throwing one of the two colours is } 2/6,\\
E(X_2) &= (E(Z_{1R}) + E(Z_{1B}))/2 + E(X_1) = 2 + 3, \text{ assuming independence between throws,}\\
E(X_3) &= (E(Z_{2C}) + E(Z_{2G}) + E(Z_{2M}) + E(Z_{2Y}))/4 + E(X_2) = (3 + 3 + 6 + 3)/4 + 5,\\
E(X_4) &= \dots = 1\tfrac{2}{3} + 8\tfrac{3}{4} = 10\tfrac{5}{12}.
\end{align*}
$$

The difference starts at $E(X_3)$: in my simulation I get about $8.5$, a difference of about $0.25$, and when I compute $E(X_4)$ the difference stays about the same.
So I suspect the manual computation of the expected number of throws is wrong. What do you think?
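For comparison, here is a small Monte Carlo sketch of my reading of the rules (the move table below is my transcription of the list above; treat it as an assumption):

```python
import random

# Each state maps a colour to the next state; reaching column 4 ends the game.
MOVES = {
    "0":  {"R": "1R", "B": "1B"},
    "1B": {"C": "2C", "G": "2G", "Y": "2Y"},
    "1R": {"G": "2G", "M": "2M", "Y": "2Y"},
    "2C": {"B": "3B", "M": "3M"},
    "2G": {"B": "3B", "R": "3R"},
    "2M": {"R": "3R"},
    "2Y": {"B": "3B", "M": "3M"},
    "3B": {"C": "4C", "G": "4G", "M": "4M", "Y": "4Y"},
    "3R": {"M": "4M", "R": "4R", "Y": "4Y"},
    "3M": {"B": "4B", "C": "4C", "G": "4G", "Y": "4Y"},
}
COLOURS = "RGBCMY"

def one_game():
    state, throws, arrivals = "0", 0, []
    while state in MOVES:                  # stop once column 4 is reached
        throws += 1
        colour = random.choice(COLOURS)
        if colour in MOVES[state]:
            state = MOVES[state][colour]
            arrivals.append(throws)        # throw count on entering each column
    return arrivals

N = 200_000
totals = [0.0, 0.0, 0.0, 0.0]
for _ in range(N):
    for i, t in enumerate(one_game()):
        totals[i] += t
print([t / N for t in totals])  # roughly [3.0, 5.0, 8.5, 10.17] for E(X_1)..E(X_4)
```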

number theory – A question about the probability of being prime

If we choose a random number $a \leq N$, then the probability that $a$ is prime is approximately $\frac{1}{\log N}$.

Now, if we know that certain primes do not divide $a$, what is the probability that $a$ is prime?

Example: if $a \leq 100000$ and none of $\{2, 3, 5, 7\}$ divides $a$, what is the probability that $a$ is prime?
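A brute-force empirical check of exactly this example (a sketch; the trial-division primality test is just for illustration):

```python
# Fraction of primes among a <= 100000 with none of 2, 3, 5, 7 dividing a.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

N = 100_000
candidates = [a for a in range(2, N + 1) if all(a % p for p in (2, 3, 5, 7))]
prime_count = sum(is_prime(a) for a in candidates)
print(prime_count / len(candidates))  # ~0.42; the heuristic (1/log N)*2*(3/2)*(5/4)*(7/6) gives ~0.38
```

The heuristic factor $\prod_p \frac{p}{p-1}$ arises because excluding multiples of a prime $p$ thins the candidate pool by $\frac{p-1}{p}$ while keeping (almost) all primes.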

Channel coding and error probability: where do these probabilities come from?

Where do the following probabilities come from?

We consider a BSC$_\varepsilon$ with $\varepsilon = 0.1$ and the block code $C = \{c_1, c_2\}$ with codewords $c_1 = 010$ and $c_2 = 101$. On a received word $y$ we use the decoder $D = \{D_1, D_2\}$, which decodes $y$ to the codeword with the lowest Hamming distance to $y$. Determine $D_1$ and $D_2$ and the global error probability $ERROR(D)$, assuming the codewords are equally likely.
Hint: For an output $y$ there is exactly one $x$ that leads to a decoding failure. ($y = 100$ is decoded incorrectly only if the sent message was $x = c_1 = 010$.) So the term $(1 - p(D(y)\,|\,y))$ equals $\Pr(X = x \,|\, Y = y)$ for a suitable $x$.

Now:

$$\text{Hamming distance:} \qquad
\begin{array}{c|cc}
y & 010 & 101 \\ \hline
000 & 1 & 2 \\
001 & 2 & 1 \\
010 & 0 & 3 \\
011 & 1 & 2 \\
100 & 2 & 1 \\
101 & 3 & 0 \\
110 & 1 & 2 \\
111 & 2 & 1
\end{array}$$

$$D_1 = \{000, 010, 011, 110\} \quad (\text{decides for } 010)$$
$$D_2 = \{001, 100, 101, 111\} \quad (\text{decides for } 101)$$

$$\begin{aligned}
ERROR(D) &= \sum_{y \in \Sigma_A^3} p(y)\,(1 - p(D(y) \,|\, y)) \\
&= \overbrace{2 \cdot p(y)(1 - p(D(y) \,|\, y))}^{y \in \{010,\, 101\}} + \overbrace{6 \cdot p(y)(1 - p(D(y) \,|\, y))}^{\text{remaining } y} \\
&= 2 \cdot \left(\tfrac{729}{2000} + \tfrac{1}{2000}\right)\left(\tfrac{7}{250}\right) + 6 \cdot \left(\tfrac{81}{2000} + \tfrac{9}{2000}\right)\left(\tfrac{757}{1000}\right)
\end{aligned}$$
How do I get to the probabilities $\frac{7}{250}$ and $\frac{757}{1000}$?

I don’t follow this calculation; it is supposed to be correct, but I don’t see how to arrive at these probabilities.
Could someone explain this to me?
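For reference, a brute-force enumeration of $ERROR(D)$ (equal priors, minimum-distance decoding) may help locate where those fractions come from; the total comes out to $0.028 = 7/250$:

```python
from itertools import product

eps = 0.1
codewords = ["010", "101"]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def p_y_given_x(y, x):
    d = hamming(y, x)
    return eps**d * (1 - eps)**(3 - d)   # BSC: d flipped bits, 3 - d clean bits

error = 0.0
for y in ("".join(bits) for bits in product("01", repeat=3)):
    decoded = min(codewords, key=lambda c: hamming(y, c))  # minimum-distance decoding
    for x in codewords:
        if x != decoded:                 # decoding fails exactly when this x was sent
            error += 0.5 * p_y_given_x(y, x)   # prior 1/2 per codeword
print(error)  # 0.028 = 7/250
```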

probability – Unknown expected value and variance in a problem on the reproductive property of the normal distribution

I am trying to solve the following problem:
When the random variables $X$ and $Y$ are mutually independent and follow the same normal distribution $N(\mu, \sigma^2)$, where $\mu$ is the expected value and $\sigma^2$ is the variance, find the probability distribution of $U = 2X + 3Y$.
This is the problem. I would think that the answer develops as follows:

$X \sim N(\mu, \sigma^2)$

$Y \sim N(\mu, \sigma^2)$

$2X + 3Y \sim N(5\mu, 13\sigma^2)$

From here on I would need the values of the expected value and the variance, but I do not have them. Is the problem impossible without any data, or is there another way to find both values? Thank you very much in advance for your help.
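For reference, the rule being applied needs no numeric values: for independent $X$ and $Y$,

$$E[2X+3Y] = 2\mu + 3\mu = 5\mu, \qquad \operatorname{Var}(2X+3Y) = 2^2\sigma^2 + 3^2\sigma^2 = 13\sigma^2,$$

so the answer can simply be stated in terms of the given symbols $\mu$ and $\sigma^2$.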

probability – How to make MathNet.Numerics work with Unity?

I have installed the latest version of MathNet.Numerics in Visual Studio using the NuGet Package Manager. I’ve also copied MathNet.Numerics.dll to a Plugins folder in the Unity editor. Yet it still gives a compiler error: error CS0234: The type or namespace name 'Distributions' does not exist in the namespace 'MathNet.Numerics' (are you missing an assembly reference?)

Does anyone know how to solve this, or know of a similar package that works with Unity?