Discuss the following:
The Chebyshev inequality can also be stated in the following way:
Q: For any random variable X with mean μ and variance σ², the probability that X belongs to the interval [μ − k, μ + k] is at least 1 − σ²/k². Equivalently,
P(|X − μ| ≥ k) ≤ σ²/k²
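As a quick illustration, here is a minimal Python sketch that empirically checks this bound for an assumed distribution. The unit-rate exponential (so μ = 1 and σ² = 1), the sample count, and the k values are illustrative choices, not part of the problem:

```python
import numpy as np

# Empirical check of Chebyshev's bound: P(|X - mu| >= k) <= sigma^2 / k^2.
# Assumed setup (illustrative only): X ~ Exponential(rate = 1), so mu = 1, sigma^2 = 1.
rng = np.random.default_rng(0)
mu, sigma2 = 1.0, 1.0
samples = rng.exponential(scale=1.0, size=1_000_000)

for k in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(samples - mu) >= k)   # observed tail frequency
    bound = sigma2 / k**2                            # Chebyshev upper bound
    print(f"k={k}: P(|X - mu| >= k) ≈ {empirical:.4f} <= bound {bound:.4f}")
```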
Suppose that the random variables X1, X2, ..., Xn form a random sample of size n drawn from some unknown distribution with mean μ and variance σ². The sample mean is then
X̄ = (X1 + X2 + ... + Xn)/n
The mathematical expectation of the sample mean is
E[X̄] = μ
The variance of the sample mean is
Var[X̄] = σ²/n
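Both identities can be sanity-checked by simulation. The sketch below assumes a normal population with μ = 5 and σ² = 4 and a sample size of n = 25; these numbers are illustrative assumptions, not given in the problem:

```python
import numpy as np

# Check E[X̄] = mu and Var[X̄] = sigma^2 / n by simulating many samples of size n.
# Assumed setup (illustrative only): data ~ Normal(mu = 5, sigma^2 = 4), n = 25.
rng = np.random.default_rng(1)
mu, sigma2, n, trials = 5.0, 4.0, 25, 200_000

# Each row is one sample of size n; each row mean is one realization of X̄.
data = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(trials, n))
sample_means = data.mean(axis=1)

print("mean of X̄:", sample_means.mean(), "(theory:", mu, ")")
print("var  of X̄:", sample_means.var(), "(theory:", sigma2 / n, ")")
```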
We now apply the Chebyshev inequality to the sample mean to bound these probabilities; a short numerical sketch follows part (c) below.
a) Show that P(|X̄ − μ| ≥ k) ≤ σ²/(nk²).
b) Show that as the sample size n increases, the probability that X̄ falls more than k units from the mean μ decreases and asymptotically approaches 0.
c) Suppose we know the variance σ² = 4 but μ is unknown, and we have observed the data X1, X2, ..., Xn. How large must the sample size n be to ensure that the sample mean X̄, as an estimate of μ, satisfies
P(|X̄ − μ| ≥ 1) ≤ 0.01?
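As a rough numerical sketch for parts (a)–(c): substituting Var[X̄] = σ²/n into the Chebyshev inequality gives the bound σ²/(nk²), which tends to 0 as n grows (part b); with σ² = 4 and k = 1 (part c), requiring 4/n ≤ 0.01 leads to n ≥ 400. The snippet below simply evaluates that arithmetic:

```python
import math

# Chebyshev bound for the sample mean: P(|X̄ - mu| >= k) <= sigma^2 / (n * k^2).
# Part (c): with sigma^2 = 4 and k = 1, we need 4 / n <= 0.01, i.e. n >= 400.
sigma2, k, target = 4.0, 1.0, 0.01

# Smallest n for which the bound meets the target probability.
n_required = math.ceil(sigma2 / (k**2 * target))
print("required n:", n_required)  # 400

# Part (b): the bound shrinks toward 0 as n grows.
for n in (10, 100, 400, 1000):
    print(f"n={n}: bound = {sigma2 / (n * k**2):.4f}")
```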