The second paragraph simply says that if $\lim_{n\to\infty} P(|X_n - X| \ge \epsilon) = 0$ for every $\epsilon > 0$, then we say $X_n \to X$ in probability. That's just the definition of what "convergence in probability" means.
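To make that definition concrete, here is a minimal Python sketch of my own (not part of the original answer, and the setup is an assumption for illustration): it estimates $P(|\bar X_n - \mu| \ge \epsilon)$ for the mean of $n$ i.i.d. Uniform(0, 1) draws. By the weak law of large numbers this probability shrinks to 0, which is exactly convergence in probability of $\bar X_n$ to the constant $\mu = 1/2$.

```python
import numpy as np

# Illustration (assumed setup, not from the answer): X_bar_n is the mean of
# n i.i.d. Uniform(0, 1) draws, so X_bar_n -> mu = 1/2 in probability by the
# weak law of large numbers, i.e. P(|X_bar_n - mu| >= eps) -> 0 for every eps > 0.
rng = np.random.default_rng(0)
mu, eps, reps = 0.5, 0.05, 2000

for n in (10, 100, 1000, 5000):
    # Monte Carlo estimate of P(|X_bar_n - mu| >= eps) over `reps` replications.
    sample_means = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(sample_means - mu) >= eps)
    print(f"n = {n:5d}   P(|X_bar_n - mu| >= {eps}) ~ {prob:.3f}")
```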
The first paragraph talks about specific sequences $X_i$, namely those which are independent and identically distributed (i.i.d.). You may view these as samples drawn (with replacement) from a fixed population with cumulative distribution function (CDF) $D$.
You are correct that such a sequence of samples will in general not converge in probability. In fact, it never does unless the distribution of the $X_i$ is degenerate, i.e. there is an $x$ with $P(X_i = x) = 1$.
They do, however, converge in distribution, which is a weaker form of convergence, and means that the cumulative distribution function (CDF) of $X_n$ converges pointwise to $D$, i.e. that $\lim_{n\to\infty} P(X_n \le x) = D(x)$ at every point $x$ where $D$ is continuous.
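Again as an illustration of my own rather than part of the answer (the standard normal setup is an assumption): the sketch below draws independent samples and estimates $P(|X_n - X_1| \ge \epsilon)$. For i.i.d. draws this probability does not shrink as $n$ grows, so the sequence cannot converge in probability to $X_1$; at the same time every $X_n$ has the same CDF $D$, so $P(X_n \le x) = D(x)$ for all $n$ and convergence in distribution holds trivially.

```python
import numpy as np

# Illustration (assumed setup, not from the answer): X_1, X_2, ... are i.i.d.
# standard normal. If X_n converged in probability to X_1, then
# P(|X_n - X_1| >= eps) would have to go to 0 -- but for independent draws
# this probability is the same for every n.
rng = np.random.default_rng(1)
eps, reps = 0.1, 100_000

x1 = rng.standard_normal(reps)
for n in (2, 10, 100, 1000):
    xn = rng.standard_normal(reps)          # X_n: independent of X_1, same law
    prob = np.mean(np.abs(xn - x1) >= eps)  # estimate of P(|X_n - X_1| >= eps)
    print(f"n = {n:5d}   P(|X_n - X_1| >= {eps}) ~ {prob:.3f}")

# By contrast, convergence in distribution is immediate here: P(X_n <= x) is the
# standard normal CDF D(x) for every n, so it certainly converges to D(x).
```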
manpreet
Best Answer
2 years ago
I'm studying statistical analysis and there's something fundamental I'm missing about random variables and how they are used in defining convergence in probability or distribution:
In my syllabus (which is in Dutch, so the terms I use might be slightly off), when talking about samples, it says that
A bit further, discussing convergence, it says
What I don't understand is what $X_i$ actually means in these two contexts. I read it as follows: In the first part, it is presented as one choice from the population: $X_i$ is the length of the