User:Martin Hogbin/Conditional probability

Yet another page of original research for me, but rest assured this is only to help me get my thinking straight and explain it to others without repeating myself continually on the talk pages. Anything I add to an article will be based on reliable sources.

 This page is currently under construction and thus much of it can be ignored

=Definition=

What does the term ''conditional probability'' mean? The WP article defines it thus:

 * Conditional probability is the probability of some event A, given the occurrence of some other event B.

Note that no restriction is placed on the nature of event B. According to the above definition, event B could be absolutely anything, including something which might reasonably be expected to have no effect whatever on the probability of event A. On this basis, all probability problems are clearly conditional, with an infinite number of possible conditions. For example: ''I draw a ball from an urn containing an equal number of white and black balls. What is the probability that it is white, given that: I breathe, the sun rose this morning, I am standing, the urn is red, and so on...''.
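The urn example can be sketched numerically. The tiny sample space below, pairing the ball's colour with an irrelevant event, is an illustrative assumption and not part of the problem statement; it shows that conditioning on something unrelated leaves the probability unchanged:

```python
from fractions import Fraction

# Illustrative sample space: each outcome pairs a ball colour with an
# irrelevant event ("the sun rose this morning"); all outcomes equally likely.
sample_space = [(colour, sun) for colour in ("white", "black")
                for sun in (True, False)]
p = Fraction(1, len(sample_space))  # uniform probability of each outcome

def prob(event):
    """P(event) = sum of outcome probabilities over the subset."""
    return sum(p for outcome in sample_space if event(outcome))

def cond_prob(a, b):
    """P(A | B) = P(A and B) / P(B)."""
    return prob(lambda o: a(o) and b(o)) / prob(b)

white = lambda o: o[0] == "white"
sun_rose = lambda o: o[1]

print(prob(white))                 # 1/2
print(cond_prob(white, sun_rose))  # 1/2 -- the irrelevant condition changes nothing
```

Conditioning on any event independent of the draw gives back the unconditional probability, which is the point of the example.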

In my discussions relating to the Monty Hall problem, various restrictions on event B have been suggested. I list these below with my objections to them:

==The event must reduce the sample space==
This seems an obvious and simple suggestion, but it has one weakness: it depends entirely on how you set up the sample space.

In my urn example above we might choose the following events

==The event must be 'part of the experiment'==
But how do we know what is intended to be 'part of the experiment'?

==The event must be mentioned in the problem statement==
This may well be a valid restriction. However, many events that are clearly not intended to be conditions of the problem might be mentioned in the problem statement.

==A and B must be independent==
True, but this gets us nowhere: for A to be independent of B, the probability of occurrence of A must not depend on the occurrence of event B.
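A minimal sketch of what independence means here, under an assumed uniform four-outcome sample space pairing the ball's colour with the urn's colour (both the space and the event names are illustrative assumptions):

```python
from fractions import Fraction

# A is independent of B exactly when P(A and B) = P(A) * P(B),
# equivalently P(A | B) = P(A): conditioning on B cannot move P(A).
outcomes = [("white", "red"), ("white", "blue"),
            ("black", "red"), ("black", "blue")]
p = Fraction(1, len(outcomes))  # uniform probability of each outcome

def prob(event):
    """P(event) = sum of outcome probabilities over the subset."""
    return sum(p for o in outcomes if event(o))

A = lambda o: o[0] == "white"   # the ball drawn is white
B = lambda o: o[1] == "red"     # the urn is red

independent = prob(lambda o: A(o) and B(o)) == prob(A) * prob(B)
print(independent)  # True
```

The check is just the product rule in reverse: if it holds, conditioning on B leaves the probability of A exactly where it was.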

The modern definition starts with a set called the sample space, which relates to the set of all possible outcomes in the classical sense, denoted by $$\Omega=\left \{ x_1,x_2,\dots\right \}$$. It is then assumed that for each element $$x \in \Omega\,$$, an intrinsic "probability" value $$f(x)\,$$ is attached, which satisfies the following properties:
 * 1) $$f(x)\in[0,1]\mbox{ for all }x\in \Omega\,;$$
 * 2) $$\sum_{x\in \Omega} f(x) = 1\,.$$

That is, the probability function f(x) lies between zero and one for every value of x in the sample space Ω, and the sum of f(x) over all values x in the sample space Ω is equal to 1. An event is defined as any subset $$E\,$$ of the sample space $$\Omega\,$$. The probability of the event $$E\,$$ is defined as
 * $$P(E)=\sum_{x\in E} f(x)\,.$$

So, the probability of the entire sample space is 1, and the probability of the null event is 0.
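Both facts follow directly by summing f over the event. A minimal sketch of the definitions above, assuming a particular (biased) pmf purely for illustration:

```python
from fractions import Fraction

# An assumed pmf f on a finite sample space Omega (a biased die, for illustration).
Omega = {1, 2, 3, 4, 5, 6}
f = {x: (Fraction(1, 4) if x == 6 else Fraction(3, 20)) for x in Omega}

assert all(0 <= f[x] <= 1 for x in Omega)  # property 1: f(x) in [0, 1]
assert sum(f.values()) == 1                # property 2: f sums to 1 over Omega

def P(E):
    """P(E) = sum of f(x) over x in E, for an event E (a subset of Omega)."""
    return sum(f[x] for x in E)

print(P(Omega))      # 1 : the entire sample space
print(P(set()))      # 0 : the null event
print(P({2, 4, 6}))  # the probability of an even outcome
```

Any subset of the sample space counts as an event, and its probability is obtained the same way, whatever the pmf.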

The function $$f(x)\,$$ mapping a point in the sample space to the "probability" value is called a probability mass function, abbreviated pmf. The modern definition does not try to answer how probability mass functions are obtained; instead it builds a theory that assumes their existence.