Talk:Chapman–Kolmogorov equation

I'm guessing that the Chapman in the title might be Sydney Chapman? -- The Anome 19:52, Jan 26, 2005 (UTC) Amended, from http://members.aol.com/jeff570/c.html --Adoniscik 17:17, 3 November 2005 (UTC)

Merge with master equation
The two terms are not the same: in the theory of continuous-time Markov chains (CTMC), what is described under "master equation" is called the "balance equation".

Further, it is important, at the least, to ensure that both terms are "searchable", since some users may be familiar with one term but not the other, even though the two are related and both potentially useful to the same person. I would suggest leaving the articles as they are and having each mention and/or link to the other.

Undefined variables
In the Application to Markov chains section, the variable $$s$$ is not defined anywhere. —Preceding unsigned comment added by 46.115.120.113 (talk) 16:18, 6 June 2010 (UTC)

Proof
What exactly is the form for the proof of the Chapman-Kolmogorov equation?

Isn't it just marginalization?

Does it start with a definition of a conditional probability? Then it requires the law of total probability and we are done? What else is needed?

All the proofs I can find assume some Markovian process.

Does someone know a reference to Chapman's or Kolmogorov's work w.r.t. this?

An example for a non-Markovian process where this equation is not merely the law of total probability would also be clarifying.

Anne van Rossum (talk) 12:56, 21 December 2014 (UTC)

Whether the proof is just marginalization depends on what one calls "the Chapman-Kolmogorov Equation". In "Handbook Of Stochastic Methods" by C.W. Gardiner, second edition, pages 43-44, marginalization is a proof for the equation:

$$ p(x_1,t_1) = \int dx_2 \ p(x_1,t_1; x_2, t_2) $$

which corresponds to the equation in the current article given by:


 * $$p_{i_1,\ldots,i_{n-1}}(f_1,\ldots,f_{n-1})=\int_{-\infty}^{\infty}p_{i_1,\ldots,i_n}(f_1,\ldots,f_n)\,df_n$$

However, Gardiner does not call that equation "the Chapman-Kolmogorov Equation".

Marginalization also proves the equation

$$ p(x_1,t_1\ |\  x_3,t_3) = \int dx_2\ p(x_1,t_1; x_2,t_2|x_3,t_3) = \int dx_2\ p(x_1,t_1| x_2,t_2; x_3,t_3)\ p(x_2,t_2 | x_3,t_3) $$

Gardiner says "This equation is also always valid. We now introduce the Markov assumption.  If $$ t_1 \ge t_2 \ge t_3 $$ we can drop the $$ t_3 $$ dependence in the double conditioned probability and write


 * $$ p(x_1,t_1 | x_3,t_3) = \int dx_2\ p(x_1,t_1| x_2,t_2) p(x_2,t_2| x_3,t_3) $$ which is the Chapman-Kolmogorov equation."

So Gardiner's definition of the Chapman-Kolmogorov equation is more restrictive than the definition given in the current article.

Tashiro~enwiki (talk) 09:31, 29 November 2015 (UTC)
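For what it's worth, the Markov form of the equation discussed above has a simple discrete-time analogue that is easy to check numerically: for a time-homogeneous Markov chain with one-step transition matrix P, the Chapman–Kolmogorov equation says the (m+n)-step transition probabilities factor through an intermediate time, i.e. P^(m+n) = P^m · P^n. Below is a minimal sketch (the 2-state matrix is just an illustrative example, not from any of the sources cited above):

```python
# Discrete-time Chapman-Kolmogorov check:
# P(i -> j in m+n steps) = sum_k P(i -> k in m steps) * P(k -> j in n steps),
# i.e. P^(m+n) = P^m @ P^n, for a time-homogeneous Markov chain.

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, n):
    """n-th matrix power of P (n >= 0), starting from the identity."""
    size = len(P)
    result = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = matmul(result, P)
    return result

# An arbitrary 2-state transition matrix (rows sum to 1).
P = [[0.9, 0.1],
     [0.4, 0.6]]

lhs = matpow(P, 5)                        # five-step transition probabilities
rhs = matmul(matpow(P, 2), matpow(P, 3))  # sum over the intermediate state at step 2

# The two agree entry by entry, up to floating-point rounding.
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

Note that the check relies on time-homogeneity only for notational convenience; with time-dependent matrices the same identity reads P(t1, t3) = P(t1, t2) · P(t2, t3), which is exactly the conditional-probability form Gardiner gives.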

Easier definition
I've added an item to Further reading (Introduction to Probability Models) where the definition is clear with the proof on the same page. I'm not sure if it corresponds with what is written in this article, because I wasn't able to grasp the definition here. — Preceding unsigned comment added by 2A00:1028:83D4:42DE:225:22FF:FEF6:293 (talk) 14:57, 20 March 2016 (UTC)