Talk:Examples of Markov chains/Archive 1

Link to "Markov chain example"
Hello all. It was I who originally added the link to "Markov chain example" from Markov chain, so I claim most of the responsibility for this article being empty(ish) for so long. :-) I have finally added a worked example to the page! I'm not an expert on Markov chains (I've only had a short introduction in linear algebra), so some peer review of my contribution would be most welcome.

I have moved some of the previous content of the page here for discussion:


 * [This page has so little content now that I trust adding this comment to this page will not upset much. The statement above may not be true: one could construct a game in which the present move is determined not only by the present roll of the die and the current state, but also by the history of rolls of the die on previous turns.


 * This page could be very good if it were called Examples (plural) of Markov chains and listed a variety of examples.]
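The point in the bracketed comment above (that a game whose moves depend on the history of earlier die rolls would not be a Markov chain) can be sketched in code. This is a hypothetical board game invented purely for illustration; the board size, the wrap-around rule, and the function names are my assumptions, not anything from the article.

```python
import random

# Hypothetical 6-square board (squares 0..5), used only to illustrate the
# Markov property: the next square depends ONLY on the current square and
# the present die roll, never on earlier rolls.
NUM_SQUARES = 6

def next_square(current, roll):
    # Transition rule: move forward by the roll, wrapping around the board.
    # Only (current, roll) appear here -- no history -- so this is Markov.
    return (current + roll) % NUM_SQUARES

def play(turns, seed=0):
    rng = random.Random(seed)
    square = 0
    for _ in range(turns):
        roll = rng.randint(1, 6)            # present roll of the die
        square = next_square(square, roll)  # state update uses no history
    return square
```

By contrast, a rule such as "move backward whenever the previous two rolls were equal" would need the roll history as an input, and the sequence of squares alone would no longer form a Markov chain.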


 * Please add more examples.


 * I would like to add examples like 1. customers brand loyalty, 2. study of life of newspapers subscriptions etc.


 * I don't know much about these Markov chains, and if anyone would like to help me out, please mail me at rohang82@hotmail.com (thanks)


 * Please add further examples (short descriptions like snakes and ladders, or longer worked examples). Customer brand loyalty and newspaper subscription examples could well be possible, if there's someone who knows enough about them to contribute examples!


 * I also agree that "Examples of Markov chains" is a better title for the article.--Ejrh 04:14, 18 December 2003 (UTC)


 * Moved. --Ejrh


 * PS. A special request to anyone who is good at mathematical writing: please fix up the "steady state vector" section of the weather example. I kind of ran out of inspiration at that point.  :-/--Ejrh 04:18, 18 December 2003 (UTC)

Possible copyright violation
The weather example is straight from []. 24.255.46.150 08:21, 5 May 2006 (UTC)


 * Thanks -- I'll deal with this by either giving a new example or requesting this page be deleted. --Richard Clegg 09:17, 6 May 2006 (UTC)


 * That page is copied from Wikipedia, not the other way around. Note that his formula images use Wikipedia's typesetting and all formulas elsewhere on the page use a different style. And guess what... at the bottom of the page he even cites Wikipedia, though he doesn't point to the right page or mention that the content was copied verbatim. Fredrik Johansson 10:51, 6 May 2006 (UTC)


 * Oops -- *blush* Thanks. Apologies. Well, I guess we can consider that sorted then. --Richard Clegg 11:04, 6 May 2006 (UTC)

Directions of Markov chains
The examples on this page are given as the transpose of what I am used to as normal practice. On the Wikipedia page about Markov chains, and in all textbooks I know, Markov chains are formulated using left row vectors, not right column vectors, with the transition matrix as the transpose of what you have given. I imagine that there are textbooks using the other formulation. However, I think it is confusing to use both notations on Wikipedia without at least some acknowledgement of the difference. --Richard Clegg 09:02, 28 March 2006 (UTC)
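The two conventions described above give the same distributions, differing only in whether the transition matrix acts on the right of a row vector or on the left of a column vector. A minimal sketch, assuming a made-up two-state weather chain (the numbers are illustrative only, not taken from the article):

```python
import numpy as np

# Hypothetical two-state chain (e.g. sunny, rainy); entries are made up.
# Row-stochastic form: each ROW sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Distribution over states as a ROW vector (start: certainly in state 0).
x = np.array([1.0, 0.0])

# Left row-vector convention: x_{n+1} = x_n P.
x_next_row = x @ P

# Right column-vector convention: x_{n+1} = P^T x_n, where the transposed
# matrix P^T is COLUMN-stochastic (each column sums to 1).
x_next_col = P.T @ x

print(x_next_row)                           # [0.9 0.1]
print(np.allclose(x_next_row, x_next_col))  # True
```

So a reader moving between the two pages only needs to transpose the matrix and swap row vectors for column vectors; the underlying chain is identical either way.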


 * I agree and think this should be made a high priority. It thoroughly confused me whilst researching Markov chains for an assignment, particularly since the Wikipedia Markov article refers to it the opposite way yet links to this page. -- Jono — Preceding unsigned comment added by 203.206.93.151 (talk) 15:12, 31 May 2006 (UTC)