Talk:Splaysort

Worst Case Performance
I am quite sure that the worst case running time has to be O(n²), as an already sorted input will generate a degenerate splay tree, in which each insert will take O(n) time.

But the alleged worst case running time of O(n log n) is used to compare splaysort to similar sorting algorithms, so it does not look like a mere oversight, and since I am quite the noob, studying CS in my second semester, I am not sure enough of my correction to actually make the edit.

If someone more experienced could please take a look at this and either explain my mistake or edit the article, I would be most grateful.

--79.213.241.100 (talk) 23:46, 9 June 2016 (UTC)
 * Only the first insert into a degenerate splay tree takes the full linear time. After that insert, the tree will have half its former height, etc. The amortized time per operation is logarithmic, meaning that if you add up a sequence of n operations, the total time is O(n log n). —David Eppstein (talk) 00:02, 10 June 2016 (UTC)
 * But if each new key is inserted as the only node in the empty subtree of the root of the degenerate tree, the tree will stay degenerate. And this is exactly what happens over and over again with already sorted input, which would therefore be the worst case and take O(n²) time, right?--79.213.241.100 (talk) 00:37, 10 June 2016 (UTC)
 * So sorry, I eventually got it.--79.213.241.100 (talk) 00:43, 10 June 2016 (UTC)
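 * For anyone else following this thread, the amortized behavior can be checked empirically. The sketch below (a standard top-down splay insert, with names of my own choosing, not taken from the article) counts splay steps while inserting already-sorted keys; because each new maximum is splayed to the root in one step, the total work stays far below quadratic, consistent with the O(n log n) amortized bound discussed above:

```python
# Minimal top-down splay tree with an operation counter, to sanity-check
# that sorted input does NOT cost O(n^2) total despite the degenerate shape.
# Sketch for illustration; identifiers here are hypothetical, not from the article.
import math

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

steps = 0  # counts splay loop iterations, a proxy for work done

def splay(root, key):
    """Top-down splay: bring the node nearest to `key` to the root."""
    global steps
    if root is None:
        return None
    header = Node(None)          # temporary holder for the split-off trees
    left = right = header
    t = root
    while True:
        steps += 1
        if key < t.key:
            if t.left is None:
                break
            if key < t.left.key:          # zig-zig: rotate right
                y = t.left
                t.left, y.right = y.right, t
                t = y
                if t.left is None:
                    break
            right.left = t                # link into the right tree
            right = t
            t = t.left
        elif key > t.key:
            if t.right is None:
                break
            if key > t.right.key:         # zig-zig: rotate left
                y = t.right
                t.right, y.left = y.left, t
                t = y
                if t.right is None:
                    break
            left.right = t                # link into the left tree
            left = t
            t = t.right
        else:
            break
    left.right, right.left = t.left, t.right   # reassemble
    t.left, t.right = header.right, header.left
    return t

def insert(root, key):
    if root is None:
        return Node(key)
    root = splay(root, key)
    node = Node(key)
    if key < root.key:
        node.right, node.left, root.left = root, root.left, None
    else:
        node.left, node.right, root.right = root, root.right, None
    return node

n = 4096
root = None
for k in range(n):               # already-sorted input, the feared worst case
    root = insert(root, k)
print(steps, "splay steps for", n, "sorted inserts")
assert steps <= 4 * n * math.log2(n)   # far below the quadratic n*(n-1)/2
```

   Each insert of a new maximum finds the root's right child empty after one comparison, so sorted input is in fact a cheap case here; the linear cost shows up only once, on the first access into the deep left spine, after which splaying halves the depth.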