Talk:Turbo code

turbo a "misnomer"
I think the word turbo generally means "fast" in popular English, so this is not really a misnomer unless you understand what a "turbo" is in car mechanics. DarkShroom (talk) 13:19, 2 August 2014 (UTC)

Latency
"For satellite use, this is not of great concern, since the transmission distance itself introduces latency due to the limited speed of light." - Can the writer of this sentence justify this? Considering how fast the speed of light is, I don't think this can really contribute to latency. - Yongqli


 * Well, looking at the geosynchronous article here on Wikipedia gives the distance to a GEO satellite (approx. 36000 km), and the speed of light is 300000 km/s. The round-trip distance is twice the altitude, or 72000 km, and 72000 / 300000 = 0.24 s (or 240 msec), a significant delay in many types of communication. europrobe 10:38, 14 January 2006 (UTC)


 * I believe the reference is to deep-space satellite communications; the latency introduced in that case is on the order of minutes to hours. 15 April 2007
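The GEO round-trip figure quoted above is easy to check; a quick sketch using the approximate numbers from the comment (not precise orbital values):

```python
# Back-of-the-envelope round-trip latency to a geostationary satellite,
# using the approximate figures quoted above (not exact orbital values).
SPEED_OF_LIGHT_KM_S = 300_000   # approx. speed of light
GEO_ALTITUDE_KM = 36_000        # approx. altitude of a GEO satellite

round_trip_km = 2 * GEO_ALTITUDE_KM              # up and back down
latency_s = round_trip_km / SPEED_OF_LIGHT_KM_S
print(latency_s)  # 0.24
```

That 240 ms is one hop; a request/response over the link doubles it again, which is why GEO links feel sluggish even at light speed.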

Error "correcting" code?
Is the Turbo code actually an error "correcting" code? To my understanding, separate demodulation, hard decision and then error "correction" would justify that term. I believe, instead, "error control" code has been used. Also, I just took from a lecture that LDPC codes will also have an error floor, although orders of magnitude lower. It can still be a problem, for example in fiber optic links, where the BER can be designed as 10^-14 or so. I'm not expert enough to edit the article, but I thought I'd put it here for discussion. If needed, Biglieri's book on "coding for wireless channels" should provide references.

Internetexploder (talk) 14:16, 23 September 2008 (UTC)

The Shannon Limit
User:HughSW -- hats off to your addition How Turbo codes work -- really nice work! technopilgrim

Compared with Reed-Solomon
I am interested to know if turbo code is more efficient than Reed-Solomon error correction for those areas that Reed-Solomon is particularly used for. For example, given the same number of additional bits, is turbo code better able to handle errored signals? Is turbo code better able to handle missing signals? Is turbo code well suited to 'bursty' errors? Also, on modern desktop CPUs, which is most time-efficient for encoding and decoding? --Yamla 18:08, 2005 Mar 21 (UTC)

In terms of encoding efficiency, Turbo Codes are the best known (as mentioned in the first paragraph). Bursty errors are usually handled by interleaving/rearranging the bits (as in Reed-Solomon's usage on CDs). Sorry I can't tell you which is most time-efficient. 194.106.59.2 20:37, 28 Mar 2005 (UTC)

How turbo codes work
I think the emphasis of the How turbo codes work section is wrong: the focus is on soft-bits, while this is not what makes Turbo codes different; what is different is that two codes in parallel (interleaved) are used. The soft decoding was already known before ("It is well known that soft decoding is better than hard decoding"). Emvee 20:43, 16 July 2005 (UTC)


 * This is true. (LDPC improves upon turbo coding for the same reason.)  --Piet Delport 20:34, 6 January 2006 (UTC)


 * Agreed; that was completely wrong. Soft-decision decoders are used routinely with just about every kind of code there is. I've removed that section. 66.30.10.35 01:09, 29 January 2007 (UTC)
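To make the "two codes in parallel" point concrete, here is a minimal sketch of a rate-1/3 parallel-concatenated (turbo) encoder: the same data block feeds two identical recursive systematic convolutional (RSC) encoders, the second one through an interleaver. The RSC polynomials (1, 5/7 in octal) and the tiny fixed interleaver are illustrative choices, not any particular standard's, and trellis termination is omitted for brevity.

```python
def rsc_encode(bits):
    """Parity bits of a memory-2 recursive systematic convolutional code
    with feedback polynomial 1+D+D^2 (octal 7) and feedforward 1+D^2 (octal 5).
    Trellis termination/flushing is omitted in this sketch."""
    s1, s2 = 0, 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # recursive (feedback) input to the register
        parity.append(a ^ s2)    # feedforward parity output
        s1, s2 = a, s1           # shift the register
    return parity

def turbo_encode(bits, interleaver):
    """Rate-1/3 output: the systematic bits, parity from the data as-is,
    and parity from the interleaved data -- two codes in parallel."""
    parity1 = rsc_encode(bits)
    parity2 = rsc_encode([bits[i] for i in interleaver])
    return bits, parity1, parity2

u = [1, 0, 1, 1]
pi = [2, 0, 3, 1]                # toy fixed interleaver for a 4-bit block
print(turbo_encode(u, pi))       # ([1, 0, 1, 1], [1, 1, 0, 0], [1, 0, 1, 0])
```

The decoder's gain comes from exchanging soft information between the two component decoders, but structurally it is this parallel concatenation through an interleaver that defines a turbo code.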

Nitty gritty
Wow this article has the lame term "nitty-gritty" twice. (Once with a hyphen and once without, heh.)
 * Yeah, that seriously has to go.[update] Alright, I got rid of it.--Soban 15:00, 1 August 2006 (UTC)

Typical number of iterations?
The Wikipedia article says "typically in 15 to 18 cycles", but both articles referenced at the end say "typically 4 to 10". Which is more correct?

My gut feel is that the number depends on some external factor, like the number of errors, in which case the statement should probably be replaced entirely with an explanation of what factors influence the iteration count, and why. --Piet Delport 20:15, 6 January 2006 (UTC)

Sometimes the definition of what constitutes an 'iteration' is blurred. Technically, an iteration is the combination of two component decode operations (MAP/Log-MAP/SOVA etc.), one using the non-interleaved parity bits and one using the interleaved ones (i.e. 6 iterations involve 12 component decodes). Sometimes people use the term iteration to just mean the number of component decodes. It is not uncommon for a turbo decoder to feature an 'early termination' option, where you compare the hard output of the previous N iterations (where N is generally between 2 and 3). If the outputs are the same (i.e. they have converged), you can generally stop decoding with very little effect on BER. In high-SNR channels this is good, as the decoder generally converges quickly to the right answer (i.e. more iterations don't tell you anything new). Typically it's also good in low-SNR channels, for the opposite reason (the decoder quickly converges to the wrong answer, so more iteration doesn't help either). In mid-SNR channels the decoder usually runs to its maximum number of iterations anyway. The maximum number of iterations required for a given application is generally derived from the expected SNR of the channel and the required bit or frame error rate target for the link. --miterdale 12:51, 15 January 2006 (UTC)
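The early-termination rule described above can be sketched generically: run full iterations, keep the hard decisions from each, and stop once the last N agree. The component decoder below is a stand-in stub (a real one would be a MAP/Log-MAP/SOVA pair); only the stopping logic is the point of the sketch.

```python
def iterative_decode(run_iteration, max_iters=8, agree_n=2):
    """Run full turbo iterations (each = two component decodes) until the
    hard-decision outputs of the last `agree_n` iterations match, or until
    `max_iters` is reached. `run_iteration(i)` is assumed to return that
    iteration's hard-decision bit vector."""
    history = []
    for i in range(max_iters):
        history.append(run_iteration(i))
        last = history[-agree_n:]
        if len(last) == agree_n and all(h == last[0] for h in last):
            break                # converged: outputs stopped changing
    return history[-1], len(history)

# Stand-in "decoder": pretends the hard decisions settle after iteration 2.
def fake_iteration(i):
    return [i % 2, 0, 1, 1] if i < 2 else [1, 0, 1, 1]

bits, iters_used = iterative_decode(fake_iteration)
print(bits, iters_used)   # [1, 0, 1, 1] 3
```

With the stub, the decoder stops after 3 of the allowed 8 iterations, which is exactly the high-SNR (and, per the comment above, low-SNR) saving the early-termination option buys.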

Error Floor
In this article, no distinction is made between Turbo codes and Turbo product codes. Being no expert in the field, I'm not sure which is discussed here. I am, however, aware of one important limitation of Turbo codes: they exhibit an error floor at high SNRs. Turbo Product codes (from my understanding) exhibit no such error floor, and LDPC codes definitely do not. The orphaned article Error floor mentions that Turbo codes have an error floor (and, incorrectly, that LDPCs do), but other than that I find no mention of error floors anywhere in Wikipedia. I'm not knowledgeable enough to fix that, but if anyone else is looking for a good project... 129.128.210.68 17:23, 17 April 2007 (UTC)

Example
Really great article, but for someone coming out of the field, I would love to see some example to see how this works...

Product Codes
Great article. However, you only mention turbo convolutional codes. There is also a class of turbo product codes mentioned in IEEE 802.16. They are two- (or three-) dimensional Hamming codes with parity that use the same style of iterative soft decoding that turbo convolutional coding does. 71.112.157.181

Examples
This article lacks examples. If anyone has some, please add them. —Preceding unsigned comment added by 217.198.239.190 (talk) 09:57, 12 May 2008 (UTC)

Boundary conditions for divergence and the probability of failure
In a nice analogy of solving crossword puzzles, the article states, "Based on this new knowledge, they both come up with updated answers and confidence ratings, repeating the whole process until they converge to the same solution."

Or until they clearly diverge, with no solution possible. The failure case ought to be considered, since promising performance close to the Shannon limit without any risk should not be accepted without an estimate of the probability of failure. Failure in a serial communications algorithm could mean loss of an entire message block.

Of course, the probability of failure is the number of initial states for which the two parity calculations do not converge to a single hypothesis divided by the total number of initial states.

Someone who works with Turbo Codes could probably contribute a paragraph on the failure probability of this class of communication algorithms. David spector (talk) 20:14, 18 July 2009 (UTC)

Example: WTF???
It's unusual to see a hardware example used to describe what is essentially an algorithm. What would that look like in software? —Preceding unsigned comment added by 71.72.235.91 (talk) 20:19, 8 December 2009 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified one external link on Turbo code. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20130611235418/http://www.ima.umn.edu:80/csg/bib/bib16.0429hage.pdf to http://www.ima.umn.edu/csg/bib/bib16.0429hage.pdf

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at ).

Cheers.— InternetArchiveBot  (Report bug) 04:58, 10 November 2016 (UTC)
