Talk:Bit error rate

Packet Error Ratio
Isn't PER defined as the number of incorrectly received data packets divided by the total number of transmitted (not received) packets? --Richardtorres314 (talk) 03:55, 23 June 2016 (UTC)

Test time
How long would it take to test a 56kb/s modem connection? I estimate several days.
 * --TerrorBite 30/1/06 13:03 AEST

It depends on the error rate and the data rate. A rough rubric is to transmit 10 times the inverse of the expected BER. For example, if you expect a BER of 1E-6 (one error in a million bits), then test 10 million bits. This should produce about 10 errors, which is enough for the calculated BER to have some significance; you need a significant number of errors to have confidence in the result. Most people expect modem connections to be error free, which is impossible to verify by measurement. Therefore, a quasi-error-free level, such as 1E-9 (one error in a billion bits), is used instead. How long would it take to communicate 10 billion bits over a 56 kb/s modem connection? About 48 hours. (Jim Waschura, SyntheSys Research, Inc.)
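The rule of thumb above (send about 10 / BER bits so that roughly ten errors are expected) can be sketched as a small calculation; the function name and the default of ten expected errors are illustrative choices, not anything standardized:

```python
def ber_test_seconds(target_ber: float, bit_rate_bps: float,
                     expected_errors: int = 10) -> float:
    """Seconds needed to transmit expected_errors / target_ber bits.

    Rule of thumb: accumulate ~10 errors so the measured ratio is
    statistically meaningful (assumption: errors are independent).
    """
    bits_needed = expected_errors / target_ber
    return bits_needed / bit_rate_bps

# Quasi-error-free check at BER 1e-9 over a 56 kb/s modem link:
hours = ber_test_seconds(1e-9, 56_000) / 3600
print(f"{hours:.1f} hours")  # about 50 hours, in line with the ~48 h estimate above
```

At higher line rates the same confidence level comes much faster: a 10 Gb/s link needs only about one second to send the same 10 billion bits.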

Bit error rate
Isn't it supposed to be bit error rate? I have never heard the term bit error ratio. I tried to move the page but it was unsuccessful.

These are synonymous terms. Each is used in different contexts. (Jim Waschura, SyntheSys Research, Inc.)

berriz (talk) 13:48, 25 November 2007 (UTC) I was wondering the same thing, so I searched for both terms in the Wikipedia search box. I found out that they are not precisely synonyms, and although I'd never heard the term "bit error ratio" before either, it's easy to conclude that "bit error ratio" is actually the more correct name for that relation, since it's unitless. A "rate" is a specific type of "ratio".

--71.10.226.43 (talk) 16:46, 11 September 2009 (UTC)Another way to look at it is that 'rate' usually implies a time-dependent function. A valid bit error rate might be 1 bit/sec, but it has no meaning unless the bit rate is also given. Bit error ratio doesn't have that problem.

Normal practice is to express results as the ratio, but use of the term 'bit error rate' is a habit hard to kill despite deprecation in international standards for at least 25 years. (Check out ITU.)

Move requested

 * The following discussion is an archived discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section. 

The result of the move request was page moved. SchuminWeb (Talk) 05:05, 30 December 2009 (UTC)

Bit error ratio → Bit error rate — Bit error rate is 3.3 times as common as Bit error ratio according to books.google.com, and 14.6 times as common according to scholar.google.com ! Mange01 (talk) 02:07, 21 December 2009 (UTC)
 * Support per nom. "Bit error rate" is the phrase used in all the technical documents I use from day-to-day, even though it may not be the approved academic term.  Tevildo (talk) 21:42, 21 December 2009 (UTC)


 * The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page. No further edits should be made to this section.

Good connection?
Isn't it presumptuous to say "On good connections the BER should be below 10^-9."? A good connection is a subjective decision...

--71.10.226.43 (talk) 16:53, 11 September 2009 (UTC) Yes, especially since different communications technologies and channels have vastly different nominal capabilities and requirements. Old microwave telephony targeted 10^-6, early fibre optics 10^-9, present 10 Gb/s optics 10^-12, and computer clock circuits 10^-15 or even 10^-18.

Equation terms defined?
The equations are a little too vague for me. When measuring the BER in a real system, what numbers do you "plug in" to the equation? (And which equation should you use?) I think I can figure out symbol time and sampling time (though I'd like them clarified to be sure), but I have no idea what "bit energy" etc. are. Could someone please clear this up with a specific example? Thanks!
 * do your own homework. 128.128.98.46 (talk) 17:28, 30 January 2009 (UTC)

Bit error ratio vs. bit error rate
pointed out what appears to be a reliable ref which indicates BER has two possible meanings. I've incorporated this new information into the lead. More work is required to fully incorporate it and it is messy. Any comments on the meaning of BER or ideas for revising the article to appreciate two possible meanings? ~KvnG 17:01, 16 February 2015 (UTC)


 * I would like to point out an issue in these sentences: "The bit error ratio (also BER) is the number of bit errors divided by the total number of transferred bits during a studied time interval. Bit error ratio is a unitless performance measure, often expressed as a percentage." It follows directly from that definition that the units are errors per bit, so it is not unitless. See this page: "A quotient of two quantities that are measured with different units is called a rate.[5]". — Preceding unsigned comment added by 173.80.251.29 (talk) 16:24, 2 October 2021 (UTC)

Math nonsense
Mathematical draft is not a phrase that should ever be used and the term A is never mentioned or defined. This page is garbage. 151.197.173.92 (talk) 01:35, 5 March 2019 (UTC)