Talk:Comparison of synchronous and asynchronous signalling

The description of "synchronous transmission" provided in the article is inconsistent with various physical synchronous interfaces, e.g., SMPTE-310M. While a synchronizing signal is required (and provided) to receive the transmission, it need not travel "on another wire" -- the actual synchronizing signal can be embedded within the synchronous signal.

From SMPTE/EBU Task Force for Harmonized Standards for the Exchange of Program Material as Bitstreams:

"Synchronous transmission" describes a transmission technique that requires a common clock signal (or timing reference) between two communicating devices to coordinate their transmissions. This common reference can be embedded within the signal, or can physically travel along with the signal on a similar or different medium.

"Asynchronous transmission" describes any transmission technique that does not require a common clock between the two communicating devices, but instead derives timing signals from special bits or characters (e.g. start/stop bits, flag characters) in the data stream itself. The essential characteristic of time-scales or signals such that their corresponding significant instants do not necessarily occur at the same average rate.

Perhaps confusing the issue are cases of non-linear signals, e.g., when compressed video is carried, wherein the video time base has an asynchronous relationship to the interface clocks.


 * I'm confused. When I read that SMPTE/EBU document, I see the same definition for "asynchronous transmission", but I don't see any definition in that document for "synchronous transmission".
 * Instead I see:
 * Synchronous: A term used to describe a transmission technique that requires a common clock signal (or timing reference) between two communicating devices to coordinate their transmissions.
 * E.3.3.3. Synchronization: Streaming data requires timing synchronization between the transmitters and receiver(s). This timing synchronization may be achieved through either the recovery of timing references embedded within the stream, or through the distribution of a system-wide clock to all participating devices.
 * That document does talk a lot about embedding timing reference in the data stream.
 * But I don't see anywhere that it defines that as "synchronous".
 * I'm also going to argue that the "timing synchronization" mentioned can be achieved through timing references embedded within an asynchronous transmission stream.
 * I can't find even one place where that document uses the phrase "common reference can be embedded within the signal".
 * Therefore, I think "single wire synchronous transmission" should really be better classified as "asynchronous".
 * If you expand the definition of "synchronous" to include not only two-signal protocols but also single-signal protocols, then what is left for the word "asynchronous" to describe?
 * --68.0.124.33 (talk) 03:26, 7 May 2008 (UTC)

Let me explain my view, hope it helps:

The "asynchronous" character transmission is asynchronous with respect to the charcters, these may arrive at any time; there is no need to have a common clock. However, on the bit level, a short-term synchronisation is required, which is thus correctly termed "plesiochronous", which means nearly synchronous. Better would be piecemeal synchronous. Usual implementations have an interface, where just the binary values of the bits are transmitted, and the data terminal has the devices to extract the databits from this raw demodulated data stream. The charcaters are received practically at the same time the sender did send it, plus some delay for traveling (which can be seconds for satellite connections). Note that both sides have to agree on a timing parameter, i.e. the bit time.

The "synchronous" character transmission used in e.g. HDLC is more complicated, and uses bit-stuffing and byte-stuffing to ensure enough clock information so that the receiver can have use a clock that is synchronous, i.e. has exactly the same frequency for more than just a single character. When this transmission was used in computers, the modem recovered the clock, and uses two lines, a clock and a data line. Thus, we have an explicit clock synchronised with the sender on the interface. However, this was not really necessary, and I am not really shure it always was used, because the data terminal could also recover the clock from the de-modulated signal, as above. It had to do a similar operation anyhow on charcter-level, i.e. remove the stuffing bytes. And after removal of bit and byte stuffing, the byte stream of the HDLC synchronous transmission was as asynchronous as that of the previous case. Note that receiver could automatically determine the bit timing, even if normally this was a parameter to be set.

So the EBU document is not really a definition of terms, but highlights the features of their interest, which are also commonly used: in a synchronous system, you have a clock line; if not, it is asynchronous. Which is simple, but not really useful. Then the raw data stream of a synchronous modem would be asynchronous, as would every optical connection and practically every serial communication that uses a single line only. But in fact e.g. Ethernet and most other newer serial communication systems use a synchronous method for a block of data, e.g. a phase-encoded signal; thus it is a block-synchronous system. And as far as I know, ADSL etc. set up a synchronous clock as long as the modem is "online".

There is also a truly asynchronous system that, because of the sloppy use of the term "asynchronous", calls itself "self-timed" or similar. In this technology, the reception of each bit is signalled on an extra return line (normally), so there is no timing agreement at all.

To summarize, it must be said clearly on which level the transmission is synchronous or asynchronous: Bits, bytes, blocks ?

Rainglasz (talk) 16:22, 3 September 2008 (UTC)

That seems to make sense. So ... "asynchronous" means the transmitter can pause (at least between bytes) for an arbitrarily long time -- a time that is perhaps not even an integer multiple of the bit time? While "synchronous" means the transmitter must transmit constantly at a constant bit rate, and even its "idle time" must be filled with an exact integer multiple of 8 bit times?

Alas, I hesitate to just cram these definitions into the article, because of that pesky Verifiability policy. I wish there were some references for this definition.

Let's focus on bit timing for now, trying to decide when to sample each bit.

Several categories of serial communication links, in order of increasing timing sensitivity:
 * a "I'm ready" signal going in the opposite direction as the data signal, telling the transmitter when the receiver is ready for the next bit. There is no timing required. This is called "asynchronous" in delay insensitive circuits and "synchronous" in the MISO link of Serial Peripheral Interface Bus.
 * a "here's the next bit" clock signal going in the same direction as the data signal, telling the reciever when to sample the next bit. The transmitter must delay long enough to give the receiver enough time to grab that bit before going on to the next bit; otherwise there is no timing required. This is called "synchronous" in the MOSI link of Serial Peripheral Interface Bus.
 * one signal -- asynchronous start-stop -- once the transmitter sends the idle-to-start transition, it must send all the bits of a character at precise time intervals. The receiver must also have a clock within 5% or so of the transmitter's clock. However, the interval between characters has no particular timing -- it doesn't even need to be an integer number of bit times; and it's OK if the receiver and transmitter disagree over whether an idle interval is 20 bit times or 21 bit times.
 * one signal -- where the transmitter sends bits with fixed bit times and bytes with fixed byte times. If there is no data to send, the transmitter sends complete idle bytes. Once the transmitter has started transmitting an idle byte, even when new urgent data needs to be sent, it always clocks out the complete idle byte before sending data. The receiver must somehow keep its bit-sampling clock at exactly the same frequency as the transmitter's clock -- they must both agree that idle intervals are always integer multiples of a complete byte in order to avoid bit slip. Typically a line code is used that makes clock recovery easier. The recovered clock is used much like the "here's the next bit" signal in the two-signal system above.
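To put a number on the "within 5% or so" figure in the start-stop case: if the receiver resynchronizes on each start edge and samples every bit at its nominal centre, the accumulated clock error over one frame must stay under half a bit time. A back-of-the-envelope check (my own arithmetic, not from a cited source):

```python
# Rough clock-tolerance bound for 8-N-1 start-stop framing. The
# receiver resynchronizes on the start edge and samples each of the
# 10 bits at its nominal centre; the last sample (the stop bit)
# falls 9.5 bit times after the edge. Sampling stays inside the
# right bit cell while the accumulated error there is < 0.5 bit.

bits_per_frame = 10                   # start + 8 data + stop
last_sample = bits_per_frame - 0.5    # 9.5 bit times after the start edge

max_offset = 0.5 / last_sample        # combined offset of both clocks
print(f"max clock offset = {max_offset:.1%}")   # about 5%
```

Note this bound is on the combined frequency offset of transmitter and receiver, which is why practical per-device tolerances are quoted somewhat tighter.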

Well, I hoped that categorizing things in this way would help. But now I'm not so sure. --68.0.124.33 (talk) 06:57, 11 September 2008 (UTC)


 * This needs work. The basic idea of modern synchronous transmission is that the signal must contain some form of clock information, but preferably only a small fraction of the signal bandwidth should be used up by the clock signal. There are many ways to encode the clock signal:
 * a periodic sync burst (NTSC color TV uses this)
 * watching bit transitions, correcting the receive clock slightly whenever a bit transition is observed, and hoping that too many bits with the same value don't come through in a row (modems use "scramblers" to improve the odds on this)
 * providing a bit transition on every bit (this wastes half the bandwidth but is very robust; bar codes and swiped magnetic cards use this.)
 * Those are just some examples. There are other approaches, such as those used on magnetic disks and optical media. --John Nagle (talk) 02:47, 13 March 2009 (UTC)

Advantages and disadvantages
I have removed this table from the article because it is entirely uncited and wrong or at least misleading. Synchronous systems have simple and inexpensive transmitters and receivers but require an additional wire for clock. Whether this requirement makes the overall system more expensive or complex is application dependent. ~Kvng (talk) 22:48, 26 January 2020 (UTC)