Talk:S/PDIF/Archive 1

Obsolete?
No current device capable of processing or decoding the new generation of HD audio tracks on Blu-ray and HD DVD discs will send these tracks over optical out; they all require HDMI. Whether this is a physical bandwidth constraint or has roots in DRM I'm not sure, but either way the TOSLINK cable is clearly well on its way to the grave, considering it was only ever marketed to home theater enthusiasts and audiophiles to begin with. —Preceding unsigned comment added by Hayaku (talk • contribs) 02:45, 12 February 2008 (UTC)

Concerning "Concerning 'Obsolete?' above"
I am sorry, I am not certain what you were trying to say in the above response. "take advantage of audio equipment that is separate from the television"?? I was never referring to using the built-in TV speakers. Ideally one would have an HDMI cable running from his/her disc player to an HDMI-capable AVR, which would then split the signal into video, running it to the display device via another HDMI cable, and audio, which would be converted to an analogue signal and sent to the surrounding speakers. TOSLINK is only capable of carrying audio codecs with a bandwidth of 1.5 Mbit/s, as opposed to HDMI, which is capable of bitstreaming codecs as large as 24.5 Mbit/s.

With the introduction of Blu-ray, TOSLINK will eventually find itself phased out, as it is not compatible with the HD audio codecs of this format, and will only be of use to connect devices to pre-HDMI AVRs, which in the coming years will slowly vanish from existence anyway. TOSLINK will survive for a while yet, in the same way that most PC motherboards still have old-school printer ports when virtually all modern printers are USB 2.0. But since TOSLINK was always a boutique and niche technology, it will be made obsolete: it has already been overtaken and made redundant in the HD era of audio-visual, and is strictly speaking unsuitable for lower-end / common use too.

I just invested considerably in a home theater setup consisting of a 40" LCD panel, Blu-ray player, and 7.1 speaker system, spending several thousand dollars in the process. Since my AVR, however, is a carry-over from my old system, it has no HDMI support and as such is only capable of decoding 1.5 Mbit/s DTS, which is quite disappointing given the pedigree of the rest of my setup. When I eventually have the money to get a good HDMI AVR, it will use HDMI and HDMI alone... TOSLINK will have no place. The point I am trying to make is that since TOSLINK is inadequate to do what HDMI does, it has no place in (HD-era) home theater. Simply put, anybody setting up their own system today would be foolish to use TOSLINK instead of HDMI.

- Hayaku (talk) 02:35, 19 March 2008 (UTC)

I'd agree that it is on the decline, but I still feel that it will be quite useful for some time. I'm still seeing it in pro audio equipment and a lot of consumer a/v. Part of the reason is that these expensive HDMI toys are unaffordable to many folks. A lot of people will have something lying around that uses TOSLINK, and they will be sure to buy something that supports it (as well as HDMI for the day they can afford it ;) ). I'd also add that a lot of people still want to keep their a/v simple and have no desire for anything beyond 2 speakers. When it comes to personal electronic music recording and making the best use of a PC-based a/v system, TOSLINK is nice to have.

There is a reason that parallel ports seem obsolete but just won't go away. A lot of people need to maintain compatibility with their existing commercial equipment. Dot-matrix printers are still being used and need to be supported. —Preceding unsigned comment added by 71.107.225.216 (talk) 19:25, 22 May 2008 (UTC)

Response to Hayaku's "Obsolete" argument
TOSLINK specifically is indeed going out, and rightfully so, considering that the usual TOTX/TORX transmitter/receiver parts result in significantly more jitter in the signal than just using a coaxial cable connection (measurements with a time domain reflectometer were posted on diyhifi.org a while ago). This is an issue since most DACs are timed by a clock signal recovered from the input line by a phase-locked loop, and jitter in that clock directly results in amplitude errors during conversion. The only advantage TOSLINK has is that it breaks ground loops since it provides electrical isolation (but a properly designed system shouldn't have ground loops in the first place). But that's for TOSLINK specifically. S/PDIF, on the other hand, which is the format transmitted over both TOSLINK and the usual coax connection, is not going away for a good deal of time, despite various problems (check Hawksford's 1993 Audio Engineering Society paper "Is The AESEBU / SPDIF Digital Audio Interface Flawed ?"). Nonetheless, HDMI doesn't solve these problems and it introduces others (and in terms of jitter, it's worse than S/PDIF over coaxial and probably comparable to TOSLINK). This issue specifically is discussed at http://diyhifi.org/forums/viewtopic.php?f=2&t=1213&start=0&st=0&sk=t&sd=a If anyone has questions about my comment, feel free to contact me. ThVa (talk) 11:13, 26 May 2008 (UTC)
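The link between clock jitter and conversion error described above can be made concrete with a back-of-the-envelope bound: the amplitude error from sampling at a slightly wrong instant is at most the signal's slew rate times the timing error. This is a rough illustrative sketch, not from the cited paper; the function name and the 1 ns jitter figure are chosen for illustration only.

```python
import math

def worst_case_jitter_error(freq_hz, amplitude, jitter_s):
    """Worst-case amplitude error when a sine of the given frequency and
    amplitude is sampled with a clock displaced by jitter_s seconds.
    The peak slew rate of A*sin(2*pi*f*t) is 2*pi*f*A."""
    return 2 * math.pi * freq_hz * amplitude * jitter_s

# A full-scale 20 kHz tone sampled with 1 ns of jitter (assumed figure):
err = worst_case_jitter_error(20_000, 1.0, 1e-9)
err_db = 20 * math.log10(err)   # error level relative to full scale, ~ -78 dBFS
```

Even nanosecond-scale jitter thus puts an error floor within reach of 16-bit resolution (about -96 dBFS), which is why the PLL's recovered-clock quality matters.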

Is SPDIF the same thing as optical cable?
If so, why are they in separate articles? If not, why is there a picture of an optical cable in this article? —Preceding unsigned comment added by 86.157.49.34 (talk) 18:46, 13 October 2008 (UTC)

Digital coaxial cable
In the article it says that "any" coaxial cable could be used to carry digital signals. While this is true in principle, it is still a debatable subject with respect to the worthiness of the coaxial cables out there which claim to be specifically designed for digital transmission. From what I understand, the difference between a "normal" coaxial cable and a specialist "digital" cable is that the "digital" one claims to be made of superior materials and with better shielding etc. Whatever that means, it definitely shows up on the price tags, as the "digital" ones are far more expensive. I've heard people saying in forums that if a "normal" coaxial cable is used when transmitting digital signals, the audio coming out of the receiving end would occasionally skip, implying that the quality of the transmission cannot be guaranteed with the "cheaper" cables. I'm not particularly familiar with the S/PDIF protocol, but maybe its forward error correction is not as robust, if it exists at all? So for the sake of completeness, should we at least be stating this point in the article? Ken l lee (talk) 06:26, 21 September 2009 (UTC)


 * Any cable, even a good one, can be damaged and suffer a degree of signal reflection. Poor impedance matching (poor cable-to-circuit match) can result in a lot of signal reflection. Signal reflection can cause errors to appear. Capacitance is a huge determinant of signal integrity for high-speed data cables—a low-capacitance cable is necessary for digital signals at megahertz speeds. An undamaged cable of low capacitance and normal, decent quality, well matched with the transmitter and receiver circuits that it is connecting, will perform without failure. There's no need to spend Rolls-Royce level money on superior cables if the Volkswagen ones are capable of doing the job. Binksternet (talk) 14:45, 21 September 2009 (UTC)
 * actually, capacitance per se is not a problem, so long as it, along with the inductance (and resistance) of the coaxial cable, results in the correct impedance (75 ohms for S/PDIF over coax). It is the impedance that matters, not the capacitance. Failing to match the impedance (which would happen if you cut the capacitance without a corresponding increase in the inductance) will result in poor performance. Get the impedance right and the only limiting factors will be losses in the dielectric and increasing resistance due to skin effect as the frequency goes up. Even cheap TV aerial coax has performance vastly superior to plastic optical interconnects. DGMatic (talk) 15:02, 11 November 2009 (UTC)
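The point above — that the L/C ratio, not capacitance alone, sets the characteristic impedance — follows from the lossless transmission-line approximation Z0 = sqrt(L/C). A small sketch, with per-metre values assumed as roughly typical for a 75-ohm cable (illustrative numbers, not from a datasheet):

```python
import math

def characteristic_impedance(inductance_per_m, capacitance_per_m):
    # Lossless-line approximation: Z0 = sqrt(L/C)
    return math.sqrt(inductance_per_m / capacitance_per_m)

# Assumed ballpark values for a 75-ohm coax:
L_per_m = 375e-9   # henries per metre
C_per_m = 67e-12   # farads per metre
z0 = characteristic_impedance(L_per_m, C_per_m)  # ~75 ohms
```

Halving C without also halving L would raise Z0 by a factor of sqrt(2) and mismatch the line, which is the failure mode DGMatic describes.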

Limitation claims are misleading
"S/PDIF cannot fully decouple the final signal from influence by the properties of the source or the interconnect"

This is untrue. S/PDIF is an interface specification, not a specification for the implementation of a detector! It is a specification for how the data flows and the interconnects, not how you should detect the signal. The sources cited claim that jitter is a problem for PLL-based S/PDIF converters. This is *one* way, and a cheap way at that, to lock to and convert the signal. Most importantly, this noise-prone method of not using a local clock isn't "S/PDIF" as stated in the sentence; it's just a *way* to decode the S/PDIF signal. It's the manufacturers saving money that causes the problem, not the specification. If you look at the papers, they're ancient. I don't think any modern DAC will be unclocked, since nearly all DACs these days have DSPs.

This sentence should be changed to something like "Most low cost (PLL based, without a local clock) S/PDIF decoders cannot fully ...". Brandon.irwin (talk) 12:44, 9 January 2010 (UTC)
 * I was wondering if this was going to come up. While it is technically correct to assert that you can implement a device that will fully disregard the source clock and do its own thing, that device would not in any way be suitable for playback within a reasonable time-frame in typical S/PDIF applications.  That sentence could be rephrased to clarify that that limitation applies only under the constraint of real-time synchronous playback (typical of a DVD player, or a CD player if you don't want delays after seeking or pressing play), but the presence of the effect is not subject to cost or modernity -- only the strength of the effect is.
 * The reason that you cannot disregard the source clock is that if you use your own clock then you strike a problem when you find that you're converting samples to analogue either faster or slower than the source is presenting them to you (this is inevitable, because you're trying to tune two independent oscillators -- one being of unspecified design -- to precisely the same frequency). While you can be a million kinds of clever about what you do, any adjustment that you make in response to that situation is undeniably an influence from the source clock, because if that source clock had been different then your response would have been different.
 * I haven't gone over that in detail in the article because it seems tedious and irrelevant. I had hoped that the references would have made the situation sufficiently clear. --ToobMug (talk) 00:36, 10 January 2010 (UTC)
 * Also, word clock recovery is already described in the article, and doubtless it features in the S/PDIF specification itself as it's discussed in all sorts of datasheets and app notes. This makes it a feature of S/PDIF even if you can get by without clock synchronisation.
 * If you do have any reference designs or app notes that do away with PLL-like synchronisation then I'd be interested to see them. --ToobMug (talk) 02:57, 10 January 2010 (UTC)


 * I think I've found a clearer way to explain it, covering the ambiguous points, in this edit: . --ToobMug (talk) 23:02, 10 January 2010 (UTC)


 * I think it's great! 71.130.51.44 (talk) 23:49, 11 January 2010 (UTC)

Table "Main differences between AES / EBU and S/PDIF"
Signal levels were incorrect; I have fixed them. Source: Interfacing AES3 and S/PDIF. Nshvydky (talk) 04:08, 10 February 2010 (UTC)

Applications
Is somebody having a laugh? "Invented by Chin Mao Ing in 200 bc" —Preceding unsigned comment added by 80.177.39.178 (talk) 12:00, 11 February 2010 (UTC)

TMDS
Transition Minimized Differential Signaling appears in See also. This article appears in See also over at Transition Minimized Differential Signaling. There is no mention of a connection between S/PDIF and TMDS in the body of either article. What gives? --Kvng (talk) 13:42, 8 September 2010 (UTC)
 * TMDS is used in DVI and HDMI. S/PDIF is similar to TMDS: S/PDIF for audio, TMDS for video. So the connecting bridge is that they are both something like communication protocols for transferring data (audio resp. video) between two places. Why not set them both together in the see also section? mabdul 14:26, 9 September 2010 (UTC)
 * If I remember correctly, S/PDIF uses Manchester encoding not TMDS. Do you have a citation for a connection between S/PDIF and TMDS? --Kvng (talk) 15:12, 9 September 2010 (UTC)
 * I never said that S/PDIF used TMDS. mabdul 17:46, 9 September 2010 (UTC)
 * OK, I have removed the link. --Kvng (talk) 21:12, 9 September 2010 (UTC)
 * Shouldn't Manchester encoding be mentioned then? mabdul 21:36, 9 September 2010 (UTC)
 * Yes I think it should. Go for it. --Kvng (talk) 23:28, 9 September 2010 (UTC)
 * added. Somebody should write something about that topic! Nearly all articles in this segment are really badly referenced, partially badly written, and mostly stubs with too little content. I don't have enough knowledge to expand these articles - I'm also busy expanding web browser related articles. mabdul 23:38, 9 September 2010 (UTC)
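For reference in this thread: the line code commonly described for S/PDIF is biphase mark code (BMC), a close relative of differential Manchester - the level toggles at every bit-cell boundary, with an extra mid-cell toggle for a 1 bit, which keeps the stream self-clocking and polarity-insensitive. A minimal illustrative encoder (function name and representation are my own, not from any specification text):

```python
def bmc_encode(bits, level=0):
    """Biphase-mark encode a bit sequence into half-bit-cell levels.
    The level always toggles at each bit-cell boundary; a 1 bit adds a
    second toggle mid-cell. Returns a list of 0/1 levels, two per bit."""
    out = []
    for bit in bits:
        level ^= 1            # transition at the start of every cell
        out.append(level)
        if bit:
            level ^= 1        # extra mid-cell transition encodes a 1
        out.append(level)
    return out

encoded = bmc_encode([1, 0, 1, 1])   # starting from level 0: [1, 0, 1, 1, 0, 1, 0, 1]
```

Because only transitions carry information, inverting the starting level yields the complementary waveform but the same decoded bits - the polarity-insensitivity that makes the code practical over transformers and optical links.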

Primarily used in Professional Audio Equipment
Not consumer audio equipment, although it has become much more common as many surround sound systems use it to transmit multichannel audio. If there is one thing that is a universal standard in the pro audio field it is S/PDIF. Almost every digital mixer, from the most inexpensive to the most expensive features an S/PDIF connector or two.

Feel free to delete this comment.


 * See AES3 for the AES/EBU professional standard from which S/P-DIF was derived. Maybe prosumer? But $20 computer soundcards and $50 media players often support it! --195.137.93.171 (talk) 20:27, 31 December 2010 (UTC)

TRS connector physical format
My computer, an HP pavilion dv2000z, has two standard 3.5mm TRS headphone jacks. They both function as a normal headphone jack - I assume they are there so two people can watch a movie with headphones at once. The connector on the right is labeled "SPDIF," and, upon inspection with a flashlight, has a little dot at the end that the other one lacks. The article should list this physical format, but I don't know what it is. -kslays (talk) 20:37, 13 December 2007 (UTC)
 * See TOSLINK --195.137.93.171 (talk) 20:35, 31 December 2010 (UTC)

Item 1: Applications
Axeb (talk) 20:20, 5 January 2011 (UTC)

"A common use for the S/PDIF interface is to carry compressed digital audio as defined by the standard IEC 61937. This mode is used to connect the output of a DVD player to a home-theater receiver that supports Dolby Digital or DTS surround sound."

Is it correct to stipulate that the home-theater receiver be surround sound? This statement sounds confusing, because it seems "surround sound" might be referring to both the Dolby Digital and DTS formats, or exclusively to DTS. The most basic DTS specification is a 5.1-channel speaker system, but is this true for Dolby Digital? It is a matter of the specification for S/PDIF, IEC 61937, I suppose. — Preceding unsigned comment added by Axeb (talk • contribs) 20:13, 5 January 2011 (UTC)

Still don't get it
This really needs a plain-English summary at the beginning. I'm an engineering student and this still seems too technical. So I agree with Bovineone.


 * "S/PDIF plugs into your sound system. Stuff comes out the other end." Simple enough yet? —Preceding unsigned comment added by 70.27.232.22 (talk) 21:18, 10 March 2011 (UTC)

Manchester Encoding Irrelevant
Manchester encoding has almost nothing to do with this protocol, apart from also being an encoding scheme. It is mainly used as a data carrier, as far as I know. I know most remotes use it, but it does not have much to do with this. Also, Manchester encoding is not mentioned anywhere in this article. I know I'm not the best at giving arguments and reasons in English. Ivaneduardo747 (talk) 03:12, 11 March 2011 (UTC)

too technical
This article starts out by describing too many technical details, as indicated by this question on the March 20, 2006 science reference desk. Adding more text to the lead section in simpler language may improve this. -- Bovineone 00:04, 21 March 2006 (UTC)


 * 5 years have passed and I believe the same. Ivaneduardo747 (talk) 03:07, 11 March 2011 (UTC)


 * Sometimes it takes a while. I have written a new lead paragraph. Let me know if it helps. --Kvng (talk) 03:05, 13 March 2011 (UTC)

How many channels can be sent over S/PDIF link?
Why is there no mention of the number of discrete audio channels that can be conveyed over an S/PDIF link? —Preceding unsigned comment added by 64.231.86.190 (talk) 13:27, 19 May 2011 (UTC)

Done. I also made some changes to AES3 to clarify there. --Kvng (talk) 14:23, 21 May 2011 (UTC)

Protocol section needs clarification
"There is one channel status bit in each subframe, making 192 bits in each audio block."

Since subframes are not described or even mentioned anywhere in the article, this sentence doesn't provide any useful information. Why does one channel status bit make 192 bits in each block?

"This means that there are 192/8 = 24 bytes available in each audio block."

What? Why does it mean this? Where does the 8 come from? Brandon.irwin (talk) 12:17, 9 January 2010 (UTC)


 * There are 8 bits in each byte; 192 bits divided by 8 = 24 bytes. — Preceding unsigned comment added by 60.240.163.190 (talk) 23:05, 24 December 2011 (UTC)

We know this, but why would a reader know this? What's the point in showing the arithmetic without explanation? If you know what the 8 is for, you don't need to see the math behind a bits-to-bytes conversion. Brandon.irwin (talk) 23:21, 15 June 2012 (UTC)


 * The protocol specification is intentionally incomplete because the section explicitly refers to the AES3 article. The editors assume readers will click on the link provided for the necessary context. --Kvng (talk) 14:17, 18 June 2012 (UTC)
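The arithmetic being questioned above fits together as follows: an audio block is 192 frames, each frame contains one subframe per channel, and each subframe carries one channel-status bit, so each channel accumulates 192 status bits per block, i.e. 24 bytes. A sketch of that bookkeeping (constant names are my own, for illustration):

```python
FRAMES_PER_BLOCK = 192       # frames in one audio block
SUBFRAMES_PER_FRAME = 2      # one subframe per channel (left, right)
STATUS_BITS_PER_SUBFRAME = 1
BITS_PER_BYTE = 8

# Each channel contributes one status bit per frame, so per channel:
status_bits_per_channel = FRAMES_PER_BLOCK * STATUS_BITS_PER_SUBFRAME  # 192
status_bytes_per_channel = status_bits_per_channel // BITS_PER_BYTE    # 24
```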

False!
the reason S/PDIF doesn't support lossless formats ISN'T because they "require greater bandwidth". Optical has more than enough bandwidth, 125 Mbit/s... a 2x Blu-ray will only read at 72 Mbit/s... The problem is flow control, among other things. Lostubes (talk) 06:09, 26 September 2015 (UTC)
 * I tagged this claim in the article as "dubious" until this can be sorted out (hopefully with citations to reliable sources). It would be helpful if the article explained how much bandwidth is used when run as intended.  The "Protocol specifications" section says it doesn't have a defined bitrate, but there must be a de-facto rate (or rates, given there are multiple audio formats carried) for real-time audio reproduction. -- Beland (talk) 19:51, 19 May 2016 (UTC)
 * I have added a reference. It is possible that the electrical/optical interface could handle the additional bandwidth. The problem is that the standard does not define these higher bitrates. If a transmitter were rigged to send the required data, no standard receiver would know how to receive it. ~Kvng (talk) 17:41, 12 September 2016 (UTC)
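On Beland's question about the de-facto rate: each frame carries two 32-time-slot subframes (64 bits), and biphase-mark coding puts two line symbols on the wire per time slot, so the nominal rates scale directly with the sample rate. A sketch of that arithmetic (function name is my own):

```python
def spdif_rates(sample_rate):
    """Nominal stereo S/PDIF rates at a given sample rate.
    Each frame carries 2 subframes of 32 time slots, and biphase-mark
    coding uses two line symbols per time slot."""
    bits_per_frame = 2 * 32                   # 64 time slots per frame
    data_rate = sample_rate * bits_per_frame  # bit/s before line coding
    baud_rate = 2 * data_rate                 # line symbols per second
    return data_rate, baud_rate

cd_data, cd_baud = spdif_rates(44_100)    # 2.8224 Mbit/s, 5.6448 Mbaud
dvd_data, dvd_baud = spdif_rates(48_000)  # 3.072 Mbit/s, 6.144 Mbaud
```

These single-digit-Mbit/s figures are far below the ~125 Mbit/s optical capacity claimed above, which supports the view that the limit on lossless HD formats is the defined signalling rates, not the raw medium.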

HDMI info under bandwidth section.
I'm curious as to how these items are related to the topic of the article. HDMI does not use S/PDIF in any way I can see, and it is certainly not referenced in the aforementioned section on bandwidth, which also doesn't actually list S/PDIF bandwidths at all. If I didn't know the subject matter already, the only takeaway I'd have about S/PDIF from the bandwidth section of this article is that HDMI looks like a pretty capable interface, and that's it. 2600:6C4E:1400:3D97:50F1:EC35:8963:C9A2 (talk) 16:21, 19 July 2018 (UTC)dude


 * I don't see how this belongs here either. I have reverted. Please discuss here before restoring any of this. ~Kvng (talk) 13:50, 20 July 2018 (UTC)