Wikipedia:Reference desk/Archives/Computing/2016 March 28

= March 28 =

TFT Defective pixel fairy tales
Articles, ISO norms and shop descriptions all more or less claim it's impossible to intentionally produce error-free LCD substrates. But customers never actually "bought" that narrative: when they buy a new TFT monitor online and receive one with defective pixels, most send it back, no matter that the technical descriptions clearly imply most panels are Class 2, which means that for a "4K"-resolution model you in fact have to expect many defective pixels. So I assume most TFTs sold are in fact error-free, because shops and producers know they won't get away with their definitions, their ISO norm, and least of all with the Class 2 category under which they sell most monitors. So why are they holding on to these fairy-tale norms and the silly narrative that what they in fact do every day is impossible? Who makes up such silly norms? How old are they, and how much further from reality might they be? --Kharon (talk) 14:11, 28 March 2016 (UTC)


 * ...Maybe you could attend an industry trade show, like Display Week 2016 in May in San Francisco, and direct your questions to the representatives of major manufacturers and resellers?
 * Technology manufacturers rarely like to give accurate information about manufacturing yield unless there is a good reason to brag about it. You might also like to learn about root-causes for defects in display hardware: we have an article on defective pixels that might broaden your understanding.
 * Nimur (talk) 15:39, 28 March 2016 (UTC)
 * The yield bit is also related to quality assurance and quality control. Again, things that are rarely advertised unless they think it's worth bragging about. SemanticMantis (talk) 21:19, 28 March 2016 (UTC)


 * I've seen this pattern of disclaimers telling customers to have low expectations before. One that comes to mind is cable TV contracts, which say the company doesn't have to fix an outage unless 6 or more houses in an area suffer from it.  If they actually refused to fix outages when 5 or fewer houses were affected, they would obviously lose those customers, but they feel the need to include that disclaimer anyway.  I suspect that in rare cases both might do what they threaten, though.  For example, if the manufacturer or cable company were facing bankruptcy, they might then do only the minimum replacement or maintenance required by law.  (Another silly disclaimer is software that says "We are not responsible if our software destroys your computer".  I certainly hope that isn't actually a possibility.)


 * As for the bad pixels, how those appear seems very important. If it's just dark, that isn't as bad as being some random color.  Also, if there is bleed-over from adjacent pixels, that could mask bad pixels, although it would also make the screen a bit blurrier. StuRat (talk) 15:53, 28 March 2016 (UTC)


 * I suspect you're overestimating the number of customers who'd notice, or bother to send back, monitors with a few defective pixels. And that's only referring to end-user purchasers, not counting the majority of displays which likely end up in offices etc. Nil Einne (talk) 16:14, 28 March 2016 (UTC)


 * Actually I read that Dell and HP, i.e. the "office" manufacturers, have a "zero dead pixel" policy or offer one as an option.
 * Anyway, manufacturers always try, and manage, to improve production quality over time, minimizing scrap. I know for sure that today's industry will fire any manager who isn't constantly trying to cut preventable loss. Also, now that they do 4K screens at 24", I assume production technology is long past the state it was in when these clearly outdated ISO norms were written. --Kharon (talk) 21:13, 28 March 2016 (UTC)


 * One approach that might make sense from an economics POV is to continue to produce them with the occasional dead pixel, if that is cheaper than upgrading the manufacturing process where they can completely eliminate them. But, rather than sell them all to unsuspecting customers, they might inspect them and only sell the good ones.  Those with dead pixels, on the other hand, could be donated or sold at a reduced price, with full disclosure that they do have dead pixels.  (Selling them all at full price and relying on customers to return those with dead pixels would reduce the inspection costs, but increase shipping costs and customer dissatisfaction and therefore bad ratings.) StuRat (talk) 23:06, 28 March 2016 (UTC)
 * There's obviously a big difference between a normal policy and an optional one. If it's optional, how much does it cost, and what percentage of office computer users actually take it up? You've provided no statistics to suggest defective pixels are as unacceptable to the ordinary consumer as you implied, which is my key point, and your follow-up doesn't give much more. Also, "I read", but where? Here's an example of a Dell pixel policy . Here's an HP one . Neither specifies zero pixel defects (although HP allows no full-pixel defects). Of course, large purchasers may negotiate their own policies depending on their demands, and we won't see those. Just to be clear, I don't know that anyone in this discussion is disagreeing that production improvements probably mean pixel defects are rarer than they used to be, simply that they are unlikely to be zero and that you've provided little evidence for your claim that no one accepts a monitor with defective pixels. P.S. Dell and HP probably have less tolerance for dead pixels than some less well-known manufacturers. Notably, those cheap Korean monitors that are all the rage tend to use the lower-quality panels (although the problems may not just be pixel defects). And standards in developing countries may vary from developed ones, particularly with more mid-tier manufacturers. Note also that you said "send that Monitor back", yet in a number of places this won't actually achieve anything but waste your money. They won't be refunding you or giving you a different monitor; instead they'll be asking you to pay them to get the monitor back. Or, more likely, they'll simply reject the shipment since you won't have an RMA number. Actually, most likely you won't be sending it back at all but rather taking it back yourself, and they'll argue with you a bit, and either you'll leave or they'll ask you to leave.  Nil Einne (talk) 15:12, 29 March 2016 (UTC)


 * TESTED.COM - We Uncover the Dead Pixel Policies for Every Major LCD Maker. The Quixotic Potato (talk) 12:25, 29 March 2016 (UTC)


 * Good article, but note that it's over 5 years old. StuRat (talk) 13:14, 29 March 2016 (UTC)
 * ISO13406 & Dell & Acer & HP & ASUS & Samsung (in Dutch, I can't find the English version). Apologies to the Apple users; it seems like Apple does not share this info, although an internal guideline was leaked in 2010. The Quixotic Potato (talk) 13:45, 29 March 2016 (UTC)
 * I live in the EU, in Germany, and here in the EU customers have a 14-day right of revocation on online purchases of standard products. I do know that's not a customer's right everywhere, so sorry, I forgot to mention this when I pointed out that most customers will send back displays with defective pixels. Of course this only works where customers have a law to back them up.
 * Anyway, these pixels are huge structures in digital hardware, and even the roughly 25 million transistors (about 8.3 million 4K pixels × 3 subpixels) on a modern $500 display fail to impress compared with the 1.4 billion transistors in an Intel Pentium G3258 for $65. And now don't tell me CPUs are cheaper and/or easier to produce. --Kharon (talk) 02:43, 30 March 2016 (UTC)
 * That still presumes 1) customers purchase the product online and 2) they actually notice, or take the time to look for, defects in the first 14 days. For example, Slovenia and Croatia are both part of the EU, but consider these stats . Production techniques for LCDs are quite different from those for CPUs. I don't think $500 is an accurate figure for the cost (based on wholesale LCD panel prices), but the prices do strongly suggest a panel is more expensive to produce, although it possibly requires less investment in the plants and maybe in R&D. One point you're perhaps missing is that size matters. If you're able to get 1450 chips from your 300mm wafer, that's quite different from when this wafer is smaller than even a single panel, even if you have to work with far smaller structures in the former.  Another thing you're perhaps forgetting is that while the structures are a lot smaller, IC yields from wafers aren't 100%. Your IC manufacturer throws away, clocks down, or otherwise salvages the defective ICs from any wafer. Your LCD panel, which is bigger than this wafer (since you seem to be talking about large panels, not phone ones), needs to be free from any significant defects to meet your requirements. While the structures may be a lot smaller, there's no reason to assume even an IC manufacturer would have perfect yields if, for example, they were still doing 180 nanometres. Actually, as our article mentions, some are still doing so; their yields are probably higher, but it seems unlikely they're 100%. One of the reasons is likely that it's not necessarily cost-effective to chase the last remaining yield increases. Particularly since the reason for staying at 180 nm is probably the lower investment cost, it's possible or even probable that effective yield from a wafer is lower because the dies are so much larger.  A final thing you're perhaps forgetting is that the same applies to panel manufacturers.
Improving yields is helpful, but just as IC manufacturers chase smaller process nodes for the ability to produce more, faster, less power-hungry chips, even if this means initially reduced yields, rather than always continuing to refine their existing nodes for higher yields, panel manufacturers may make changes to reduce cost, improve quality, etc., even if these reduce yields.  Anyway, if these processes were so interchangeable, or CPU fabrication nodes so superior, companies like Taiwan Semiconductor Manufacturing Company, United Microelectronics Corporation and GlobalFoundries would be destroying the likes of LG Display and Samsung Electronics. (Samsung has its own high-end IC fabrication anyway, so basically you're saying they're stupid and should just throw away their panel division and use their IC fabrication division instead.) Plus Intel wouldn't have to be so worried about the future: with the lead of their semiconductor fabrication plants, they would have a new area to compete in without having to worry so much about the rise of ARM CPUs. But none of this is happening, because these are fairly different processes and techniques, even if there is some crossover. (And Foxconn did buy Sharp, apparently significantly because of its panel business, even though according to your idea it would be better for them to buy Semiconductor Manufacturing International Corporation, or even something like Powerchip Semiconductor or Vanguard International Semiconductor Corporation.)  For these reasons, talking about which one is easier is complicated and doesn't seem useful or helpful.  Nil Einne (talk) 19:23, 30 March 2016 (UTC)
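The size argument above can be sketched numerically with the textbook Poisson yield model, yield = e^(−D·A) for defect density D and device area A. The defect density and areas below are purely illustrative assumptions, not real fab figures; the point is only how brutally defect-free yield falls as area grows:

```python
import math

def defect_free_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Classic Poisson yield model: probability that a device of the
    given area contains zero random defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

# Illustrative (not real) defect density: 0.1 defects per cm^2.
D = 0.1

# A smallish CPU die, ~1 cm^2.
die_yield = defect_free_yield(1.0, D)

# A 24-inch 16:9 panel is roughly 53 cm x 30 cm, ~1600 cm^2.
panel_yield = defect_free_yield(1600.0, D)

print(f"die yield:   {die_yield:.2%}")    # ~90% of dies are defect-free
print(f"panel yield: {panel_yield:.2e}")  # essentially no panel is defect-free
```

At the same (assumed) defect density, about nine in ten small dies survive while a defect-free large panel is astronomically unlikely, which is why panel makers tolerate a few defects rather than discard almost everything.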
 * But panels are sorted too. Companies like Apple or Samsung pick the first-class samples, and what is left you can buy from "budget" distributors like Qnix, X-Star, Wasabi Mango and the like. See . Of course the yield of a production line is never 100%, especially not on state-of-the-art products, but advances in panel technology are, like advances in CPU technology, obvious reality. In display technology they are even starting big with OLED now, while in CPU technology they still only seem to "refine the shrinking". My point is this latest ISO 9241-307 "defective pixel classification". It's from 2008. That is a nearly decade-old definition for computer hardware! --Kharon (talk) 09:25, 31 March 2016 (UTC)
 * Your first four sentences just repeat what I said. None of what you said addresses my main point, which is that your suggestion that consumers never accept displays with defective pixels is at best unproven and in reality fairly unlikely. Nor does it address any of my other points, like the flaws in your original comparison between CPUs and panels. Weirdly enough, you now seem to be partly acknowledging my point with your mention of "advances in panel technology are like advances in CPU technology" and OLED, although I still don't see that it makes much sense to argue over the comparative difference between the advances in each field. Suffice it to say both have had their own significant advances, whether or not they are necessarily the advances you feel were best. As for the ISO issue, it's the only point you raised that I didn't already address, as it was irrelevant to my points and didn't seem really interesting. Suffice it to say international standards can be slow to adapt. It may very well be that, with technology where it is now and modern ultra-high-definition displays having several million pixels, it makes sense to have something between Class 0 and Class 1. Note, of course, that if yields have really reached the levels you suggest, i.e. anything other than Class 0 is almost unheard of, then there's little point bothering to make a new class between 0 and 1. So there's another contradiction in what you're saying. (The presence of the other classes in the ISO standard doesn't mean they have to exist now; there's little point removing them from the standard just because they're no longer used, unless it's been a long time and you're updating the standard anyway.) On the flip side, perhaps you're missing the point of the standard. It's titled "Ergonomics of Human System Interaction". In other words, it's not directly targeted at quality, yields, or preferences unrelated to ergonomics.
It may be that those involved in the ISO have decided that, even with a high-PPI display or a display otherwise having many millions of pixels, the difference between the ergonomics of Class 0 and Class 1 is small enough that something in between is unneeded. Likewise, of course, between Class 1 and 2. (And, getting back to my earlier point, even if a class between Class 0 and Class 1, or 1 and 2, does make a meaningful difference to ergonomics, if nearly all displays produced now are Class 0, or nearly all are Class 0 and 1 and it's only between Class 1 and 2 that there's a difference, it may not be useful to add an intermediate class.)  In case you don't understand: although a Class 1 UHD monitor may have quite a few defective pixels, this doesn't mean that an intermediate class with e.g. half the number of defective pixels is actually going to be significantly different from an ergonomics standpoint, which is the only issue that seems relevant to the standard. (You obviously could do other things, like only allowing subpixel defects, but these may also not make a significant difference.) In fact, the smaller size of the pixels may very well mean the same ratio of defects to pixels in a UHD panel is generally less noticeable, or at worst equally noticeable, compared with an HD or lower-PPI panel. (Admittedly, I don't see that the standard disallows defective pixels being next to each other; however, I've only read our article's summary, not the actual standard. Note, of course, that if your pixels are smaller, the actual size of a defect is likely to end up similar, though the higher PPI may or may not make it more noticeable.)  If you are using the standard for anything other than ergonomics considerations, it may not be surprising if it doesn't work for you.  Nil Einne (talk) 11:58, 2 April 2016 (UTC)
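To make the class discussion above concrete, here is a small sketch scaling the per-megapixel defect limits commonly quoted for the ISO 13406-2 / ISO 9241-307 fault classes to a given resolution. The limit table and the truncation-to-whole-pixels rounding are assumptions based on commonly published summaries, not a reading of the standard itself:

```python
# Maximum allowed defects per million pixels, as commonly quoted for the
# ISO 13406-2 / ISO 9241-307 pixel fault classes:
#   type 1 = always lit, type 2 = always dark, type 3 = defective subpixel.
CLASS_LIMITS = {
    "I":   (1, 1, 2),
    "II":  (2, 2, 5),
    "III": (5, 15, 50),
    "IV":  (50, 150, 500),
}

def allowed_defects(width: int, height: int, fault_class: str) -> tuple:
    """Scale the per-megapixel limits to a given resolution.
    Truncating to whole defects is an assumption; the standard's
    exact rounding rule may differ."""
    megapixels = width * height / 1_000_000
    return tuple(int(limit * megapixels) for limit in CLASS_LIMITS[fault_class])

# A Class II UHD ("4K") panel, 3840 x 2160 = ~8.3 megapixels:
print(allowed_defects(3840, 2160, "II"))  # (16, 16, 41)
```

Under these assumed limits, a Class II UHD panel could ship with around 16 always-lit pixels, 16 dead pixels and 41 stuck subpixels, which is the sense in which a "4K" Class 2 model can, on paper, have "lots of defective pixels" as the original question put it.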
 * Thank you all very much for your responses! The fact that this ISO is "overaged" was just a side note from me. I was questioning the claim of poor yield that the industry implies it struggles with! Interestingly, the majority of displays (almost all) are actually sold as Class 2, which seems poor. So you seem a bit further off focus (besides misunderstanding me) with your discussion around Class 0 and 1! --Kharon (talk) 19:57, 2 April 2016 (UTC)