Wikipedia:Reference desk/Archives/Computing/2013 June 29

= June 29 =

Distinguish USB 1/2/(3) cords?
How do you distinguish USB 1 and USB 2 cords? USB 3 has a blue marker in the plug which makes it easy to recognize. Is there a similarly clear visible difference when it comes to USB 1 & 2? I have plenty of 1 & 2 cords and it would be a real effort to check each of them one by one by connecting them and seeing what the system tells me. Is there any shortcut to this? Thanks, TMCk (talk) 01:36, 29 June 2013 (UTC)
 * As far as I know, USB 1 and 2 use the same cables. RudolfRed (talk) 01:45, 29 June 2013 (UTC)


 * Actually no. There can be a difference in quality and shielding which makes your USB 2 device run slowly if connected with a cable built to the old standard. I also found a page that shows how to distinguish (most) cords here. Thanks anyway, TMCk (talk) 17:28, 29 June 2013 (UTC)
 * Actually, that document appears to present a bunch of misleading or inaccurate claims, as is unfortunately common with unofficial documents claiming differences in cabling required by standards (e.g. you get the same nonsense about HDMI; these sources mention it and say why it's generally crap).
 * It suggests shielding was not required by USB 1.1; in fact, various sources suggest it was only not required under USB 1.0 (e.g., and if you ignore the insults and crap, this discussion from Usenet). It suggests 3 m was the limit for USB 1.1; in fact, our article says that's only the limit for low-speed devices, and full-speed devices supported 5 m (this isn't sourced, but I'm pretty sure I heard of the 5 m length limit before USB 2 existed). The document also makes a big deal about its cable construction including the use of twisted pair for the data lines, but our article says this has always been required.
 * Perhaps most relevant to your question, it suggests the cable should have a plus; in fact, this document from the USB forum says the symbol is only intended for ports, hubs, etc., and says it's not recommended for cables (edit: because unique cables are unneeded). (None of my cables have the plus, even ones for USB 2.0 HDs etc.) Edit: Forgot to mention, while I do not know if a recommendation that the symbol isn't needed for cables has always been given in documents and info from the USB forum to implementers, I strongly suspect the plus was never recommended for cables even if this wasn't explicit.
 * The only part of that document which may be mostly true is that USB 1.1 did not have any specs for the higher frequencies (edit: forgot to mention I didn't look for sources for this; I'm just assuming it's probably the case, as it's fairly common with this sort of thing). You will note that the requirements for the lower frequencies did not change. My understanding of cable design, or for that matter the science behind transmission, is limited, but I believe it's possible a cable could comply with the lower-frequency requirements but not the higher ones; given the specs and differences, though, I suspect it's not that likely. (Edit: To be clear, this would likely depend on the various parts of the spec, which I haven't read and obviously don't know enough about to comment on. For example, it could be that the lower-frequency attenuation requirements are completely irrelevant because the higher-frequency ones will almost certainly mean lower attenuation at the lower frequencies. However, it's fairly common in these sorts of things that the tolerances the spec requires mean that any compliant cable, particularly if it isn't at the extreme edges, for example a 5 m cable, will work even if it's closer to the borderline than would be ideal. This seems to be borne out by the various stuff coming from the USB forum and elsewhere.)
 * To be clear, I'm not suggesting that the cable cannot make a difference; rather, the most likely case is that a cable which does make a difference isn't USB 1.1 compliant either (in other words, the number of cables which are USB 1.1 compliant but cause problems for USB 2 is small), and there are undoubtedly still some USB 2 cables which do not comply and may cause problems (although that's probably less likely, both because the price difference is small and because it's more likely to be noticed). This discussion quoting stuff coming from the USB forum seems to agree. And perhaps the key point: unless you have the equipment to test the cable (and perhaps even enough of the same kind that you can afford to cut some open) and the ability to understand the spec, you can't check compliance with either spec. Going by the plus definitely seems a rather flawed idea, if that was your intention.
 * Note that this is quite distinct from USB 3.0, which requires extra wires, so a USB 2.0 cable will never be capable of providing SuperSpeed support.
 * Nil Einne (talk) 07:22, 30 June 2013 (UTC)


 * Yes, the link I gave is not a definitive source and has its flaws, even more than I thought in the first place. After reading your reply and digging further myself, I found out that even with USB 3 cords there can be trouble running a USB 3 device. It seems to depend somehow on the quality and length of the cable and also on the system's motherboard. Information online (especially in forums) is quite conflicting [that's why I posted here in the first place]. Summarizing your response and my own research/experience as a layman: there is no clear answer, since even if one knows what cord they're using, the result can vary anyway. From now on, I myself will go with cords made for USB 3 (even though I don't have it on my current main system) and thus get at least my USB 2 devices running at maximum speed. Some old USB 1 devices I still have and run once in a while are therefore of no concern anymore, as far as distinguishing between USB 1 & 2 goes. Thanks for your input, which helped me make up my mind in making a decision for the/my future. TMCk (talk) 04:23, 2 July 2013 (UTC)

29 to 30 inch monitor for gaming
Any suggestions? My budget would be around $850, but I wouldn't mind bang for the buck. — Preceding unsigned comment added by 184.151.114.192 (talk) 07:02, 29 June 2013 (UTC)


 * You might want to consider a dual monitor system instead. With a single monitor that big, the corners will be significantly farther from your eyes than the center, making the picture seem distorted.  (This assumes you are sitting close to it, so you can read small text.)  With a dual monitor system, or even 3 monitors, you can arrange them around you, to approximate a curved shape, and thus prevent this distortion.  Many games and gaming PCs have support for multiple monitors. StuRat (talk) 07:33, 29 June 2013 (UTC)


 * As far as the price side goes, you may wish to check out sites like NewEgg.com and TigerDirect.com for bargain-basement pricing; they also have free-shipping options and are pretty good at standing behind their products and selling quality goods, new or refurbished.  Market St.⧏  ⧐ Diamond Way  08:01, 29 June 2013 (UTC)

I don't really have space for dual monitors in my super-ergonomic college-student-living-with-parents room (http://i.imgur.com/XJx8NVN.jpg), and I'm kinda wary because not many games support dual monitors. Those sites are probably where I'm going to buy it; the new Quebec electronic recycling tax is like $50 for a frigging $700 monitor. Eisenikov (talk) 10:58, 29 June 2013 (UTC)


 * Not a computer question nor answer here, but I hear New Hampshire has no sales tax, so depending on where in Quebec you are, a FedEx or UPS store in NH may be much more cost-effective for shipping. Please update us on what you decide; I am curious what you find to be the best model, brand, product and website.    Market St.⧏  ⧐ Diamond Way   05:30, 30 June 2013 (UTC)


 * How about an HD TV with a VGA/HDMI input port? Plenty of 32" models are available for around US$400 with 1920x1080 pixel resolution (there seems to be a gap in the market between 27 and 32 inches, though).  Astronaut (talk) 19:50, 1 July 2013 (UTC)


 * That's a good idea. Something like this (though the particular one I linked to is out of stock, there are many other options listed on that page, for a lot less than your original budget). --Yellow1996 (talk) 01:11, 2 July 2013 (UTC)
 * With such limited space, watch out for viewing distance. If you sit too close to a larger screen, you might notice pixellation, especially with older-generation models. Also, with gaming, ghosting might become an issue; this is where the refresh rate matters. Current-generation monitors have not only higher refresh rates than older TVs but also technology that can help reduce ghosting. You could also consider a plasma TV, which has the advantages of better refresh rates as well as true black, compared to LCD/LED, which is brighter and sharper. However, there are pros and cons, and the decision is yours... good luck! Sandman30s (talk) 13:37, 3 July 2013 (UTC)

Video colorization
Specifically, in this Ana Trijox music video here: I know the moving-text feature is an online software program, but I am wondering about the color-ball things that seem to travel throughout the video and cast shadows, change the shades of different objects, etc. Is that a publicly available web software app/program? Thanks in advance.  Market St.⧏  ⧐ Diamond Way  07:59, 29 June 2013 (UTC)
 * The video looks like it makes heavy use of blend modes, which are a device for combining images in fancy ways. I'm not familiar with their use in video editors, but they are commonly used to create fancy still images with programs like Adobe Photoshop or GIMP.  Modes such as "overlay", "soft light", and "hard light" give effects resembling those. Looie496 (talk) 13:53, 29 June 2013 (UTC)
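For anyone curious what those modes actually do to the pixels, here is a rough sketch of two of the formulas Looie names, applied per colour channel with values normalized to 0..1. Note the soft-light variant below is the so-called "pegtop" formula; Photoshop's own soft-light math differs slightly, so treat this as illustrative rather than any editor's exact implementation.

```javascript
// Two common blend-mode formulas, per colour channel, values in 0..1.
// "base" is the bottom layer, "blend" is the layer set to the mode.
function overlay(base, blend) {
  // Darkens dark regions, lightens light ones.
  return base < 0.5
    ? 2 * base * blend
    : 1 - 2 * (1 - base) * (1 - blend);
}

function softLight(base, blend) {
  // "Pegtop" soft-light variant: a gentler version of overlay.
  return (1 - 2 * blend) * base * base + 2 * blend * base;
}

// A 50% grey blend layer leaves the base unchanged in both modes.
console.log(overlay(0.25, 0.5), softLight(0.25, 0.5));
```

This is why a layer of coloured blobs set to soft light shifts the shades of whatever is underneath rather than covering it, which matches the effect in the video.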
 * Thanks so much Looie496, I will check out your recommendations, also if any other editors have insights I am all ears ;-).  Market St.⧏  ⧐ Diamond Way  05:27, 30 June 2013 (UTC)


 * A little bit redundant on my part, but those are definitely - as Looie points out - blend modes, particularly soft light. I recall working with Photoshop a few years back and doing similar effects; though if you don't have PS already, I would recommend GIMP since it's free! :) --Yellow1996 (talk) 17:33, 30 June 2013 (UTC)

iPhone/iPad volume side design decision
Was it a conscious decision to have the volume on different sides between the iPhones and iPads? When held vertically with the home button at the bottom and the screen facing the user, for the phones the volume is on the user's left side, but for the tablets the volume is on the right side. I'm interested in documented statements by people actually involved in the design process. 75.75.42.89 (talk) 18:07, 29 June 2013 (UTC)
 * I can't tell you about documents (I guess you'd need to consult a patent office for them), but I'm pretty sure the reason behind this would be functionality for right-handed people. With an iPhone, you hold it in your right hand and your fingers go round the back until the volume buttons are directly under your index and middle fingers or so. An iPad is too big for that, so your right hand will always be on the right-hand side of the iPad, and thus the volume key ends up under your right index finger there instead. Falastur2  Talk 19:37, 29 June 2013 (UTC)
 * I see now. Thanks. As a left-hander, I've always used my thumb on the phone which is why I noticed the difference. 75.75.42.89 (talk) 20:12, 29 June 2013 (UTC)

How left-handers do suffer!! 85.211.134.120 (talk) 08:15, 30 June 2013 (UTC)
 * Yeah, especially for writing! I'm dominantly left handed so I hold the phone in my left hand and put it up to my left ear; though I'm quite ambidextrous so I could do either... --Yellow1996 (talk) 17:39, 30 June 2013 (UTC)
 * While I agree that it's likely due to right-handed dominance, I'm right-handed and always hold my phone with my left hand. I don't know why, but I've thought it might be a subconscious choice in order to have my right hand free to do more complex tasks. It might be worth looking into which hand right-handed people hold their phones with. (Or I could just be a freak.) This leaves my thumb to do the volume adjusting on the rocker switch. Note: I have a Samsung cell phone and my wife has an HTC, both of which have their volume adjustments on the left side of the phone when looking at it the same way as in the OP's example.  Dismas |(talk) 09:11, 30 June 2013 (UTC)
 * Nope. Turns out I'm a bit of a freak. According to a study, using your right hand to hold your cell phone to your right ear is dominant, and it indicates which side of the brain your speech center is on. There are a number of news articles about this, but they all seem to lead back to this study.  Dismas |(talk) 09:22, 30 June 2013 (UTC)

Geolocating
If someone is using a handheld device to edit Wikipedia and you trace the IP address, would the trace give you the device or the internet provider? I'm curious because, when I'm reviewing articles, I sometimes like to trace the location of recurring unregistered editors who are being a nuisance and, lately, many of those traces have been leading to Comcast Cable locations on the East Coast. WQUlrich (talk) 22:53, 29 June 2013 (UTC)


 * Most likely just the provider and city, but probably not the device. StuRat (talk) 04:11, 30 June 2013 (UTC)


 * The internet service provider has access to the device location, but that information is not available to the geolocate services. Looie496 (talk) 05:00, 30 June 2013 (UTC)


 * If they were really causing trouble though you could call the provider and request the information; whether you'd get it or not is another story! --Yellow1996 (talk) 17:41, 30 June 2013 (UTC)


 * I've tried geolocating myself when I'm on a 3G connection - the closest I get is the mobile phone company's proxy server (about 1000km from my actual location). Roger (Dodger67) (talk) 07:37, 1 July 2013 (UTC)


 * Try this geolocation demo or similar ones; search the web for something like "html5 geolocation demo". Depending on your browser and hardware, you might get as close as a few meters (GPS accuracy) - or not; there are lots of variables in how it is done. This is not done using IP addresses, so it is not applicable to the original question above. 88.112.41.6 (talk) 19:59, 1 July 2013 (UTC)
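For reference, demos like that typically call the HTML5 Geolocation API, along the lines of the sketch below. The formatting helper is my own illustrative addition, and navigator.geolocation only exists in browsers (the user must also grant permission), so this is a sketch of the pattern, not any particular demo's code.

```javascript
// Minimal sketch of an HTML5 geolocation lookup (browser-only API).
function formatPosition(lat, lon, accuracy) {
  return `lat ${lat.toFixed(5)}, lon ${lon.toFixed(5)} (±${accuracy} m)`;
}

if (typeof navigator !== "undefined" && navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    (pos) => {
      // Accuracy can be a few metres (GPS) or many kilometres
      // (IP or Wi-Fi based lookup), which explains the variation above.
      console.log(formatPosition(pos.coords.latitude,
                                 pos.coords.longitude,
                                 pos.coords.accuracy));
    },
    (err) => console.error("Geolocation failed:", err.message)
  );
}
```

The reported accuracy field is the key difference from IP geolocation: the browser tells you how trustworthy the fix is, whereas an IP trace just points at the provider.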