Wikipedia:Reference desk/Archives/Computing/2011 April 3

= April 3 =

SeaMicro SM10000-64 System
Hello RefDeskers! The SeaMicro SM10000-64 Server is stated to comprise 256 dual-core processors, with 4 processors on each of 64 Compute Cards. Does this mean that the SM10000-64 is a single enormous 256-processor server, or is it a cluster of 64 quad-socket servers? Thanks as always. Rocketshiporion ♫ 03:03, 3 April 2011 (UTC)
 * Cluster of 64 4-socket servers, and the cpu's are not very powerful. It's for cloud hosting, not a supercomputer.  75.57.242.120 (talk) 08:27, 3 April 2011 (UTC)
 * If I am not mistaken, the SM10000 is a cluster of 256 servers and associated infrastructure in custom packaging. A SeaMicro white paper (SeaMicro Technology Overview) seems to agree. It describes each server as a credit card-sized board that is installed in groups of four into a motherboard. It then describes each credit card-sized board as a node. I doubt the Atom processor which the SM10000 uses has support for inter-socket multiprocessing since it is a low-end processor. Rilak (talk) 08:56, 3 April 2011 (UTC)
 * Awww... No supercomputing on the SM10000-64. Darn! And I thought I'd found a supremely powerful supercomputing platform. Rocketshiporion ♫ 05:15, 4 April 2011 (UTC)

W3C standards as a source for browsers, or vice-versa?
I've heard it claimed that new feature additions to the W3C standards for HTML, XHTML, CSS, DOM and JavaScript are usually made to comply with, or arbitrate a compromise between, what the major browsers are already doing, and that it is the browser developers that invent new functionality. Is that right? I thought it was more common for features to be first codified in W3C working drafts and then implemented in browsers. Neon Merlin  03:22, 3 April 2011 (UTC)


 * I don't know much about the W3C specifically, but industry standards bodies are normally made up of representatives from organizations interested in implementing the standards, and implementations are written in parallel with the standards themselves, so that neither one comes first. It's also common for standards to simply document existing practice and introduce nothing new. An ivory-tower "standard" developed without the input of likely implementers would probably just be ignored. -- BenRG (talk) 22:15, 3 April 2011 (UTC)

All the principal browser vendors are part of the W3C. ¦ Reisio (talk) 01:11, 4 April 2011 (UTC)
 * Yes, but in the olden days they all gladly innovated independently, giving us things like the infamous blink tag. Much of the internet has been developed by "rough consensus and working code", and this carried over into browser land. Nowadays, however, the W3C spec is, more or less, leading implementations. --Stephan Schulz (talk) 02:17, 4 April 2011 (UTC)
 * Not really. ¦ Reisio (talk) 21:47, 4 April 2011 (UTC)

DVD world "Zones" - How to increase the allotment?
On my Dell Inspiron 1720, I have only about 4 more times to change the zone/regions. I plan to be some type of a jetsetter in the coming years, so it would be essential to keep changing DVD regions. What can I do to increase the allotment of region changes? Or how can I make it a universal-region DVD? --70.179.169.115 (talk) 04:12, 3 April 2011 (UTC)
 * Unfortunately, there is no 'legal' way of altering the number of times you can change the region code. If you want to try, then by all means try it yourself - I think, under the no-legal-advice rules, we cannot tell people how to circumvent these security measures. General Rommel (talk) 10:48, 3 April 2011 (UTC)
 * General Rommel - perhaps you can quote the law or statute that you have in mind here - any jurisdiction will do. I find it difficult to believe that it is illegal to alter these settings on hardware that you own, with properly licensed software, and for a legitimate reason e.g. you are frequently moving to different countries. It may be technically difficult, it may void certain warranties, it may not even be a sensible solution to the original problem, but how exactly is it illegal? Gandalf61 (talk) 12:33, 3 April 2011 (UTC)
 * I agree with the sentiment, but see Modchip. To not buy multiple copies of a DVD is a circumvention of copy protection, and so is preparing to not buy them. 81.131.41.83 (talk) 13:33, 3 April 2011 (UTC)
 * I was not expressing a sentiment - I was asking General Rommel for a source for their assertions. Modchip says "The nonuniform interpretation of applicable law by the courts and constant profound changes and amendments to copyright law do not allow for a definitive statement on the legality of modchips", so it sits on the fence. Our article on DVD region code lists several ways of bypassing region codes through software or firmware; none of these methods are stated to be illegal. Indeed, the article says that the legal status of region codes and the mechanisms that enforce them is unclear in several jurisdictions, and in New Zealand, for example, they have no legal protection at all. Gandalf61 (talk) 14:46, 3 April 2011 (UTC)
 * (EC) I don't know if 'not buying' is really the issue. I do agree there's a fair chance it may fall foul of the Digital Millennium Copyright Act restrictions on circumventing copy-protection measures (although it may also depend on the reason you need to use a different region). As with many areas of copyright law in the US, I don't think this has been tested, but consider the fact that the US Copyright Office has refused to grant an exemption for removing regional restrictions (consider why the EFF felt it necessary to ask in the first place; the US Copyright Office's view appears to be that the current limited number of changes is fine). This is even more likely to be a problem if you use something, like VLC, which uses DeCSS to get around the restrictions - see DVD-Video - as there is case law strongly suggesting it is not allowed (unless exempted) AFAIK. (Looking a bit more, I think it's only distribution which is accepted as not allowed; possession has not been tested, and as it is common I suspect it never will be.) However, as our article DVD region code notes, the legality of DVD regions themselves is questionable in some countries like Australia and possibly NZ. The OP however appears to be in the US at the current time. P.S. I seem to remember discussing this before, so there may be more sources in the archives. Nil Einne (talk) 15:06, 3 April 2011 (UTC)


 * From VLC media player: VLC is one of the free software and open source DVD players that ignores DVD region coding on RPC-1 firmware drives, making it a region-free player. However, it does not do the same on RPC-2 firmware drives. - might be worth a try. -- 78.43.60.13 (talk) 11:40, 3 April 2011 (UTC)


 * All drives manufactured since 2000 are RPC-2 drives. But I think that VLC also bypasses region coding on RPC-2 drives (i.e., the article is wrong). There certainly are players that do. Also, the various DVD ripping programs work on modern drives. There are custom patched firmwares for some drives that disable the region lockout. There are also some software products (none of them free, as far as I know) that transparently remove the protection for all applications. The most straightforward solution is probably ripping to the hard drive and playing from there. As mentioned above, the legal situation is unclear. -- BenRG (talk) 01:44, 4 April 2011 (UTC)


 * I should have said that that was purely a first-reaction statement; I believe it's just not right to. But if you do want to, a cheap $20 DVD player from an Asian electronics store will usually do the trick. General Rommel (talk) 13:04, 4 April 2011 (UTC)


 * One thing to note, and at the risk of stating the obvious, your PC's DVD drive doesn't know which country you are in.  If your existing Region 1 disks play now, they will still play if you visit Europe, Asia, etc.  You only need to change the drive's region if you want to play DVDs from other regions or you buy new DVDs locally.  For the latter, it might be better to buy a local drive as Gen Rommel suggests, especially if you can get one that is already multi-region capable.  Astronaut (talk) 13:35, 4 April 2011 (UTC)

DOS in laptops
Some models of laptops are coming with what they advertise as "Free DOS". What exactly is this? DOS, as I remember, was an operating system that is now history; it was used when there was no GUI and commands had to be typed as text. What is the meaning of using DOS in today's computers? Jon Ascton   (talk)  14:14, 3 April 2011 (UTC)


 * FreeDOS. -- Finlay McWalter ☻ Talk 14:16, 3 April 2011 (UTC)


 * I've seen a fair few laptops like that in Malaysia, and I'm not surprised if the situation is similar in India. AFAIK it's far less common in NZ and, I would guess, other places like the US, UK etc. FreeDOS is basically selling the laptop without an OS (by actually including an OS they can show something when the customer turns it on, and I guess they can claim they aren't encouraging copyright violations: they did include an OS, and it's not their fault if the customer doesn't like it). I doubt many people use the FreeDOS for long, if at all, because there's not that much you can do with it. They could use Linux, FreeBSD or something else, but that creates additional support issues. The reason for selling without an OS is, I'm pretty sure, to reduce cost (or at least create the illusion of reduced cost; as our article notes, they aren't always cheaper), with the assumption that the customer is going to install their own OS - perhaps one of the aforementioned options, but far more likely some copyright-violating version of Windows (probably Windows 7 Ultimate or perhaps Windows XP Pro) or occasionally Mac OS X - and hence the reason this is common in places like Malaysia and India. Notably, while they could probably add a cheap version of Windows 7 for not much more, they may figure many are just going to change it to Ultimate anyway. I've also seen this tends to be with the cheaper models. Of course not everyone will do this; some may get the shop to install a legitimate version of Windows for them. (In other words, even if the customer is going to pay for the OS, you again create the illusion of lower cost by excluding the OS cost from the upfront price.) I don't know whether the cost will be the same, since Windows OEM licensing confuses me, but labour-time-wise it's probably insignificant compared to many developed countries, where it would likely add a fair amount to the cost. Nil Einne (talk) 14:37, 3 April 2011 (UTC)
 * Nil Einne is spot on. A laptop is sold with FreeDOS so that the seller doesn't need to buy a copy of Windows to put on it.  Most people won't use the FreeDOS: they'll quickly upgrade, typically to something else free (pirated Windows, or some sort of Linux).  If you do plan on installing Windows legitimately, it's almost certainly cheaper to buy a laptop with it installed (if you can), as Microsoft gives discounts on their OEM licenses over retail licenses.  See this helpful guide. Buddy431 (talk) 16:04, 3 April 2011 (UTC)


 * Won't totally removing this DOS from the laptop's hard disk present any problems?
 * Of course it is perfectly possible that the customer already has a license for a version of Windows they are comfortable with, and has no interest in an upgrade. 84.239.160.59 (talk) 17:03, 3 April 2011 (UTC)
 * Removing DOS shouldn't be a problem: when you install your new operating system, you'll reformat the disk, essentially erasing what was on it before. And you are correct, if a person has a valid license from an old computer, or a license that's good for multiple computers (I think Microsoft's retail licenses are typically good for up to three computers at once), they could install that.  Note, however, that most licenses for OSs that come with the software are only valid for the machine they came with (so called OEM licenses).  It is not legal (or at least against Microsoft's terms of use, legality depends on jurisdiction, how good your lawyer is, etc.) to install one of these on a new machine that it did not come with.  Technically, you may be able to do it anyway, depending on the reactivation process, and how well you can lie to Microsoft if you have to reactivate by phone. Buddy431 (talk) 19:23, 3 April 2011 (UTC)

Home Wi-fi networking question
I have several devices in a part of the house where there's no wired Ethernet port. I have a Wi-fi AP in the house but the devices in question don't have built-in Wi-fi support. What would be the best or simplest way to connect these devices to the (W)LAN in the house?

I've seen "Wi-fi extender" products with Ethernet ports, which seem to be a possible solution. Are wireless routers these days also usable as Wi-fi extenders? (I'm asking because, given the choice, I'd rather get a device usable in several ways than a special-purpose one.)

I remember seeing mods for converting a wireless router into the kind of Wi-fi extender I just described, but that was from quite a few years ago. If it's no longer necessary to do any mods, using a device unmodified is preferred.

--108.36.90.190 (talk) 16:57, 3 April 2011 (UTC)
 * If you say that the devices are incapable of Wi-Fi or Ethernet, or any other wireless way of communication, well, that would make it very tricky.... General Rommel (talk) 01:45, 4 April 2011 (UTC)
 * I believe the OP was saying it's the part of the house rather than the device that lacks ethernet. --Sean 16:32, 4 April 2011 (UTC)


 * Some wireless routers are. See Wireless Distribution System.  This page touches on using Apple's AirPort Extreme or AirPort Express routers to form a WDS - the page is short on detail but has links and a very pretty picture.  Of course, may I ask whether you have considered adding wireless capability to these devices, via a PC Card or USB device?  I'm pretty sure that in all cases that'll be more straightforward than trying to configure multiple routers to talk nicely to each other.  Comet Tuttle (talk) 18:08, 4 April 2011 (UTC)


 * I believe you would be looking for a Wi-Fi bridge, one that will take wireless signals and allow a regular ethernet hookup. These devices were popular a few years ago when wireless technologies were just getting a footing, but slowly fell out of favor.  You will find them now labeled as "Gaming Adapters", mostly marketed to connect gaming consoles that have only ethernet ports to Wi-Fi.  For example, there is this one; however, I did find one that looks like it will fit your needs exactly.  We of course do not endorse Amazon or Netgear, but it looks like that second one will fit the bill perfectly.  Let us know how it goes!  --rocketrye12talk/contribs 20:55, 6 April 2011 (UTC)

Limits of lossless data compression
How close is something like 7-Zip to the theoretical limit for lossless data compression? I recall reading something about this in the past - why can't the best possible algorithm for lossless compression be deduced? Thanks 92.15.2.39 (talk) 17:10, 3 April 2011 (UTC)


 * How do you define 'best'? Fastest? Highest compression? Specific to which domain of files to be compressed? All algorithms make trade-offs to better suit the type of file they intend to compress; there cannot be a 'best' algorithm for 'all' possible files. But the theoretical limit for an algorithm suited to one specific file might be 1 bit. Unilynx (talk) 21:26, 3 April 2011 (UTC)


 * Why not 0 bits? That could be used to mean that the file is in some default state, hopefully the most common state for that type of file. StuRat (talk) 23:17, 3 April 2011 (UTC)


 * A large file of apparently random bits might be a bunch of zeros encrypted by some algorithm with some key. In that case the entire file can be compressed down to almost nothing if you can find the key, but is probably incompressible if you can't. This shows that optimal compression is at least as hard as breaking current encryption algorithms. Compression is also related to science, because the shortest representation of data is more or less the same as the simplest theory describing the data in the sense of Occam's razor. So optimal compression is as hard as doing science. It's related to artificial intelligence too, for similar reasons.


 * The only estimate of an optimum compression ratio that comes to mind is Shannon's 1951 paper Prediction and entropy of printed English, where he used the prediction ability of human subjects to estimate an entropy of about 1 bit per letter for written English (about 8:1 compression of ASCII text, in modern terms). The modern state of the art is around 8:1 for 1,000,000,000 bytes of an XML dump of Wikipedia, but this is not directly comparable since it includes a lot of metadata (and punctuation, which I believe Shannon didn't consider). 7-Zip didn't do very well on this test, with a total size about 40% higher than the state of the art (using PPMd). -- BenRG (talk) 02:15, 4 April 2011 (UTC)
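As an aside, the order-0 entropy estimate that Shannon's figure improves on can be computed in a few lines of Python. This is only a sketch: the sample string is made up, and counting single-character frequencies ignores the inter-letter correlations that Shannon's human-prediction method captures, which is why his 1 bit/letter is so much lower than this naive estimate.

```python
from collections import Counter
from math import log2

def entropy_bits_per_char(text):
    """Order-0 Shannon entropy: -sum p(c) * log2 p(c) over character frequencies."""
    n = len(text)
    return -sum((k / n) * log2(k / n) for k in Counter(text).values())

# A uniform 27-symbol alphabet (26 letters + space) would give
# log2(27) ~ 4.75 bits/char; frequency skew alone pulls English below that.
sample = "the quick brown fox jumps over the lazy dog " * 20
print(entropy_bits_per_char(sample))   # roughly 4.3 bits per character
```

Accounting for digraphs, words, and longer context is what pushes the estimate down toward Shannon's 1 bit/letter, and it is exactly the "probability estimates" part that makes compression hard.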

I thought it had been mathematically proved that the theoretical limit for (dictionary-less, lossless) compression was a lot lower than achieved by current algorithms? 92.29.115.116 (talk) 10:31, 4 April 2011 (UTC)
 * If you look at the average over all possible files, the limit of compression is 1 (as in "no compression"), as can be seen easily by a counting argument (to uniquely identify an n-bit file, you need n bits). If you look at a finite subset, then Shannon indeed describes that limit. English text has a lot less entropy than random noise, and most program texts have less, again. In principle, encrypted files should have about the same entropy as their un-encrypted counterparts. However, they are mapped into the space of all texts in a way that makes them very hard to differentiate, and hence in practice they compress badly. --Stephan Schulz (talk) 11:51, 4 April 2011 (UTC)
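The counting argument above shows up directly if you point a real compressor at random versus repetitive input. A quick sketch with Python's built-in zlib (DEFLATE rather than 7-Zip's LZMA, but the effect is the same):

```python
import os
import zlib

random_data = os.urandom(100_000)                          # near-maximal entropy
repetitive = b"rough consensus and working code " * 3_000  # very low entropy

# Random bytes don't compress: DEFLATE falls back to "stored" blocks,
# so the output is actually slightly larger than the input.
print(len(zlib.compress(random_data, 9)) / len(random_data))

# Repetitive text collapses to a tiny fraction of its original size.
print(len(zlib.compress(repetitive, 9)) / len(repetitive))
```

The first ratio comes out at or just above 1.0, the second far below it, matching the point that the "limit" only drops below 1 once you restrict attention to a structured subset of files.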


 * There's no such proof, but I'd be interested in seeing your source. Maybe you're thinking of the Shannon limit? All state-of-the-art compressors use entropy encoding that is very close to an optimum proved by Shannon, but that's the easy part. The hard part is coming up with the probability estimates to feed into the entropy coder. -- BenRG (talk) 19:07, 4 April 2011 (UTC)


 * The "theoretical limit" is the Kolmogorov complexity, which is not a computable function for the same reason that the halting problem is undecidable. The article about it explains more.  75.57.242.120 (talk) 20:31, 4 April 2011 (UTC)

random numbers and the system clock
Why don't computer random number generators just use the last few digits of the system clock, i.e. measuring fractions of a second? I know they use the clock, but only as a seed. Why would the last few digits of something measuring fractions of a second not be random enough? Is it because a computer program runs so quickly, and with such regularity, that there would be inevitable patterns in the stream of numbers thus produced? Thanks, It's been emotional (talk) 17:14, 3 April 2011 (UTC)


 * You pretty much have it. If the program asks for random numbers quickly enough (that is, faster than the counter wraps around), they aren't going to look random at all - they'll be in increasing order. Secondly, under similar loads, computers tend to run at about the same speed, so that any two invocations of the call to get random numbers will be at about the same difference in time so the two numbers will have a non-random difference between them (this applies even if the counter wraps around several times) - this is a killer if you have a loop to get multiple random numbers, say to shuffle a deck of cards or simulate a population. In either case you would need to pass the resultant value through a function which would "deorder" the values, and amplify the small variations between close numbers: this is effectively what pseudorandom number generators do. They also have the benefit of being repeatable; that is, with the same seed, you will always get the same string of "random" numbers - handy for debugging. -- 174.21.244.142 (talk) 17:37, 3 April 2011 (UTC)
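To make the seed-once-then-scramble idea concrete, here is a minimal sketch of a linear congruential generator in Python. The multiplier and increment are the textbook ANSI C example constants; this is illustrative only, not suitable for anything cryptographic.

```python
import time

class LCG:
    """Minimal linear congruential generator: seeded once from the clock,
    then each call scrambles the previous state instead of re-reading time."""

    def __init__(self, seed):
        self.state = seed & 0x7FFFFFFF

    def next(self):
        # Multiply-and-add mixes the bits, so consecutive outputs look
        # unrelated even though raw clock reads would increase monotonically.
        self.state = (1103515245 * self.state + 12345) & 0x7FFFFFFF
        return self.state

rng = LCG(int(time.time()))   # the clock is consulted only once, as a seed
samples = [rng.next() % 100 for _ in range(5)]
```

The repeatability mentioned above falls out for free: two generators constructed with the same seed produce identical streams, which is exactly what makes seeded PRNGs handy for debugging.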


 * Additionally, from a practical perspective, getting the current time from the system clock will require a system call, which is a comparatively expensive operation. Grabbing the time once, and then using a quick pseudorandom number generator thereafter, means the program has to make only one syscall; grabbing the time whenever a random number is needed will be a lot slower. —Bkell (talk) 17:49, 3 April 2011 (UTC)


 * On x86 processors the Time Stamp Counter can be read without a system call, probably a lot more quickly than the execution time of a decent PRNG. But, as already mentioned, it's not random enough. -- BenRG (talk) 01:26, 4 April 2011 (UTC)

Thanks to all, very interesting and exact, It's been emotional (talk) 08:54, 7 April 2011 (UTC)

Sound Card and Microphone Questions
Ok, I have two questions. Firstly, how do I check my sound card on Windows Vista? I have read that looking under the "Sound, video and audio device" section of the Device Manager, or running "dxdiag" and checking the Sound tab, should show my sound card listed; however, the only thing listed in the Device Manager is "High Definition Audio Device", while the Sound tab of dxdiag says "Digital output device (SPDIF) High Definition", both of which seem more like a description of a sound card than a sound card itself.

My second question is: how do I get my speakers to output what I say into my microphone directly on Windows Vista? I have read that right-clicking the volume control in the bottom right of my screen, clicking on Playback Devices, then right-clicking on Speakers and selecting Levels, should let me uncheck a box to unmute my microphone - but the only thing there is a section called "speaker/headphones", which isn't muted. So could anyone help me with these problems? 86.162.151.128 (talk) 17:32, 3 April 2011 (UTC)


 * For what it's worth, on the Vista system I am currently sitting at, the relevant entry in Device Manager is "Sound, video and game controllers", and the two items are "ATI Function Driver for High Definition Audio" with a part number, and "IDT High Definition Audio CODEC". These are indeed just driver files.  The dxdiag tool has three relevant tabs, "Sound 1", "Sound 2", and "Sound 3", for the three hardware audio outputs on this machine.  Comet Tuttle (talk) 18:39, 3 April 2011 (UTC)

RAID 1 and Subversion questions
Hi,

I am new to setting up RAID systems. I'm currently setting up a Subversion server using Windows XP, having used this guide to get Windows XP to format two drives (not including the Windows drive) as a single software RAID 1 volume. No errors have been reported.

Question 1: I now want to perform the practical test of disconnecting each of the drives in turn to see whether the other drive has in fact got all the data expected, but I realize that as soon as I boot up the system with one drive disconnected, something is going to get written to the active drive, so the two drives won't be mirrored anymore. How do other people perform practical tests of a RAID 1 volume's mirroring?

Question 2: When one of the two drives in the RAID 1 volume fails, and I install a replacement, what tool is used under Windows XP to mirror the good drive to the replacement drive?

Question 3: A Subversion question. All workstations will be running Windows. Is it stupid for me to set this server up as a Windows XP machine rather than an Ubuntu machine? I have the vague notion that the most recent Ubuntu may have better tools available for administration of the server; but I find it a little appealing to run a Windows server to service Windows clients - we've run into one problem in the past with a GNU/Linux Subversion server, when my users were renaming the case of files, and the Linux server allowed both the files "hello.c" and "hElLo.c" to exist in the same directory, which caused problems with all the Windows clients. In a vague way, I have the notion that a Windows server might prevent other, similar problems.

Thanks! Comet Tuttle (talk) 18:34, 3 April 2011 (UTC)


 * It's been ages since I worked with Windows XP, so I don't remember the answer to your second question.


 * Regarding #1, there is not really a non-destructive test for this. However, you could use dd on both drives after booting from a live CD, store these images somewhere else, and restore them after you performed your tests. Another option would be to use three drives, always rotating one out and checking it in another computer to see if it contains the needed data, then locking that drive away in your storage location/vault/whatever.


 * Regarding #3, please keep in mind that XP was never meant as a server operating system and thus places a limit on the number of simultaneous connections (it's a licensing issue, not really a technical limitation per se). So if you want to go the Windows route, prepare to have to shell out some $$$ for W2K8 server. On the other hand, if you wish to follow the F/LOSS route, take a look at mount options; for some file systems it is possible to force upper- or lowercase. Maybe there's an option in subversion for that, too - but I'll leave answering that to a subversion expert. -- 188.105.131.8 (talk) 09:04, 5 April 2011 (UTC)
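On the case-sensitivity problem from question 3, another angle (regardless of which server OS you pick) is to reject case-colliding paths at commit time with a pre-commit hook. Below is a sketch of the core check only; wiring it into Subversion (feeding it the existing tree and the incoming paths, e.g. via `svnlook tree` and `svnlook changed`) is left as an assumption and would need testing against your setup.

```python
def find_case_collisions(existing_paths, added_paths):
    """Return added paths that differ only by case from existing ones,
    e.g. committing "hElLo.c" into a tree that already has "hello.c"."""
    seen = {p.lower() for p in existing_paths}
    return [p for p in added_paths if p.lower() in seen]

# A pre-commit hook would exit non-zero whenever this list is non-empty,
# so Windows clients never see two files that collapse to one name on NTFS.
```

This catches the "hello.c" vs "hElLo.c" situation at the server no matter which clients or filesystems are involved.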


 * Thanks for the partial answer. What would be useful to me in question #1 would be a jumper that would write-protect a hard disk.  Comet Tuttle (talk) 17:56, 5 April 2011 (UTC)
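Short of a hardware write-protect jumper, one read-only spot check for question 1 is to hash both mirror members from a live environment. This is a sketch under several assumptions: the `/dev/sdb`-style device paths are placeholders for whatever your live CD names the disks, both drives must be unmounted, reading requires root, and the two members of a Windows software mirror may differ in their metadata regions even when the mirrored data matches, so a hash mismatch is a prompt to investigate rather than proof of a broken mirror.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file or block device through SHA-256 in 1 MiB chunks,
    so even a large disk can be hashed without loading it into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# e.g., from a live CD with both mirror members visible (paths assumed):
# print(sha256_of("/dev/sdb") == sha256_of("/dev/sdc"))
```

Because this only reads, neither drive is modified and the mirror stays consistent when you boot back into Windows.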

A tablet computer question
Dear Wikipedians:

With tablet computers replacing netbooks as the latest "fad" in computing, I myself am thinking of getting a tablet. I will probably not get an iPad, for reasons that will become obvious after I have outlined my questions below. But I do have the following questions before I make my tablet computer purchase decision.

I remember that for my recent desktop PC, I went to my local computer store and bought all the parts, like motherboard, CPU, memory chip, a new SATA 250 GB hard drive, and then assembled them at home. Since the hard drive is new, there is obviously nothing on it. So I had to partition, format and then install my favorite operating systems. Now the PC works like a charm.

Now I am wondering: for tablet computers, do I have the option of going out there, grabbing different parts and assembling my own tablet? Barring that, is there a tablet computer that comes as a "clean slate" and allows me to install whatever operating systems I want on it, including options for multi-booting, just like what I do with my desktop PC? Barring that, if I get a tablet, say, with Windows 7 preinstalled, can I wipe it out and then set up a Windows XP/Linux dual-boot system, just like what I do with my laptop computer? And by "tablet" I don't mean the clunky early tablet computers with the pivotable screen like the Lenovo Thinkpad X, but rather the "cool" one-piece tablets with the iPad's form factor.

Now to push the envelope a bit: for smartphones, do I have the option of going out there, grabbing different parts and assembling my own smartphone? Barring that, is there a smartphone that comes as a "clean slate" and allows me to install whatever operating systems I want on it, including options for multi-booting, just like what I do with my desktop PC? Barring that, if I get a smartphone, say, with Android preinstalled, can I wipe it out and then set up a Windows XP/Linux dual-boot system, just like what I do with my laptop computer?

And if the answers to my above questions are "no"s in their most restrictive sense (i.e. the only option I have is to put up with whatever crap the OEM throws at me). Then my question is: What is the reason for this erosion of my digital freedom, where I am able to do anything I like with my desktop PC, yet can do nothing with my tablet/smartphone? And should I be worried about this trend of OEMs gradually taking stronger controls of us consumers, and telling us what we can or cannot do with the gadgets that we bought with our own money? Am I witnessing the beginning of the end of our liberal, western democratic states?

Thanks,

L33th4x0r (talk) 23:15, 3 April 2011 (UTC)
 * Increasingly, many tablets and smartphones are designed so that it is difficult to access the underlying system layer (behind the nice interface of a phone) and do things like delete folders. As well as that, hardware for such devices nowadays tends to be customized by the manufacturer. And as for the customization of tablets and phones, it will become increasingly difficult because these devices are becoming ever smaller, which means that it will become very hard to DIY. You could try building your own tablet, but you will probably end up with a large-sized one instead. General Rommel (talk) 01:42, 4 April 2011 (UTC)


 * I'd hardly call the trivially larger form of the Lenovo Thinkpad X tablet clunky, particularly when comparing it to something so much less versatile, less powerful, & incredibly less extensible such as either model iPad. You're not going to find a time where a Thinkpad X won't fit and an iPad will, it just isn't going to happen. ¦ Reisio (talk) 02:08, 4 April 2011 (UTC)


 * At the moment tablets are closer to an embedded system than a fully functional "computer". That might change in the future, but at the moment you'll probably find that most of what you are "hoping" for is not possible on tablets or smartphones. There might be some hackers who get a different OS running on a tablet or a phone, but no doubt it will take a LOT of hacking and will also probably offer only limited functionality compared to the OS the device was designed to run. It might be worth noting that the same thing could have been said of most laptops ten years ago; now it's not uncommon for people to dual-boot MacBooks and the like, so it's possible that when tablets become more common and ubiquitous they might also become more interoperable. Vespine (talk) 02:17, 4 April 2011 (UTC)


 * The issue with tablets is that they are generally lightweight, and the size people want them in precludes any real DIY stuff with the hardware. When you shove so many components into such a small place, the only way you can make a tablet that will offer any real competition to the iPad, Xoom, and other such tablets is to solder chips to a circuit board yourself. Tablets (and smartphones) have only a couple of circuit boards, and there is rarely space for connections that are easily used by human fingers, much less interchangeable components. Unlike desktop PCs, where there is usually plenty of space to work and long-running standards for component sizes, placement, and connections, a smartphone or tablet has a small amount of space to work in, and components are shaped and put together much like a puzzle. Even in the best-case scenario where components were completely standardized, the amount of DIY work you could do on a tablet would be minimal at best. Your best bet right now is to look into rooting an Android tablet. Building the actual hardware will result in something cost-prohibitive in small runs, or incredibly bulky. Caltsar (talk) 18:54, 4 April 2011 (UTC)

Thanks for all the responses. I now fully understand the reasons behind not being able to DIY tablets and smartphones. I understand that with more powerful tablets and smartphones in the future, and them becoming ubiquitous, open platform and general purpose variants of those machines may eventually emerge. I am putting my faith now in riding the Kurzweilian exponential technological progress curve. 174.88.242.201 (talk) 21:37, 4 April 2011 (UTC)