Wikipedia:Reference desk/Archives/Computing/2010 May 12

= May 12 =

Script errors in email
Hi, all. I need a small amount of education about email scripts. I don't need to know how to write them, but I'd like to understand where they come from, and what in our environment I can reasonably do about script errors, if anything. Although there are forty of us in our building, with a "corporate" mail server, each group of two or three of us is an independent contractor, responsible for our own desktop systems -- OS, mail client, etc. Our group uses Outlook Express on WinXP. Replacing either of those is not a viable short-term solution.

A typical error throws a dialog box -- if needed I might be able to capture one, and with help could upload it -- which must be dismissed. "Syntax error in script" is a common message, along with a line number and a token, e.g., "At line 43, unexpected '}'".
 * In one case, we get this error on every single email from one particular vendor. So, my first reaction was that the script is part of the mail, comes from the vendor and gets executed for some reason after it arrives.  (Why exactly? Not a clue.)  If that's true, I can't fix it.
 * In another case which started appearing recently, the subject mail is from a little old lady who could no more send along an email script than fly -- so, maybe it's not part of the mail. Or, perhaps it's something her ISP is tacking on "for our convenience".
 * If there's a library of scripts on our server, then why don't I get an error on every email? Well, perhaps because different emails take different paths through the scripts -- there are presumably if-blocks that execute some stuff and bypass other parts.  If that's the case, though, somebody should be able to fix them locally, yes?  They look like simple syntax errors.
 * It is possible that the errors are mailer-dependent -- i.e., I get them because I use Outlook Express but somebody else does not get them because they use something else. Don't know.

Bottom line: Where are these errors coming from? And is there anything *I* can do to get rid of them?

Thanks, DaHorsesMouth (talk) 02:51, 12 May 2010 (UTC)

ubuntu workstation setup
I am planning to set up an Ubuntu 10.04-based workstation for development (web, Python, Eclipse, Java), learning, and also for a bit of entertainment (movies and built-in games, mainly Minesweeper) for two users.

I have one computer (2GB RAM, 3GHz dual-core AMD processor, 120GB HDD) and two sets of keyboard, mouse and LCD monitor. I would like to use this either in a dual-monitor configuration or as two separate consoles, so that two users can log in to this machine at the same time.

What additional hardware (preferably < $200) do I need to set up this configuration?

I would also like to add more CPUs to this setup, but not in the next year or two. --V4vijayakumar (talk) 03:39, 12 May 2010 (UTC)


 * You are asking about a multiseat configuration, where one computer is used by multiple local users. Here are instructions for MultiSeat Ubuntu - it seems that you will be forced to use KDE, but this should not be a problem for most purposes.  However, in this era of decreasingly expensive terminals, have you considered buying a netbook for each user?  These lightweight laptops can connect to the main Ubuntu terminal server, and also provide significant standalone functionality.  They may even be cheaper than a monitor and keyboard.  As far as adding CPUs to your computer, unless you have a multi-socket motherboard, this will not be an option without a major upgrade.  Nimur (talk) 07:14, 12 May 2010 (UTC)

Wikipedia syndication
I am attempting to watch someone's Special:Contributions via Atom. I have the appropriate feed in Google Reader. I am having a few problems with it, and I don't know if it's me, the feed, the reader, or just how Wikipedia works. First, it takes hours to update, which kind of misses the point of a "feed" in my opinion. Six hours until another update seems like an excruciatingly long time to wait, but maybe this delay is normal. Second, it sometimes won't update at all: I once waited 3 days for a single entry to show up, and another is still missing in action. Third, it bizarrely keeps trying to show me the same contributions over and over and over again. Sometimes an edit from last month will randomly pop up as a new entry, or a recent edit will come up as new, repeatedly, for days. With all of these things combined, I'm finding the whole thing rather useless. I don't use Atom for anything else, so I don't know if this is typical of web feeds in general. Can someone more knowledgeable please tell me what's going on? Differentially (talk) 06:38, 12 May 2010 (UTC)


 * Am I the only one watching contributions this way? Surely at least someone can say, "works fine for me, mate", even if they have no clue why mine doesn't?  Differentially (talk) 19:06, 13 May 2010 (UTC)
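 * One way to narrow this down is to poll the raw feed yourself and see whether the oddities are in the feed or in the reader. As far as I know, MediaWiki serves contributions as Atom at a URL of the form .../index.php?title=Special:Contributions/USERNAME&feed=atom. Here is a minimal Python sketch that parses an Atom document with only the standard library; the sample feed text below is made up purely for illustration of the structure:

```python
import xml.etree.ElementTree as ET

# Atom elements live in this XML namespace.
ATOM = "{http://www.w3.org/2005/Atom}"

# A trimmed-down, made-up sample of what a contributions Atom feed
# looks like structurally (titles, dates and ids are illustrative only).
sample_feed = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>User contributions</title>
  <entry>
    <title>Example article</title>
    <updated>2010-05-12T06:38:00Z</updated>
    <id>https://en.wikipedia.org/w/index.php?diff=12345</id>
  </entry>
</feed>"""

def list_entries(feed_xml):
    """Return (title, updated) pairs for each entry in an Atom feed."""
    root = ET.fromstring(feed_xml)
    return [(e.findtext(ATOM + "title"), e.findtext(ATOM + "updated"))
            for e in root.findall(ATOM + "entry")]

entries = list_entries(sample_feed)
print(entries)  # [('Example article', '2010-05-12T06:38:00Z')]
```

De-duplicating on each entry's <id> element while polling would tell you whether the repeated items are actually re-served by the feed or are being re-shown by Google Reader.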

New laptop
Hi

I've just ordered a Dell Studio XPS 16 Laptop and the specifications are as follows:


 * Processor: Intel Core i7-720QM Processor (1.6GHz,4 Cores/8 Threads,turbo up to 2.8GHz, 6MB Cache)
 * OS: Windows 7 Professional 64bit
 * Memory: 8GB (4GBx2) 1333MHz DDR3 SDRAM
 * HDD: 500GB SATA 7200RPM Hard Drive
 * Optical Drive: Slot Load 8X DVD+/-RW Drive with DVD+R double layer write capability
 * Video Card: ATI Mobility RADEON HD 4670 - 1GB
 * Battery: 9-cell (85WHr) Lithium Ion Primary Battery

I would like to know what kind of gaming this configuration can handle. Primarily, I'm interested in Strategy/RPG games. I would also appreciate it if someone could suggest some good ones from these genres which I could play on my system. Also, while I know that Photoshop CS4 can run on my system, I want to know if it would run smoothly and not bog down the system while rendering.

Thanks —Preceding unsigned comment added by 122.170.25.229 (talk) 07:18, 12 May 2010 (UTC)


 * You have a very good CPU (well, the clock frequency is not very high, but I still think it is high-end considering it's a mobile CPU), a lot of memory, and given the overall high-end feel of the system, I suppose the GPU is very good as well (I am not at all familiar with the different lines of mobile video cards), so you could probably play most games just fine, and Photoshop will not be an issue. --Andreas Rejbrand (talk) 08:25, 12 May 2010 (UTC)


 * Thanks for the prompt response. Would appreciate suggestions of any good (compatible) games from the Strategy/RPG genres. Also I would like some clarification about whether my video card is a dedicated one or integrated with my system memory. --122.170.25.229 (talk) 09:21, 12 May 2010 (UTC)


 * From what you posted, it appears to have 1GB of dedicated VRAM, which is becoming increasingly common in performance laptops these days. The specs on this laptop seem pretty good to me and should run any modern game and run it well. ATI Radeon is very popular and well-supported and any mainstream game should have support for this chipset. Amordea (talk) 09:27, 12 May 2010 (UTC)
 * The only real concern I'd have is from an upgrade standpoint. In two years, that 4670 will not seem so hot (it's already being pretty well supplanted by the 5000-series) and being a laptop, you most likely will not be able to upgrade this. Amordea (talk) 09:41, 12 May 2010 (UTC)


 * Thanks. I will probably look to replace it after two years. Reading some reviews of the Studio XPS 16, I have noticed that they all mention an overheating issue with this build. I have added 4GB of additional RAM to the laptop (making it 4GBx2); will this aggravate the overheating problem? --122.170.25.229 (talk) 10:16, 12 May 2010 (UTC)
 * No. RAM adds very little in the way of heat. The heat issue is an extremely common one in performance laptops. Get yourself a nice cooling pad and monitor the temp. SpeedFan is what I use on my netbook since it can be minimized to the notification area with a real-time temp readout as the icon. Amordea (talk) 10:33, 12 May 2010 (UTC)


 * Warcraft 3. I think it's compatible with your system, and it's my all-time favourite game. 174.114.4.18 (talk) 20:56, 12 May 2010 (UTC)


 * Since any half-way decent computer built after 2003 can play Warcraft 3, yes it's compatible with his system. --DraconianDebate (talk) 02:59, 15 May 2010 (UTC)

Can I customize my mediawiki to only allow people with our school's email address to create accounts?
I'm on the student senate at my school, and we want to customize our MediaWiki install so it only lets someone create an account (i.e., edit) if their email address ends in @[school_name].edu. We also want to force their username to be the [first].[last] part of their email address. Later, we might expand this to more schools, so we would want the flexibility to allow other @[school_name].edu domains to create accounts as well.

Question 1: Is that possible?

Question 2: How so? Thanks! AGradman / talk. How I saw it:


 * There are a number of different authentication schemes for MediaWiki listed at its Authentication page - but none seem quite what you're looking for. The $wgEmailConfirmToEdit feature forces new users to supply an email address and receive email on that address before they can edit, but I'm not aware that you can limit those email addresses. It shouldn't be too difficult to hack the code for that to reject email addresses that don't correspond with your school.  I'd check with the folks on the mediawiki mailing list in case someone has already done that, or has a simple idea of how you'd add that. -- Finlay McWalter • Talk 16:01, 12 May 2010 (UTC)


 * On mediawiki-l@lists.wikimedia.org, Wikia recently pointed to some code that could be used for this. kcylsnavS {screech} 13:17, 16 May 2010 (UTC)
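 * Whatever MediaWiki hook or extension you end up with (those are written in PHP), the core check it would perform is simple string validation. Here is a language-neutral sketch of that logic in Python, with example.edu standing in for your school's domain and the function name being purely illustrative:

```python
import re

# Domains whose members may create accounts; add more schools here later.
ALLOWED_DOMAINS = {"example.edu"}

# Addresses must look like first.last@domain to yield a forced username.
ADDRESS_RE = re.compile(r"^([a-z]+)\.([a-z]+)@([a-z0-9.-]+)$", re.IGNORECASE)

def username_for(email):
    """Return the forced 'First.Last' username for an allowed address,
    or None if the address is malformed or from a non-allowed domain."""
    m = ADDRESS_RE.match(email.strip())
    if not m:
        return None
    first, last, domain = m.groups()
    if domain.lower() not in ALLOWED_DOMAINS:
        return None
    return first.capitalize() + "." + last.capitalize()

print(username_for("jane.doe@example.edu"))   # Jane.Doe
print(username_for("jane.doe@elsewhere.com")) # None (domain not allowed)
```

In MediaWiki terms, the account-creation hook would call something like this and abort the signup when it returns no username.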

Can I own all copyrights?
Imagine I run a program that spits out all possible combinations of words, including computer code. If I register the copyright of these automatically generated texts, can I sue anyone who dares to infringe my rights? Which technical resources do I need for that? --Mr.K. (talk) 16:05, 12 May 2010 (UTC)


 * Leaving aside the legal complexities, Cantor's diagonal argument says that such a program is impossible anyway. Gandalf61 (talk) 16:13, 12 May 2010 (UTC)


 * No, Cantor says no such thing. The set of all words ("finite length sequences of letters") for a given finite alphabet is enumerable. Of course, it is infinite, so enumerating it in practice takes forever. If you also want to enumerate infinite sequences, then Cantor comes into play. --Stephan Schulz (talk) 16:19, 12 May 2010 (UTC)


 * But since the OP said "all possible combinations of words", I assume this includes infinite sequences such as "a a a a ...". Gandalf61 (talk) 09:40, 13 May 2010 (UTC)
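 * To make the countability point concrete, here is a short Python sketch that enumerates every finite string over a fixed alphabet in shortest-first order. Any particular finite string shows up after finitely many steps (that is what countable means), but the enumeration itself never terminates, and infinite sequences like "a a a a ..." never appear at all:

```python
from itertools import count, product

def all_words(alphabet):
    """Yield every finite string over `alphabet`, shortest first.
    Each string appears after finitely many steps, which is exactly
    what it means for the set of finite strings to be countable."""
    for length in count(1):
        for letters in product(alphabet, repeat=length):
            yield "".join(letters)

gen = all_words("ab")
first_six = [next(gen) for _ in range(6)]
print(first_six)  # ['a', 'b', 'aa', 'ab', 'ba', 'bb']
```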


 * No. As a general rule, only "creative works" may be copyrighted, not the output of a mechanism. This Stanford University Library FAQ says "to receive copyright protection, a work must be the result of at least some creative effort on the part of its author. There is no hard and fast rule as to how much creativity is enough."  The specifics of how much and what kind of "creativity" is enough will vary by jurisdiction, by dint both of varying statute and a complex web of jurisprudence, but what you describe is surely not "creative" anywhere. This leaves all kinds of interesting, but vexing, cases - for example, Bridgeman Art Library v. Corel Corp. says that in the US taking a photo of a 2D image isn't sufficiently creative, but the UK's National Portrait Gallery asserts that it is sufficient under English law (ref).  Another interesting question is SSEYO's Koan computer program, which generates computer music based on algorithms, user input, and random numbers.  SSEYO claims (or at least used to, a few years ago when I had a copy of it) copyright over the output of this program; I'd really doubt that such a claim would hold up in court, but I don't know if a comparable matter has been tested in court anywhere. -- Finlay McWalter • Talk 16:19, 12 May 2010 (UTC)
 * Firstly, copyright law generally requires some creativity, or at least hard work, to go into creating a copyrightable work, which you wouldn't be able to demonstrate (there is work required to create the program, but the work for each individual text is negligible). Even without that, do you know how many possible combinations of words there are? Even if we restrict ourselves to English, without proper nouns, there are hundreds of thousands of words. Even if you just had all possible, reasonably short, sentences (which wouldn't help you, since a work has to be of significant size to be copyrightable and a single sentence wouldn't be) there would be something like 10^100,000 sentences (you could reduce that a bit if your program understood a few grammar rules, but not enough to be useful). If you stored them as ASCII text, you would need about 10^1,000,000 bytes. The total information capacity of the observable universe is estimated to be about 10^92 bits. As you can see, your idea is completely impossible. --Tango (talk) 16:21, 12 May 2010 (UTC)
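 * A quick back-of-the-envelope check of these magnitudes, taking the figures above (roughly 10^5 English words, 10^92 bits of capacity in the observable universe) as given:

```python
import math

VOCABULARY = 10 ** 5      # rough number of English words (figure assumed above)
UNIVERSE_BITS = 10 ** 92  # Seth Lloyd's estimate of the universe's capacity

def log10_sequences(length_in_words):
    """log10 of the count of distinct word sequences of the given length."""
    return length_in_words * math.log10(VOCABULARY)

# Even the 20-word sentences alone (about 10^100 of them) already exceed
# anything the universe could store, before considering longer texts.
print(log10_sequences(20))                       # 100.0
print(round(math.log10(float(UNIVERSE_BITS))))   # 92
```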


 * I agree with the above; and Feist Publications v. Rural Telephone Service is relevant in some respects. Comet Tuttle (talk) 16:33, 12 May 2010 (UTC)


 * Curious, how is the "total information capacity of the observable universe" measured, exactly? Aylad ['ɑɪlæd] 16:39, 12 May 2010 (UTC)
 * I expect Tango is using the estimate from Seth Lloyd (2002), Computational capacity of the universe, Physical Review Letters 88 (23):237901. Algebraist 17:35, 12 May 2010 (UTC)
 * Of course, if you had all possible combinations of words, you'd be violating every existing copyright... ╟─ Treasury Tag ►  quaestor  ─╢ 17:38, 12 May 2010 (UTC)


 * A tangential point is whether something like RAND's A Million Random Digits with 100,000 Normal Deviates would hold up in court. My understanding is that RAND believes the data to be copyrighted—even though its usefulness relies on its being "truly" random—because the algorithms/hardware used to produce and validate the data are "creative" as defined by copyright law. Numbers themselves can't be copyrighted, but data in specific orderings and arrangements can (they are what the copyright law calls a "compilation"). The question is: if that is true (which I'm not sure it is), why wouldn't it be true in the case of text? I imagine the courts would see a distinction; the data is specifically copyrightable because of its arrangement, but the text would be presumably interesting because of its actual semantic content. (You could probably copyright "A Million Random Letters" but not claim that there was any semantic meaning to the text itself.) Anyway, I'm not sure there is enough case law to know for sure what would really happen and what the exact reasoning would be. --Mr.98 (talk) 22:35, 12 May 2010 (UTC)


 * There is a good deal of discussion in legal journals about whether or not formulae - i.e. software algorithms - are or should be patentable. The same logic could potentially apply to the discussion of entitlement to copyright. kcylsnavS {screech} 13:20, 16 May 2010 (UTC)

MacBook cooling
On the newest model MacBook, how is it cooled? Where are the fans located and when do they kick in? Chevymontecarlo. 16:24, 12 May 2010 (UTC)


 * I really don't know what the newest model of MacBook is. If you could provide the actual model number yourself, that would make research easier for others. In fact, you could do it yourself: a simple Googling of the model plus the search term "service manual" will likely turn up a result that shows you the inner workings of the laptop, with some nice line-drawn or photographic figures.
 * Typically, though, most laptops have a fan over the CPU's heatsink which disperses the heat conducted through it. Some models may also have small fans on the back or sides which pull the hot air out if the CPU fan alone isn't enough to do the job. These fans are always on but are designed to run quietly (low RPMs) during normal operation. When you can hear them, they are operating at a higher speed than normal, usually during high-load situations, to disperse the greater heat that is invariably produced under load. Amordea (talk) 22:40, 12 May 2010 (UTC)


 * Sorry, I am talking about the 2009 MacBook. Chevymontecarlo . 12:06, 13 May 2010 (UTC)


 * MacBooks have a unique cooling system. Instead of vents on the bottom or sides like ordinary laptops, MacBooks pull air from gaps under the keys on the keyboard, through internal fan(s), and out through a heatsink located near the hinge of the screen. That way they can preserve the clean lines on the outside while delivering sufficient cooling (or not, as in the case of the new i7 MacBooks). --antilivedT 21:35, 13 May 2010 (UTC)

Using a supercomputer
I've never actually "got my hands" on an actual supercomputer. However, I have occasionally used (at that time) enterprise-grade servers in one of my jobs. I'd imagine actually using a supercomputer is a pretty much similar experience - you just SSH into an external computer, where you get a UNIX command prompt and an X server. It's no different from using an average Linux desktop, only that everything works much, much faster. But how do people maintain the supercomputer itself? How is it powered up and booted up? There's got to be something they have to go through before it can even start up an SSH daemon, but what is it, and how is it handled? Is there some sort of direct user interface to the supercomputer? J I P | Talk 19:39, 12 May 2010 (UTC)
 * When you have a great many parallel computers, there is a special job scheduler, and special communications techniques to communicate between processes. For vector processors there are special instructions, and you also have to be careful not to waste the resources of a gazillion-dollar computer -- so it is all the more important that the program works correctly before you run it. Graeme Bartlett (talk) 21:53, 12 May 2010 (UTC)
 * Google has something on distributed systems at http://code.google.com/edu/parallel/index.html, there seem to be videos too. --194.197.235.240 (talk) 22:39, 12 May 2010 (UTC)


 * "Supercomputer" is a fuzzy and vague term. This is especially true now that your average wristwatch contains more compute power than an early supercomputer from a few decades ago.  Regarding the "experience" - many systems do run Linux (or look like they run Linux).  The power of the Linux kernel is that it has been ported to all the x86 architectures, the Power architecture, many other types of RISC systems, and many obscure processor manufacturers; but some very esoteric processors still exist that really only run proprietary "unix-like" operating systems; and some do not even attempt to be POSIX compliant, but have a specialized operating system, with its own command interpreter, system libraries, and so on.  It may support SSH or telnet.  If the system does have Linux and can run X, you really wouldn't even know you're on specialized hardware until you start programming.  Many "supercomputers" replace your standard "gcc" and "make" toolchain with a series of proprietary, special compilers, special build utilities, pre-processors, post-processors, optimizers, and so on.  These might require variants of your favorite programming languages - special C preprocessor macros, FORTRAN inline comments to direct the optimization process, and so on.  At the same time, the power of open-source software often means that these specialized programming systems are just variants of gcc or open64, tuned for some unique feature of the supercomputer hardware.
 * There are generally two classes of "supercomputer" hardware in 2010. First, there are systems organized like enterprise data centers. These are massively distributed groups of standard, ordinary COTS systems ("blade" centers). These run conventional Linuxes or Unixes, and the task of making them "super" lies in effectively harnessing software-level parallelism and peak utilization of your network resources.  IBM Roadrunner probably falls in this category, though it has unique hardware and uses the Cell processor.  The second class of computers has "unique" hardware - very unconventional systems, special accelerators, unusual processor architectures and arrangements, and so forth.  Sometimes, this means that the system cannot run Linux or Unix at all.  Other times, it means the system runs one of the "enterprise" Unixes - Solaris, AIX, and so on. SGI makes a sort of interesting hybrid, using single-system-image Linux.  This can be considered a sort of "virtualization" technology, and it appears that an entire server room is a single machine running a single operating system (with enormous quantities of RAM, CPU, and so on).  Convey Computer makes a "hybrid" supercomputer, which looks like an ordinary 2U blade server, but instead of a dual-socket Xeon, one of the Xeon processors is replaced by an entire separate FPGA reconfigurable board that is optimized for vector processing or other specialized tasks.  SiCortex (now defunct) made a 72-node single main-board, which looked like it was running Linux; in fact, there was a very minimal operating system running on most of the compute elements, and the front-end processor tasked jobs off to the compute elements with a hardware/network scheduler.  NVIDIA has mainstreamed its CUDA architecture, which gives you the capability to turn a GPU into a pretty potent supercomputer.
You might take a look at high performance computing - this is the new "buzzword" to describe "more expensive, fancier computer" now that "supercomputer" has lost its impact factor.  I still struggle to comprehend that I have more storage capacity in my L1 cache than the entire memory capacity of all the computers in the Apollo Program.  Supercomputers are as supercomputers do.  Most people waste the power of their commodity hardware; mostly, because they have no interest in learning how to use it at peak performance.
 * If you're interested in working with high performance computing, it would serve you well to learn a systems programming language like C. You should probably have some intense familiarity with computer architecture, especially the various types of parallelism now available.  You should be comfortable understanding the ugly interior of a POSIX operating system - what exactly is a shell and what exactly is a kernel function, and how those would map onto a different kind of CPU.  Nimur (talk) 03:22, 13 May 2010 (UTC)
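 * As a toy illustration of the data-parallel pattern described above -- a scheduler splitting one job into chunks and farming them out to many compute elements -- here is a Python sketch using the standard multiprocessing module. Real HPC codes would use MPI or OpenMP from C or Fortran; only the shape of the idea carries over:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Worker: sum the integers in [lo, hi). On a cluster, a scheduler
    would assign each such chunk to a separate compute node or core."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    N = 1_000_000
    # Split the range [0, N) into ten disjoint chunks of 100,000.
    chunks = [(i, i + 100_000) for i in range(0, N, 100_000)]
    with Pool(processes=4) as pool:
        # Map chunks onto worker processes, then combine the results.
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as sum(range(N)), computed in parallel
```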


 * A person I know used to operate a supercomputer. Well, not really -- an IBM 1800, in the 1970s. At first, startup required entering the date and time via toggle switches before the system could be powered up, but later he wrote a program in FORTRAN (using Hollerith cards) to read this information automatically from the gubmint's systems. After that it was just pressing a couple of buttons, just like today. Oh, the computer took up the better part of a thousand square feet, room temp was maintained around 65°-68°, and 18"x18" floor panels were removed with suction grips in order to get at the cabling beneath the floor. It was a cool place for a 17-year-old like me to be. kcylsnavS {screech} 13:43, 16 May 2010 (UTC)