Wikipedia:Reference desk/Archives/Computing/2009 December 28

= December 28 =

IE and absolute positioning divs
I have a div that I place with absolute position using top:1em; left:1em; right:1em; bottom:1em;. So, it should be centered in the window with a 1em margin all the way around. If I set overflow:auto, it will have scrollbars inside the div if the contents overflow. It works fine in Firefox, Konqueror, Opera, etc... I checked it in IE and it vertically expands the div. Instead of placing "bottom:1em" at the bottom of the window, it places it at the bottom of the virtual window that has been expanded to fit all of the content. Is there some bastardized trick to convince IE that by "bottom" you mean "bottom of the damn window" instead of "bottom of the stretched out crap you just did"? -- k a i n a w &trade; 00:48, 28 December 2009 (UTC)


 * Sounds like a z-index & nested DIVs problem to me. Can you post some code? 218.25.32.210 (talk) 03:40, 28 December 2009 (UTC)


 * There isn't much to post...

<html>
<body>
<div style='position:absolute; left:1em; right:1em; top:1em; bottom:1em; overflow:auto;'>
Put more than a full screen of text in here. It *should* create a scrollbar inside the div. In IE, it stretches the div out vertically.
</div>
</body>
</html>
 * What is expected is that this code will make a div in the center of the screen with a scroll down the right. -- k a i n a w &trade; 03:48, 28 December 2009 (UTC)


 * Well I can reproduce your problem and can't come up with a way to fix it. I suggest you post the question & code at Stackoverflow.com - they should set you straight within 30min. 218.25.32.210 (talk) 04:59, 28 December 2009 (UTC)

Readyboost
I have a computer with 2 GB of ram running Windows Vista and was wondering about Readyboost. I have a 1 GB flash drive, USB 2.0, that I am not using so I just hooked it up and set it up to do Readyboost. My question, I guess, is how much will this help? I also saw a 2 GB flash drive at Walmart for $6 so I thought about buying it. Would it make the amount of help noticeably better? I know my hard drive runs at 7200 rpms. I know you can't say with 100% certainty and you'd probably need more information, but in general do you think this would help and how much? I read the Wiki article Readyboost and it says things such as Windows recommends you use a flash drive 1 to 3 times as large as your ram. I didn't do this obviously. I also read some article that said Readyboost can actually slow your computer down in Vista, but the problem is fixed in Windows 7. So, with things like this, I just want to know more. Thanks for any help. StatisticsMan (talk) 04:22, 28 December 2009 (UTC)


 * Interesting. Have to admit I'm no expert here as I still use XP! 2 GB isn't all that much RAM nowadays. I think you are always better off having more RAM (which is comparatively dirt cheap compared to a few years ago), up to a certain point, as 2x RAM will NOT give you 2x PC 'performance'. But I suppose flash drives are even cheaper. To measure 'help' I think you would need a HDD benchmarking utility. According to the article, ReadyBoost only helps with HDD caching. A 7200 rpm drive sounds a bit old; is it SATA (Serial ATA interface)? If you are using the older Parallel ATA interface, it's possible you may get better results from a small SATA HDD, though BIG drives are much cheaper per gigabyte. Theoretically I don't think you would get much difference from a 2 GB over a 1 GB flash drive. The particular drive's seek time, as per the ReadyBoost article, is probably more important. It appears also that the flash drive must be marked as "ReadyBoost-capable". --220.101.28.25 (talk) 06:31, 28 December 2009 (UTC)


 * My recommendation would be to take out a stopwatch and time how long it takes for your computer to start up with and without ReadyBoost enabled. I enabled ReadyBoost on someone else's computer that was running Vista and noticed a significant improvement. He had 1 GB of RAM. There are different types of flash drives, though. Some are fast enough to use ReadyBoost, and others are not. Windows will not let you use a slow USB flash drive for ReadyBoost. Here are some drives that support ReadyBoost: . Also, although you can never have enough RAM, Windows will always write some temporary files (virtual memory) to the disk drive. This is where ReadyBoost comes into play. It writes that data to the flash drive, instead. Thus, your hard drive won't be tied up while Windows writes data to virtual memory. If your USB flash drive is fast enough, I bet you will notice an improvement.--Drknkn (talk) 06:49, 28 December 2009 (UTC)

Events based on UI elements or on functions?
In Visual Studio programs, when multiple events (e.g. selecting a menu item, clicking a toolbar button and pressing a keyboard shortcut) all have the same effect and are handled by the same method that doesn't distinguish among them, is it considered better to replace them with one event? If so, does this remain true even when not all of them are directly caused by the user interface (e.g. a display that refreshes once per minute and can also be manually refreshed)? Neon Merlin  04:24, 28 December 2009 (UTC)

Array-of-functions declaration
Would the following be a valid way in any programming language to create the functions operation[0], operation[1], operation[2] and operation[3]? If not, what is the closest construction in an existing high-level language?

void operation[](int& x) = { {x++;}, {x--;}, {x*=5;}, {x/=5;} };

Neon Merlin  04:33, 28 December 2009 (UTC)


 * Once a function is created in the normal fashion in most languages, you can refer to it by reference or callback or name (depending on the language). I will just call it a hook here to keep the confusion down.  So, you make a function in your language and you have a hook to refer to it.  Make an array of hooks and you have an array of functions.  You just have to define the functions separately from defining the array.  If you have a language of preference, I can give example code.  These "hooks" are rather different from language to language. --  k a i n a w &trade; 04:46, 28 December 2009 (UTC)


 * That's not valid in any language I know, but this Perl does something like it:

my @ops = ( sub { $_[0]++ },
            sub { $_[0]-- },
            sub { $_[0] *= 5 },
            sub { $_[0] /= 5 } );
my $x = 10;
foreach my $func (@ops) {
    $func->($x);
    print $x;   # prints 11, 10, 50, 10
}
 * --Sean —Preceding unsigned comment added by 76.182.94.172 (talk) 20:49, 28 December 2009 (UTC)

In C++0x, you should be able to write something relatively close to the original example. decltype (talk) 01:44, 29 December 2009 (UTC)


 * There are any number of functional programming languages that treat functions as first-class objects, and in most of them you should be able to do some version of what you are proposing. This is a slightly modified version in Python (the modification is that the variable isn't passed by reference; each function just returns a new value rather than modifying the old one):
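The code block itself appears to have been lost in archiving; a minimal Python sketch consistent with the description (each function returns a new value rather than modifying its argument in place) would be:

```python
# Hypothetical reconstruction: plain lambdas that return new values
# instead of mutating their argument.
ops = [
    lambda x: x + 1,
    lambda x: x - 1,
    lambda x: x * 5,
    lambda x: x // 5,   # integer division, to stay with whole numbers
]

x = 10
for func in ops:
    x = func(x)     # feed each result into the next operation
    print(x)        # prints 11, 10, 50, 10 on successive lines
```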


 * This prints 11,10,50,10, just like you'd expect. Belisarius (talk) 07:19, 30 December 2009 (UTC)

.Mac
I have 'lost' the .Mac icon from my Desktop. How can I recover it please? With thanks in anticipation.--85.210.188.64 (talk) 08:58, 28 December 2009 (UTC)
 * What version of OS X are you running? .Mac ceased to exist in 2008 and was replaced by MobileMe. If you recently performed a software update, it's likely that anything related to .Mac was replaced by MobileMe information after that update. Casey (talk) 23:51, 30 December 2009 (UTC)


 * Thanks for this; I was beginning to wonder if anybody had ideas! You are probably right, as I do update fairly frequently and had forgotten that .Mac had been changed to MobileMe. I would however have expected there to be a desktop icon. I did have an on-screen "chat" with an Apple Advisor who was no help at all and did not mention this aspect. Unfortunately the page you refer me to does not apply to my System X 10.4. Thanks for your help nevertheless. Regards--85.210.170.108 (talk) 08:31, 1 January 2010 (UTC)

Disc won't leave drive, although computer thinks it has
When I try to eject a CD from my MacBook drive, the computer behaves as if the disc has left: it makes the usual whirring noises, and the disc disappears from iTunes as if it has passed the drive's sensor, but the physical disc never comes out of the slot. Please help! Chevymontecarlo (talk) 11:17, 28 December 2009 (UTC)


 * If you're brave...try this http://www.silvermac.com/2006/dvd-stuck-in-macbook-pro/ and good luck. ny156uk (talk) 14:03, 28 December 2009 (UTC)


 * Here are some less-brave options: . If the drive is just confused, it is easy to get the disk out. If it is truly mechanically stuck in the slotless drive, that is a lot harder to deal with. --Mr.98 (talk) 16:46, 28 December 2009 (UTC)

ENIAC described in modern terms
What would the CPU speed and memory of the ENIAC computer be, described the way modern computers are? I read that ENIAC took seventy hours to compute pi to 2037 decimal places in 1949. I was curious to find out how long it would take on my old home PC with a CPU speed of 1.8 GHz and 1 GB of memory. Using PiFast, it calculated 2037 decimal places in 0.03 seconds - that's over eight million times faster. And I wonder how much time the world's most advanced computer now in December 2009 would take? 89.242.213.201 (talk) 13:29, 28 December 2009 (UTC)
 * Haven't found any specifics yet but this from the ENIAC article may give an indication: "As of 2004, a square chip of silicon measuring 0.02 inches (0.5 mm) on a side holds the same capacity as the ENIAC, which occupied a large room" --220.101.28.25 (talk) 14:33, 28 December 2009 (UTC)
 * From History of Computing Hardware: "It could add or subtract 5000 times a second, a thousand times faster than any other machine. High speed memory was limited to 20 words (about 80 bytes)". So 2 GB of RAM is ≈25,000,000 times ENIAC's memory in bytes. If we assume one tick of the computer's clock per operation (it probably took several), a typical PC (several years old!) might run at 2 GHz+: 2,000,000,000 / 5,000 = 400,000x faster (conservatively) on clock speed alone. And today's computers can carry out more than one operation per 'tick'. These are very rough & quick calculations, but in the ballpark (I hope)--220.101.28.25 (talk) 14:56, 28 December 2009 (UTC)
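The back-of-envelope ratios in the reply above can be checked with a few lines. This is the same rough arithmetic, using the quoted figures (80 bytes of ENIAC memory, 5,000 operations per second), not authoritative ENIAC specifications:

```python
# Back-of-envelope comparison; all figures are the rough ones
# quoted in the discussion, not precise ENIAC specs.
eniac_memory_bytes = 80           # "20 words (about 80 bytes)"
eniac_ops_per_sec = 5_000         # adds/subtracts per second

pc_memory_bytes = 2 * 10**9       # ~2 GB of RAM
pc_clock_hz = 2 * 10**9           # ~2 GHz clock

print(pc_memory_bytes // eniac_memory_bytes)   # 25000000 -> ~25 million times the memory
print(pc_clock_hz // eniac_ops_per_sec)        # 400000 -> 400,000x on clock rate alone
```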

So ENIAC was about 0.0000005GHz and 0.00000008Gb RAM? 89.242.213.201 (talk) 22:31, 28 December 2009 (UTC)
 * I've removed the commas that someone put in my remark about - for one thing I do not think it is customary to put commas in decimal places, for another it is a breach of etiquette to alter someone elses writing. So the more correct figures are 0.00000005GHz and 0.00000008Gb RAM. 92.24.98.128 (talk) 12:17, 29 December 2009 (UTC)
 * Very close! On clock speed you're off by 10x, actually: 0.000005 GHz = 0.005 MHz = 5 kHz. The memory is right though!
 * As for the supercomputer question, "And I wonder how much time the world's most advanced computer now in December 2009 would take?": not very long! These things are running at petaflops (10^15 FLOPS - FLoating point Operations Per Second), which is about 1,000,000 GHz! The time the answer takes to reach the display from the computer is probably longer than the time spent calculating it!


 * According to Supercomputer the current fastest is the Cray Jaguar (1.759 petaflops) at the DoE's Oak Ridge National Laboratory, Tennessee, USA. (Jaguar has 224,256 Opteron processor cores.) For a general comparison the same article says, "Moore's Law and economies of scale are the dominant factors in supercomputer design: a single modern desktop PC is now more powerful than a ten-year-old supercomputer". I would guess that ENIAC was not even as 'powerful' as a simple A$5 calculator. --220.101.28.25 (talk) 01:43, 29 December 2009 (UTC)


 * I think I have an answer to "And I wonder how much time the world's most advanced computer now in December 2009 would take?"
 * "The new value of pi is 2,576,980,370,000 decimal places long, the result of a computation on T2K-Tsukuba in April of this year that took 73 hours and 36 minutes. The time included verification. The T2K-Tsukuba consists of 640 nodes with peak calculation of 95.4 trillion floating point operations per second." Full story HERE. Interesting that the time taken is similar, but the calculation is to about 1,000,000,000 times as many decimal places! Also this was 4 months ago, and this is (apparently) NOT "the world's most advanced computer", so the time taken could be reduced significantly.--220.101.28.25 (talk) 02:33, 29 December 2009 (UTC)

Is there any gaming website that has a recommender system?
Is there any gaming website (other than GameSpot) that has some recommender system? —Preceding unsigned comment added by 201.78.151.130 (talk) 17:33, 28 December 2009 (UTC)


 * Googling "video game recommender system" gave a number of results. One of the first results was this. I hope this helps. JW.. [ T .. C ] 20:55, 29 December 2009 (UTC)

Thanks —Preceding unsigned comment added by 201.78.207.222 (talk) 21:07, 30 December 2009 (UTC)

Is it possible...
To have a computer with the best of both worlds, or what? I mean with closed source and open source. Have been wondering about this as well. —Preceding unsigned comment added by Jessicaabruno (talk • contribs) 18:48, 28 December 2009 (UTC)


 * Short answer: Yes.
 * Longer answer: Even within the same project, there are ways to multi-license code that allow different licensing/sourcing schemes to coexist; mixing closed-source and open-source programs in general on one computer is even simpler.
 * Recommended reading: The Cathedral and the Bazaar is an interesting read about what the "best" of the different development models are. --Mr.98 (talk) 20:17, 28 December 2009 (UTC)


 * It is very likely that your current computer has both closed-source and open-source software. My Windows Vista machine, for example, has on it closed-source software like Microsoft Outlook as well as open-source software like Mozilla Firefox.  My Ubuntu machine has mostly open-source software on it, but also has closed-source NVIDIA video drivers on it.  There will always be both types of software, and many (I'd say probably most) computer enthusiasts work with both types daily.  Comet Tuttle (talk) 06:02, 29 December 2009 (UTC)

programming for beginner
I'm a complete newbie, so what are the best tools or programming languages to use for making extremely simple little programs that can do things like download URLs and stuff? —Preceding unsigned comment added by 82.44.54.127 (talk) 18:49, 28 December 2009 (UTC)


 * Perl or Python are both good languages to start with for that sort of programming. Just jump on in! I recommend working through one of the many "tutorials" you'll find on the web for either, just to get a sense of how the grammar works, and then jump right in to specific programming tasks, looking up functions and asking for help whenever you hit a brick wall. --Mr.98 (talk) 20:20, 28 December 2009 (UTC)
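For the specific task the question mentions, downloading a URL, a first Python program can be only a few lines using the standard library. This is a minimal sketch; the URL is just an example placeholder:

```python
# Fetch a web page and return its body as text.
from urllib.request import urlopen

def fetch(url):
    """Return the decoded body of a web page."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

# Usage (needs network access):
#   print(fetch("http://example.com/")[:200])
```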


 * For simple tasks, scripting languages (like Perl and Python suggested above, along with PHP, Ruby, and many many others) are nice. There's no compile time. Just script and run. Of course, you can even do shell scripting (even on DOS - or whatever Windows calls their CLI now). Learning any language is tough for a "newbie". Just pick up a book on whatever language seems of interest and start. Once you learn one language rather well, jumping to other languages is rather easy. I started with assembly language and since then I've only had to struggle with one language, Mumps. Every other language has been very easy to pick up. --  k a i n a w &trade; 20:26, 28 December 2009 (UTC)

Is Visual Basic a good thing to start with? I heard it has a gui which makes building programs very easy for beginners —Preceding unsigned comment added by 82.44.54.127 (talk) 20:36, 28 December 2009 (UTC)
 * My impression of Visual Basic and other Microsoft programming products is that they keep their developers on a never-ending technology treadmill that causes one's skills to become obsolete faster than they should. I would stick with an open and public project like Perl or Python, which (book sales aside) do not have a vested interest in getting you off the old thing and on to purchasing the new new thing.  --Sean  —Preceding unsigned comment added by 76.182.94.172 (talk) 20:58, 28 December 2009 (UTC)
 * In my opinion, the simplest coding language is HTML. If you're completely new, this will help you learn how tags work (even if commands are different in other languages). However, you cannot build a program in HTML. It seems like you want to learn how to develop programs more so than web applications, so Visual Basic may be a good start, even though it is considered somewhat nontraditional. Be aware that Visual Basic only runs on Windows (or possibly in Wine_(software) if you want to get technical...), so if you want to be cross-platform, maybe you should try C or related languages. If you do wish to try Visual Basic, Microsoft always has a free download of their latest beta, which would currently be Visual Studio 2010. --EpicCyndaquil (talk) 21:07, 28 December 2009 (UTC)
 * HTML isn't actually a programming language. It is a markup language. It doesn't let you do anything except format documents. To do anything more than that, you have to learn a real language—like Javascript, or PHP, or whatever. HTML just structures the output and makes it pretty. You can't do any real programming with it, though. --Mr.98 (talk) 21:46, 28 December 2009 (UTC)
 * There's a lot to be said for Perl and Python, although of the two I strongly suggest students pick up Python, as it is less of a "free form" language, and forces programmers to pick up good layout practices. I know a few universities - mine included - that teach Python as a first language before moving on to something like Java. That said, Visual Basic is quite a good choice, given that historically Basic was a teaching language as the syntax is a tad closer to English than some other languages, and Visual Studio.Net is a very nice tool, as you have heard. I've taught VB.Net (the current version of Visual Basic, more or less) to beginning students, and generally they've argued that it was a bit easier to learn than the alternatives we offered (mostly C# and Java). However, if your intent is to pick up other languages, the difficulty with learning VB is that it doesn't really prepare you for moving into other popular ones, such as Java, C++ and C#, due to the syntax. This is fine if you don't want to move into them, of course, and I've known many a dedicated VB programmer, but it can make changing languages a tad more difficult. :) As a compromise, I generally recommend C# - the syntax is a tad more difficult than VB.Net, so it isn't as easy, but in return you're learning a syntax that is (basically) common with Java, C++, C, PHP and Perl, and you still get to use the same GUI as with VB.Net. - Bilby (talk) 21:19, 28 December 2009 (UTC)


 * Visual Basic, as in old-school, Visual Basic 6.0, was pretty straightforward for a new programmer but had some limits. New, VB.NET, is less bounded by what it can do but is really a pain to get into—it is no longer as straightforward and simple as VB6 was. I wouldn't recommend VB.NET as a learning language, personally. It has a much higher learning curve than VB6 had.
 * If you're interested in an interface, I would personally go with learning PHP or some kind of web scripting. It's much easier to make a simple interface with web languages (because the interface controls are handled by the browser) than it is with a GUI language.
 * Once you understand the basics of programming, jumping into new languages is not so hard. The first one is the one that gives you real work—just making sense of how to structure a program, and learning how to look up the kinds of solutions you need to continue. Just pick a language, stick with it for a little while, and in not too long you'll be able to pick up all sorts of things if you want. --Mr.98 (talk) 21:46, 28 December 2009 (UTC)
 * I'd second that VB6 was much easier to learn than VB.Net. Thus VB6's reputation as a teaching language doesn't translate well to VB.Net. We used to teach VB6 as a first language to the business students. It was fine, but we had to make the change to VB.Net when Microsoft moved the emphasis over to it, which meant that the students had to learn Object Oriented programming. This made it a much more difficult language, so the question emerged (after teaching it for a couple of years) as to whether or not VB.Net, with the easier syntax but retaining the complex OO elements, had enough of an advantage over Java and C#. The answer was generally no, although I'm aware of other universities which stuck with it. - Bilby (talk) 22:07, 28 December 2009 (UTC)


 * Much of the decision by a university on a "beginning language" has to do with the nature of the university. I've taught at three very different universities.  At a strongly liberal arts one, Java was chosen because it was deemed to be "easy".  At a military one, Ada was chosen because many of the programmers will go on to the military and be required to use Ada.  At an engineering university, C++ was chosen because it is the base language for most real-world projects.  Due to the difficult nature of C++, "beginning" programming is three different classes.  One is for people who need to know what programming is, but don't need to know how to program.  Another is for people who have never programmed.  The third is the actual beginning class that requires you to either know how to program in some language or have taken the pre-beginner course.  I personally don't see what is hard about C++.  I mentioned that Mumps is the hardest language I've learned.  After that, Visual Basic was the hardest.  It is so unintuitive that I spent most of my time looking through books and searching the Internet to try and figure out how to do the simplest things.  When I found out a proper function name for something, it was simply painful to force myself to use a function name that in no way implied what the function was supposed to do - or often did the exact opposite of what the function name implied it would do.  Then, when I complained to VB programmers, they said that it was just my backwards C++ thinking that was causing all the problems. --  k a i n a w &trade; 22:24, 28 December 2009 (UTC)


 * I like the look of Rebol, which can do the things the OP described with one-line programs. The Rebol article is off-puttingly technical, but see the links to a tutorial and its very short programs at the base of the page. 89.242.213.201 (talk) 00:04, 29 December 2009 (UTC)
 * Some programmer-Luddite keeps removing the tutorial and other links from the Rebol page, so here they are: http://www.rebol.com/ http://re-bol.com/rebol.html http://www.rebol.com/what-rebol.html "REBOL rebels against software complexity. Do it in 1MB, not 200MB."78.149.161.55 (talk) 11:45, 30 December 2009 (UTC)

If you just want to do some simple stuff, like getting web-pages, read the manpages on wget and curl. That'll probably answer you pretty directly. If you need to do more, grep and, god forbid, perl will stitch those two together. Shadowjams (talk) 09:56, 29 December 2009 (UTC)

When I knew nothing about programming I started with HTML. After a year I started with PHP and CSS. Now, 4 years later, I am a web developer and can make some pretty optimized sites. If I had known how long and aggravating it would be to learn, I would have taken a course. It will come naturally with time. 142.176.13.22 (talk) 01:35, 1 January 2010 (UTC)