Wikipedia:Reference desk/Archives/Computing/2014 June 10

= June 10 =

64k intro
How can they make 64k intros so small with better graphics than many commercial games? — Preceding unsigned comment added by 100.33.64.30 (talk) 10:27, 10 June 2014 (UTC)


 * Well, the Demoscene authors spend a lot of time on these projects. They don't get paid, but they want to show off their skills. (I imagine an author might have a sweet 66k demo, and spend weeks shaving off the final bits). SemanticMantis (talk) 17:11, 10 June 2014 (UTC)
 * Actually, they start above 100k and compress that EXE. Sometimes they shave as much as 10k off by switching compressors. - ¡Ouch! (hurt me / more pain) 06:52, 16 June 2014 (UTC)
 * The article says "64k intros generally apply many techniques to be able to fit in the given size, usually including procedural generation, sound synthesis and executable compression." Now, procedural generation is used in many flagship AAA games (think e.g. Skyrim, Grand Theft Auto, etc.) However, since they don't have space concerns, realistic terrain is procedurally generated once, then just stored normally. This allows CPU cycles to be spent on other things. Sound synthesis certainly still has a place in modern indie games and retro fare, but it is usually considered unacceptable for a general-audience blockbuster. This is a very broad topic and outside my expertise, but you will probably have plenty to digest from these links. If you have specific questions about a certain technique, it might be better to ask a new question. Finally, "better" is very subjective. I think many of these demos look quite cool, but none of them that I've seen can get anywhere near the photorealism of pre-rendered modern CGI. SemanticMantis (talk) 17:11, 10 June 2014 (UTC)
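To make "procedural generation" concrete: instead of shipping terrain data, an intro ships a tiny function that computes the terrain on demand. Here is a minimal illustrative sketch (Python for readability; real intros use tightly packed C/assembly, and the hash constants here are arbitrary choices, not taken from any actual demo):

```python
import math

def hash01(ix, iy, seed=0):
    # Deterministic integer hash -> value in [0, 1]; stands in for stored data.
    h = (ix * 374761393 + iy * 668265263 + seed * 982451653) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 0xFFFFFFFF

def value_noise(x, y, seed=0):
    # Smoothly interpolate between hashed values at the four grid corners.
    ix, iy = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - ix, y - iy
    sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)  # smoothstep
    v00, v10 = hash01(ix, iy, seed), hash01(ix + 1, iy, seed)
    v01, v11 = hash01(ix, iy + 1, seed), hash01(ix + 1, iy + 1, seed)
    top = v00 + sx * (v10 - v00)
    bottom = v01 + sx * (v11 - v01)
    return top + sy * (bottom - top)

def heightmap(w, h, scale=8.0, seed=0):
    # A whole terrain from a few lines of code; only the seed needs storing.
    return [[value_noise(x / scale, y / scale, seed) for x in range(w)]
            for y in range(h)]
```

The entire landscape is reproducible from the seed, so the only bytes that ship are the code itself — which is the whole trick.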


 * 64K of (compressed) code is actually quite a lot. I don't think it would be too hard to fit a decent 3D renderer in that space. 3D meshes and textures are a different matter. Most of that data must be algorithmically generated, though how they did that in the case of Gaia Machina, for example, I have no idea. As for "better graphics than commercial games", keep in mind that they can target super-high-end graphics cards, while commercial games have to work on their customers' low-end hardware. If you're generating your textures and meshes algorithmically anyway, it's no harder to generate them at a very high resolution.
 * Here's something I really can't comprehend: "untraceable" by TBC, a 1K intro. It makes 64K look downright unimpressive. -- BenRG (talk) 07:51, 11 June 2014 (UTC)
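BenRG's resolution point is the key property of procedural assets: since vertices come from a formula rather than stored data, a dense mesh costs no more bytes of program than a coarse one. A hypothetical sketch (Python, illustrative only — not how any particular demo does it):

```python
import math

def sphere_mesh(n_lat, n_lon, radius=1.0):
    # Vertex positions come from a formula, not stored data, so a dense mesh
    # costs no more bytes of program than a coarse one -- just more samples.
    verts = []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat          # latitude angle, 0..pi
        for j in range(n_lon):
            phi = 2.0 * math.pi * j / n_lon  # longitude angle, 0..2*pi
            verts.append((radius * math.sin(theta) * math.cos(phi),
                          radius * math.cos(theta),
                          radius * math.sin(theta) * math.sin(phi)))
    return verts
```

Doubling `n_lat` and `n_lon` quadruples the vertex count without adding a single byte to the executable, which is why generated assets scale to "very high resolution" for free.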


 * Another amazing one is "Elevated - 4k". The demoscene has interested me since the 1980s (I still have my Atari ST). Talking to the people who make these, I am really impressed with how much they manage to compress their code, and how they save on every byte of data. It also makes me wonder how some games take over 10 gigabytes of disk space. I imagine that time spent compressing code, etc., costs more in development time for large studios than it's worth to them. Zzubnik (talk) 10:37, 11 June 2014 (UTC)


 * A lot of modern games don't even bother to use off-the-shelf compression libraries like zlib or Vorbis. It's pretty common for the majority of the space to be used by sound files in N different languages.
 * You can't "procedurally compress" your assets after the fact. If you're going to use procedural generation you have to use it from the beginning, and you can't use standard painting and modeling tools. Also, the "decompression" takes a lot of CPU time on the end-user's machine. The Youtube videos of these demos don't show the very long load times. -- BenRG (talk) 23:08, 11 June 2014 (UTC)


 * (ec)
 * Some things, like realistic in-game items, are quite hard to code procedurally. Games which are made for profit don't use a lot of procedural generation, mainly because it's hard to find a good way to code item X procedurally – and "hard" means that it takes time, and time is money. You could finish your game earlier if you don't have to code items in a close-to-minimal way, or work with a smaller (cheaper) team, or one without any procedural-generation experience. Space on a DVD, OTOH, is cheap.


 * BTW, you can code a 3D renderer which only processes spheres within 16K (depending on other accessories: texturing, physics, etc.) even without compression.
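A spheres-only renderer can be that small because the core hit test is just a quadratic equation. An illustrative sketch of that core (Python; a real 16K renderer would be hand-tuned C or assembly):

```python
import math

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest t > 0; None means a miss.
    oc = tuple(o - c for o, c in zip(origin, center))
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None
```

Loop this over every pixel's ray and every sphere, shade by the surface normal at the hit point, and you have a complete renderer — no mesh data structures, no rasterizer.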


 * Finally, is it really 64K if it takes a multi-gigabyte OS to run (or XP, which is still close to 1 GB even if you trim it down to the bones) and multi-megabyte drivers (look at the download sizes of current DirectX and gfx drivers)? Only the DOS demos were even close to "true" 64K in that sense. And from a game developer's POV, there is just no point in keeping a game below 1 megabyte today, because the difference will just not be "felt" by the audience.


 * So, 64K intro coding is an eccentric sport rather than a truly useful programming style. There are games of that size (.kkrieger is one famous 96k example) but from a commercial POV, size doesn't matter if it doesn't open up a new market (i.e. a substantial number of low-end PC owners). - ¡Ouch! (hurt me / more pain) 10:54, 11 June 2014 (UTC)
 * Many (most?) roguelike games have procedurally generated items. So does the Diablo series, which is definitely a commercial powerhouse. But most roguelikes are not "commercial" games, and they tend to have very little in terms of "impressive" graphics :) However, procedural generation is definitely becoming more popular in some gaming circles. I'm not sure that it always takes 'more time' to procedurally generate. Take e.g. Spelunky -- I'd argue that it took far less time to hammer out the procedural level generation than it would have to make millions of levels by hand. SemanticMantis (talk) 15:37, 11 June 2014 (UTC)
 * Diablo I and II use procedural generation to make random levels; 64k intros use it on many more levels (pun intended): procedural sounds and textures are virtually required to achieve competitive graphics, and some use it to make 3D meshes on the fly. For example, about 10 lines of C code are enough to make somewhat realistic clouds or the data for a height field (sometimes called "voxels", but that's a misnomer). Yes, they make the entire item model procedurally.
 * Compression of 64K intros is often quite extreme; they use the same subroutine for different purposes (like clouds and 3D mountains – it only depends on the interpretation of the values generated: brightness/opacity makes clouds, but if you use the same values as elevation data, you have a mountain range). Without compression (but with procedural generation), most 64K intros would weigh in at about 200K. But if you can use your own EXE compressor (rather than a general-purpose product like PKLITE or EXEPACK – WTF? Both EXE packers are red links??? Sort of shows how they've fallen into disuse with multi-gigabyte disks), and you concentrate on a coding style which produces recurring patterns (in code, i.e. you set your brain to procedurally generate a piece of code that does procedural generation), you can compress 200K down to 64K. By fine-tuning the compression parameters, they can often squeeze in another part; that's why you usually get a "compo" / "party version" (both "compo" and "party" refer to the intro competition) and a "final" version with more features (like an extra part, or support for higher resolution / antialiasing, more cores, surround sound, speed optimizations), a lower EXE size, or both. This takes time which software companies would be unwilling to pay for, even more so since they focus on a wealth of features (which would take much more time to slim down, not to mention to debug).
 * Taking Diablo II as an example, it is much easier to have a pixel artist draw another sword sprite and add another line to the item stats table than to work out a parameter combination that results in another sword-like object on-screen... or you do away with new graphics entirely and just add another line of stats to an existing sprite (DII vets know these as exceptional items, which look like cheaper, less powerful items). - ¡Ouch! (hurt me / more pain) 06:48, 13 June 2014 (UTC)
 * Late addition: The "compressors" are quite different from any other product, because their goal is to compress a single file to 64k or even 4k (these are the two major competitions). Sometimes it can be better to use a slightly less capable compressor if it's smaller; at 4K, even the size of the decompressor can make a huge difference. - ¡Ouch! (hurt me / more pain) 06:52, 16 June 2014 (UTC)
 * p.s. I made a "tron"-like game once, which weighed in at exactly 6K. Standard VGA mode (640x480x4bpp), not via the Borland Graphics Interface but my own subroutines: (1) set graphics mode, (2) set single pixel, (3) set text mode. The rest was quite genuine Borland Pascal 6.0 code. It was great 2-player fun, too. - ¡Ouch! (hurt me / more pain) 06:52, 16 June 2014 (UTC)
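The clouds-vs-mountains point above can be shown directly: one noise routine, two interpretations of its output. A toy sketch (Python, single-octave hashed noise with arbitrary constants; an actual intro would layer several octaves and write this in C or assembly):

```python
def h01(ix, iy, seed=7):
    # One deterministic noise routine shared by both "assets".
    v = (ix * 73856093 ^ iy * 19349663 ^ seed * 83492791) & 0xFFFFFFFF
    v = ((v ^ (v >> 13)) * 0x5BD1E995) & 0xFFFFFFFF
    return (v ^ (v >> 15)) / 0xFFFFFFFF

def field(w, h, seed=7):
    return [[h01(x, y, seed) for x in range(w)] for y in range(h)]

def as_clouds(f):
    # Read the values as brightness/opacity -> a cloud layer.
    return [[int(255 * v) for v in row] for row in f]

def as_terrain(f, peak=2000.0):
    # Read the very same values as elevation -> a mountain range.
    return [[peak * v for v in row] for row in f]
```

Both "assets" cost one shared subroutine plus a few bytes of interpretation code, which is exactly the kind of reuse that keeps the uncompressed intro small in the first place.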

What is it that makes expensive laptops worth the money?
I've been browsing laptops, and I can't work out what the £800 ones are better at than the £400 ones. This Acer (link below) seems to have as much if not more RAM, storage, screen size, processor power, etc., than anything else I could find at a higher price (or MacBooks, which seem insanely pricey for their specs). What am I missing? It isn't even heavier, nor does it have lower battery life, than most of the more expensive ones. I realise this sounds like I'm trying to advertise the thing, but I just don't understand what I'm missing in this market. There must be something worth paying more for with the high-end laptops.

http://www.pcworld.co.uk/gbuk/laptops-netbooks/laptops/laptops/acer-e1-572-15-6-laptop-iron-22081527-pdt.html

86.159.119.168 (talk) 14:54, 10 June 2014 (UTC) (Who owns an Acer, but not that one. Full disclosure.)


 * Note: You have to scroll way down at that link to get to the product description. StuRat (talk) 15:08, 10 June 2014 (UTC)


 * Please also include a link for the more expensive laptop you want us to compare against. StuRat (talk) 15:08, 10 June 2014 (UTC)


 * I've been using MacBooks since the time they were still called PowerBooks - back then I did a comparison with other brands, and at equal specs they came out about equal in price. What sets them apart from lesser ;-) laptops is not always easily quantifiable. One thing is size, weight, and build quality. The keyboard is responsive and quiet. The touchpad is, as far as I can tell, the best anywhere. The screen resolution is insane by now, but other aspects of the screen are also good (backlight, viewing angle, responsiveness, colour quality). Suspend and resume just works (tm), and did so long before that was standard. But to be fair: I could now do nearly everything I need to do with a laptop 1/3rd the price. It would just be a lot more irritating to me and a little bit more irritating to my environment. --Stephan Schulz (talk) 15:16, 10 June 2014 (UTC)
 * Looking at the first £799 laptop on the site (http://www.pcworld.co.uk/gbuk/laptops-netbooks/laptops/laptops/sony-vaio-fit-15-e-svf1532k4eb-15-5-laptop-21962943-pdt.html), it has an i7 processor instead of an i3, full HD resolution, longer battery life, and a discrete graphics card with 1 GB of dedicated RAM. I'm sure there are more differences. Whether you need or even notice these features depends on your usage profile (and vision quality, in the case of the screen ;-). --Stephan Schulz (talk) 15:25, 10 June 2014 (UTC)


 * Wow, I didn't expect such a great answer so quickly: thanks a lot Stephan! Clear, specific, but also addressing general differences. Many thanks, I think this question is answered. 86.159.119.168 (talk) 15:34, 10 June 2014 (UTC)


 * Also note that as you near the limits of what can be packed into a given package, the cost starts to skyrocket. It's part of the law of diminishing returns.  That is, doubling some spec may more than double the price.  For example, to double the hard drive capacity, they can't just add in another unit, they must go with a more expensive technology to double the capacity while keeping the weight and size about the same.  StuRat (talk) 15:39, 10 June 2014 (UTC)

Difference between Unix and Unix-like
This question was asked here before but it got deleted. So I'm posting my reply again here, because I'm genuinely interested to know if it's correct:

As far as I understand, "Unix" is a trademark that was previously owned by AT&T but later transferred to an entity called The Open Group. "Unix-like" simply means an operating system that is fully compatible with Unix on the API level but has not been officially given the right to use the trademark "Unix". As far as I can remember, Linux, by far the most popular Unix-like system, and probably the only Unix-like system 99.999999% of the people in the world have ever heard of, does not have the right to call itself "Unix". Note that Linus Torvalds started coding the Linux kernel from scratch, whereas by far most of the other Unix and Unix-like systems forked off an existing codebase, dating way back to when Ken Thompson originally created Unix in the first place. Is my understanding correct? J I P | Talk 17:28, 10 June 2014 (UTC)
 * Yes, mostly. One problem is that there is no single UNIX (or Unix) (tm) system that any system can be compatible with. There is POSIX and the Single UNIX Specification, but that's an after-the-fact effort, and describes a (mostly) shared subset of functionality. In general, you can classify systems as genetic Unix, legal Unix, and functional Unix. Unix development split early on between the AT&T System X strand and the Berkeley BSD versions, but with frequent cross-pollination. BSD Unix was eventually cleaned of all AT&T code, and NeXTStep and OS-X even replaced the BSD kernel with a new kernel built on top of the Mach microkernel. OS-X is certified UNIX despite having no line of AT&T code. Linux was never pushed through certification, as far as I know. Here is a list of certified Unixes. Note that this includes HPSUX, one of the least compatible of the Unices, but not Version 6 Unix, which long predates any certification attempts. --Stephan Schulz (talk) 17:57, 10 June 2014 (UTC)

FYI, this question was asked by the Venezuelan IP troll who is banned from Wikipedia. 190.207.201.28 (talk) 18:30, 10 June 2014 (UTC)
 * I'm aware of that, but it was I who wrote the above text, and it's entirely in my own words, not including any of the banned user's content. Just because I responded to the banned user shouldn't mean I am now officially forbidden from asking the same question entirely earnestly, because I'm genuinely interested to know. J I P | Talk 18:32, 10 June 2014 (UTC)


 * The "Unix" trademark can only be used by systems that are certified as meeting the Single UNIX Specification. Anything else (even fully-compatible systems that have not been through the certification process) is "Unix-like". --Carnildo (talk) 01:49, 11 June 2014 (UTC)
 * In the spirit of providing a full treatment to this issue, here's a link to several GNU information pages, in particular, the GNU Manifesto and "Why GNU/Linux?" Nimur (talk) 02:56, 11 June 2014 (UTC)
 * I'd challenge the idea that 99.999999% of the people in the world even know one unix-like OS, or a computer for that matter... - ¡Ouch! (hurt me / more pain) 12:44, 11 June 2014 (UTC)
 * What I was implying with the statement was that in regard to Unix-like systems, 99.999999% of the people in the world have only heard of Linux, if even that. To put it the other way around, only 0.000001% of the people in the world know there even are other Unix-like systems than Linux. J I P | Talk 18:45, 11 June 2014 (UTC)
 * That can't be right. There must be more than 75 people in the world who know about Unix systems.  There must be more than 75 Solaris administrators in the world.  Robert McClenon (talk) 18:53, 11 June 2014 (UTC)
 * I should think I'm allowed a bit of hyperbole. But the general idea is clear: Of the people who know about computers, only a vanishingly small part know there are other operating systems than Windows. And of those, only a small part know about other Unix-like systems than Linux, or that there even are such concepts as "Unix" or "Unix-like", not just Windows and Linux. J I P | Talk 19:00, 11 June 2014 (UTC)
 * I partly agree and partly disagree. First, you used not a bit of hyperbole but a very large amount of hyperbole (or rhetorical understatement).  I agree that most people who know about computers know either only about Windows or only about Windows and Mac, and not about Linux or other Unix-like systems.  IT professionals know about Linux and about other Unix systems and Unix-like systems.  There are more than 75 IT professionals in the world.  I don't think that most of the people who know about Unix-like systems know only about Linux.  I might have ignored 1% as marginal rhetoric, but  0.000001% ?  Didn't you know that a retired IT professional would do the arithmetic?  If  you were being sarcastic, remember that, on the Internet, no one knows that you are being sarcastic.  If you were just engaging in rhetorical exaggeration or rhetorical understatement, then you were engaging in rhetoric.  Robert McClenon (talk) 19:15, 11 June 2014 (UTC)
 * Just to throw out another thought, literally 'every' scientific user of Mac/OSX that I know, knows that it is *NIX under the hood (Darwin_(operating_system)). Part of why many scientists like OSX is that it has a clean, polished GUI out of the box, but also has a full shell/command line interface ready to go. I have no idea what Windows is up to these days, but for a long time it didn't have good tools for scientific workflow built into the OS. SemanticMantis (talk) 14:16, 12 June 2014 (UTC)

FSX online activation
Does anybody know how exactly the FSX online activation system works? In particular, how exactly does it make sure that the same copy of FSX is not installed on two different computers at the same time? And how does it check whether two different copies of FSX are installed on the same computer (and DOES it check for that at all)? Thanks in advance! 24.5.122.13 (talk) 20:40, 10 June 2014 (UTC)

Question retracted -- found the info I needed on Google and on the FSX forums. 24.5.122.13 (talk) 01:31, 12 June 2014 (UTC)