Talk:Supersampling

Wouldn't the Jitter algorithm be a combination of the grid and random algorithms, not the grid and Poisson Disc algorithms? - Potski, 27th August 06


 * It depends on your point of view. It's a combination of the grid and Poisson disc when talking about the end result, and this is the way sampling algorithms are compared in most cases (at least in my experience). See, it's the end result that counts, not the way you achieved it :). And there are more than four; these are just the most common ones. But yes, if you look at it from a technical point of view, you're right, it's the random and grid.
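 * For what it's worth, the "grid plus random" construction above can be sketched in a few lines. This is a hypothetical illustration, not code from the article: one random sample is placed inside each cell of a regular subgrid, so the placement is random but the overall distribution keeps the grid's structure.

```python
import random

def jittered_samples(width, height, n):
    """Generate n*n sample positions per pixel using a jittered grid:
    one uniformly random sample inside each cell of a regular n x n
    subgrid, combining the grid and random sampling patterns."""
    samples = []
    for py in range(height):
        for px in range(width):
            for sy in range(n):
                for sx in range(n):
                    # Regular grid cell origin plus a random offset
                    # within the cell: grid structure, random placement.
                    x = px + (sx + random.random()) / n
                    y = py + (sy + random.random()) / n
                    samples.append((x, y))
    return samples

pts = jittered_samples(2, 2, 4)  # 2x2 pixels, 4x4 samples each
```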


 * It's really quite pointless. But I agree, this should be cleared up in the article. If you were confused, someone else will be too. xompanthy 23:56, 30 August 2006 (UTC)

Aliasing is not simply a jagged or pixelated edge; that view overlooks the underlying phenomenon and is, most of the time, the wrong way to approach it. As you may well know, aliasing is the overlapping of higher-frequency components, and since edges contain the highest frequencies, the distortion is first noticed at the edges. —Preceding unsigned comment added by 195.174.131.27 (talk) 18:47, 29 October 2007 (UTC)


 * This article uses a very basic explanation of aliasing on purpose. The article on aliasing has more in-depth information and is wikilinked in the very first sentence of the text, for anyone who wants to read more about it. Bottom line, this article is about supersampling, not aliasing. -- Xompanthy 00:22, 3 November 2007 (UTC)

Isn't the text in the image misspelled? Shouldn't it be "positions"? —Preceding unsigned comment added by 213.164.205.5 (talk) 11:40, 10 December 2009 (UTC)

http://www.neoseeker.com/Hardware/faqs/kb/10,72.html - this link is bad (not dead, but it no longer points to what it used to point to). --Simplexxx (talk) 12:43, 12 August 2010 (UTC)

In the minecraft image, I don't see a difference from left to right. Maybe top/bottom was meant? --134.34.7.59 (talk) 09:19, 15 September 2017 (UTC)

Types of supersampling
I think there should be images as example results for each type of supersampling. --NeatNit (talk) 18:15, 4 December 2011 (UTC)

Method section
I found the source the editor had used (it was in the link farm below), so I sourced the Method section with it. That is why I removed the tag at the beginning of the section. Hervegirod (talk) 10:01, 19 April 2020 (UTC)


"Aliasing" vs. "Imaging"
This Wikipedia article should be revised because, unfortunately "as usual", the phenomena "aliasing" and "imaging" are thrown together, causing unnecessary confusion.

Quote:

"Aliasing occurs because unlike real-world objects, which have continuous smooth curves and lines, a computer screen shows the viewer a large number of small squares."

This sentence brings up a bunch of different aspects which should be differentiated. Otherwise, one might get the wrong impression that aliasing is a direct consequence of the limited (sampled) resolution of the display device, which isn't the case. That is actually the "imaging" part, the counterpart to aliasing that occurs during reconstruction.

Aliasing occurs because the sampled input signal contained spatial frequencies which exceeded the Nyquist/Shannon limit during sampling. That, however, resides on the recording (scanning/sampling/rendering) side of the signal chain. While a pixel-based display may indirectly affect the sampling frequency used (for instance if continuous functions are rendered at the pixel count destined for "playback"), the two aren't necessarily correlated. It would be entirely possible to sample at a much higher rate and downscale the result afterwards, which, given proper filtering, would lead to less aliasing (but the same amount of imaging) despite the same discrete pixel-based display being used at the end.
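To make the "sample higher, then downscale" point concrete, here is a minimal hypothetical sketch (not from the article): a scene rendered at 4x resolution is reduced with a simple box filter, so edge pixels end up with intermediate values instead of hard steps. Note that a box filter is only a crude low-pass; a proper reconstruction filter would do better, but the principle is the same.

```python
def downsample_box(hi_res, factor):
    """Average factor x factor blocks of a supersampled grayscale image
    (a list of rows) down to display resolution -- a simple box filter."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hypothetical vertical black/white edge rendered at 4x resolution:
hi = [[1.0 if x < 6 else 0.0 for x in range(8)] for _ in range(8)]
lo = downsample_box(hi, 4)  # 2x2 result; the edge column averages to 0.5
```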

Not all computer screens are pixel-based displays (e.g. good old CRTs); that is rather a property of modern ones. But even so, the distortion introduced by their lack of proper filtering and their pixel-based output (ideally, the image would also be entirely reconstructed into a continuous "drawing") should be conceptually separated, i.e. called "imaging" as in the audio domain.

A question which remains to be clearly answered for me, however, is why it seems to be so damn difficult to implement such a *proper* anti-aliasing/low-pass filter at the rendering stage in computer graphics. Apparently, the polygon-based spatial resolution of such graphics is nominally infinite as well (otherwise, aliasing wouldn't occur), and the low-pass filter has to be implemented mathematically here, whereas in e.g. camera systems it may be done optically before the signal hits any sample-rate limiter and quantizer.

[little-endian], 14.04.2024, 14:52 CET. 2A01:599:920:948D:DE:FE23:C08B:A814 (talk) 12:52, 14 April 2024 (UTC) Edited: [little-endian], 15.04.2024, 15:25 CET.