File talk:United States Income Distribution 1967-2003.svg

Is the chart misleading?
Could this chart be misleading because the percentiles used are not uniformly distributed, as a naive reader might expect? The comparatively larger gap between line 3 (50th percentile) and line 2 (20th percentile), versus the gap between line 2 and line 1 (10th percentile), is at least in part due to the bigger 'step size' (30 percentiles vs. 10). At the other end, the small step size between the 90th and 95th percentiles actually de-emphasizes how large the income gap is becoming at the higher percentiles. Astockdale (talk) 21:22, 5 August 2008 (UTC)
 * A fair concern. What is graphed is the data available.  If data were available for every 5%, that would be ideal, but I'm not aware of a source.    A better graph of the available data may be possible, but I'm not sure what it would look like.  If you can find a source with more evenly spaced data, or can suggest a better presentation, please let me know; I'm game to try something new. — Alan De Smet | Talk 15:22, 16 October 2008 (UTC)

Here is the uniform-distribution table: http://www.census.gov/hhes/www/income/histinc/f01AR.html. It has 40th, 60th and 80th percentiles.
 * I like it, especially because it goes further back and is more recent. Here is the updated graph. — Alan De Smet | Talk 18:21, 12 April 2009 (UTC)

Linear vs Logarithmic
Using a linear graph for this data is very misleading.

If you look at the logarithmic version of the graph, you will see that the gaps between the percentiles do not change much, if at all, over the years. This is the correct view.

On a linear graph the gaps appear to grow in the later years. This is not a real change, it's just inflation, and does not indicate a change in purchasing power.

For example: suppose in 1900 you had a regular guy who earned $1,000 and a rich guy who earned $2,000. The rich guy earns twice as much. Now fast forward to 1998. The poor guy earns $10,000 and the rich guy $20,000. The gap did not change; the rich guy still earns twice as much as the poor one, and both had their dollars inflate by 10 times.

If you plot this on a linear graph, you will see a tiny $1,000 difference in 1900 and a massive $10,000 difference in 1998. It looks like the rich guy's earnings have grown 10 times as fast as the poor guy's, but in reality that's not the case.
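The arithmetic in this example is easy to check. Here is a minimal sketch (using the hypothetical figures from the example above, not real census data) showing that the gap a linear axis displays grows tenfold, while the gap a log axis displays, which encodes the ratio, stays constant:

```python
# Hypothetical figures from the example above (not real data):
# both earners' incomes inflate tenfold between 1900 and 1998.
import math

incomes = {1900: (1_000, 2_000), 1998: (10_000, 20_000)}

for year, (regular, rich) in incomes.items():
    linear_gap = rich - regular          # what a linear axis shows
    log_gap = math.log2(rich / regular)  # what a log axis shows (the ratio)
    print(year, linear_gap, log_gap)
```

The linear gap goes from 1,000 to 10,000, while the log-scale gap is log2(2) = 1.0 in both years, so on a log axis the two lines stay a constant distance apart.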

This is why you use a logarithmic graph. Any time you are plotting change over time, you should use a logarithmic scale. In fact, if you always use logarithmic and never linear, you will rarely be wrong.

Ariel. (talk) 10:28, 4 January 2010 (UTC)


 * I think you've overlooked "Numbers are normalized to 2003 United States dollars." Inflation has been factored in, eliminating the need to compensate with a logarithmic scale. — Alan De Smet | Talk 17:51, 4 January 2010 (UTC)


 * The inflation was just an example. A logarithmic scale applies any time you are measuring a change where what's important is the relative change rather than the absolute change. With dollars you care whether money doubles or halves. You don't care how many dollars the difference is, only the relative change. You care about the percent of change, not the number of dollars of change.


 * BTW, I'm sorry if I came across too harsh in my upload comment. I was rather irritated with the whole Income inequality in the United States article because it constantly made the same error, making it seem like the rich had a much larger growth rate relative to the poor than they really did. The graphs at the top were a perfect example, and you made it so easy for me to make new versions of them (thank you for that). Almost every single graph in that article is wrong, and I'd love to fix them, but the uploaders did not include source data like you did. A lot of the text of the article is wrong too. Ariel. (talk) 07:04, 6 January 2010 (UTC)

Totally misleading
Not many people will notice that the figures on the left increase by multiples, not by equal increments. This makes the curves (which are precisely what this graph should represent) appear pretty much the same. The top line should be going "ping!" up in the air.  Wik idea  16:28, 25 January 2010 (UTC)

The logarithmic scale is the misleading one! Not the linear scale!
I usually don't find anything to complain about on Wikipedia. I am very frustrated by Ariel changing the scale to logarithmic on file "United States Income Distribution 1967-2003.svg". This is totally erroneous. The linear scale is the unfiltered truth!!! Why choose a base-two log? Any reason, or is it just that it "looks right" to Ariel? The numbers are factual; don't distort them when drawing them. I do not wish to sign up at this moment, because I don't foresee myself being a contributor in the near future. Regarding this article I can be reached at yared_tadesse@yahoo.com, if necessary.

Yared

68.98.175.67 (talk) 04:36, 23 February 2010 (UTC)

Is this before- or after-tax income?
I don't see that explained anywhere.

Mliggett (talk) 17:29, 12 December 2010 (UTC)

Logarithmic seems biased.
The data points on the left side grow exponentially, which obscures the difference between growth at the top of the chart and at the bottom. For example: the 10th-percentile income starts at $7,790 in 1967, while the 95th-percentile income starts at $88,678. In 2003 the 10th-percentile income is $10,536, about 35% growth, while the 95th-percentile income is $154,120, about 74% growth, double the growth rate of the 10th-percentile bracket. Yet this graph does not show that, since the exponential scale on the left side flattens out the lines and hides the real growth.
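The growth percentages quoted above can be verified in a few lines (a quick sketch; the dollar values are the ones cited in the comment):

```python
# Values cited in the comment above (2003 dollars).
start_10, end_10 = 7_790, 10_536     # 10th percentile: 1967 -> 2003
start_95, end_95 = 88_678, 154_120   # 95th percentile: 1967 -> 2003

growth_10 = end_10 / start_10 - 1    # relative growth of the 10th percentile
growth_95 = end_95 / start_95 - 1    # relative growth of the 95th percentile
print(f"{growth_10:.1%} {growth_95:.1%}")  # roughly 35% vs. 74%
```

The 95th-percentile growth is indeed more than double the 10th-percentile growth over this span.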

It's not even a standard logarithmic scale; it's a complicated one. Why double each unit? Why not halve each unit? Why not use a multiplier like 1.786513229? This chart is completely arbitrary with its custom scale. It should be linear, but if it MUST be logarithmic, make it a standard function. —Preceding unsigned comment added by 12.40.84.157 (talk) 19:25, 29 March 2011 (UTC)
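On the base question raised above: the choice of log base only rescales the axis by a constant factor, since log_b(x) = ln(x) / ln(b), so the plotted shape is identical in base 2, base 10, or any other base. A quick sketch demonstrating this (the multiplier 1.786513229 is the arbitrary one from the comment; the income values are samples from the chart's data):

```python
# A log axis in any base b just rescales positions by a constant factor,
# since log_b(x) = ln(x) / ln(b); the curve's shape is unchanged.
import math

xs = [7_790, 10_536, 88_678, 154_120]  # sample income values
base_a, base_b = 2, 1.786513229        # base 2 vs. the arbitrary multiplier

pos_a = [math.log(x, base_a) for x in xs]
pos_b = [math.log(x, base_b) for x in xs]

# The two sets of axis positions differ only by a constant scale factor.
factor = math.log(base_a) / math.log(base_b)
assert all(abs(b - a * factor) < 1e-9 for a, b in zip(pos_a, pos_b))
```

So a base-2 scale is not more "complex" than any other; whichever base is chosen, the lines' shapes and relative slopes are the same, and only the tick labels differ.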