Talk:Shannon index

Simplify?
For the beginning Ecology student, this page makes me want to run away from Ecology. Not only are all the proofs intimidating to the regular layman, the page doesn't even give a simple explanation of what the end result tells you...

By googling Shannon Index, there's a link (www.instruction.greenriver.edu/.../Shannon%20Diversity%20Index.doc) that sums it up better than anything on this page.

"High values of H would be representative of more diverse communities. A community with only one species would have an H value of 0 because Pi would equal 1 and be multiplied by ln Pi which would equal zero.  If the species are evenly distributed then the H value would be high.  So the H value allows us to know not only the number of species but how the abundance of the species is distributed among all the species in the community."
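The behaviour described in that quotation is easy to check numerically. Here is a small Python sketch (`shannon_index` is just an illustrative helper written for this example, not a function from any particular library):

```python
import math

def shannon_index(counts):
    """Shannon index H = -sum(p_i * ln(p_i)), with p_i the proportion of species i."""
    total = sum(counts)
    return -sum((n / total) * math.log(n / total) for n in counts if n > 0)

# A community with only one species: p_1 = 1, ln(1) = 0, so H = 0.
print(shannon_index([50]))

# Four species, evenly distributed: H reaches its maximum, ln(4) ~ 1.386.
print(shannon_index([25, 25, 25, 25]))

# Four species, one strongly dominant: H is much lower.
print(shannon_index([97, 1, 1, 1]))
```

As the quotation says, the index reflects both how many species there are and how evenly individuals are spread among them.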

Please do better

Wow this helped me with my HW tremendously... can we add that quotation to the page? — Preceding unsigned comment added by 73.149.68.235 (talk) 00:04, 9 April 2015 (UTC)

misleading
The introduction to this article gives the impression that the primary or even the only use of this function is measuring species diversity. What if there were an article titled "cardinal number" that said the purpose of cardinal numbers is enumeration of animals in a forest? The sequence {1, 2, 3, ...} is used for counting animals in forests; it is not used for counting anything else; it is known only to those who have studied the counting of animals in forests. That's all.

That's what this article does. Michael Hardy 23:33, 23 May 2007 (UTC)

Nice analogy. But what else is the Shannon index used for, Michael? I have only seen it used as a species diversity measure. Stray 23:40, 1 June 2007 (UTC)


 * The Shannon index seems identical to information entropy, which is used by many branches of math and engineering. I think Michael's point is that the article needs to say this. hike395 07:47, 18 June 2007 (UTC)

There are several examples of this equation: Boltzmann (physics and chemistry), Lotka-Volterra (species interactions), Shannon (communication theory). In ecology, we learn/teach it as Shannon's index (sometimes Shannon-Wiener, etc.). The largest misconception about diversity measurement is that many of the more popular indices are better or worse than one another. Rényi's 1961 article on information theory proves that many of the commonly used indices of diversity in biology are related through a single equation he termed entropy of order alpha. As alpha changes in the equation, the index changes, and therefore the resulting meaning of the index changes.

Information theory has been a cornerstone for many of the more popular trends (used for lack of a better word) in very broad fields. For example, the Kullback-Leibler distance measure is the basis for Akaike's Information Criterion (AIC) in model selection, which is a very popular approach. When we look at the Kullback-Leibler formula, it is Boltzmann, Lotka-Volterra, Shannon with a twist. In biology Shannon's index measures nothing more than the information entropy, where the categories are the species and their proportional abundances (assuming the jointly considered samples sum to unity) represent some probability of occurrence within the larger population. Used by itself, Shannon's index has very narrow interpretation and utility. See Juhász-Nagy, Podani, Orlóci for uses of Shannon's and related indices for spatial modeling in plant communities and higher-order interpretations relating to function and pattern detection. —Preceding unsigned comment added by 205.200.66.138 (talk) 18:40, 3 March 2009 (UTC)
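The point about Rényi's entropy of order alpha unifying the common indices can be illustrated with a short Python sketch (an illustration of the standard formula, not code from any cited source):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha: (1/(1-alpha)) * ln(sum(p_i^alpha))."""
    if abs(alpha - 1.0) < 1e-12:
        # The limit alpha -> 1 recovers the Shannon index.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.3, 0.2]
print(renyi_entropy(p, 0))   # ln(3): log of species richness
print(renyi_entropy(p, 1))   # the Shannon index
print(renyi_entropy(p, 2))   # -ln of the Simpson concentration sum(p_i^2)
```

Varying alpha sweeps through richness (alpha = 0), Shannon (alpha = 1), and Simpson-type (alpha = 2) measures, which is the sense in which the indices are one family.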

Index name
At the start of the article, the index is referred to as the Shannon index; the other two variants are said to be wrong. The source for this is a 1989 work by Charles J. Krebs. In a newer work, however (CJ Krebs. Ecology: the experimental analysis of distribution and abundance, 5th edition. p617-618), the same author calls the index the Shannon-Wiener index. Is there other information that could be used to find the "correct" name for the index, if one exists? Jonht 09:35, 2 November 2007 (UTC)

I think Shannon-Wiener is correct, as Wiener independently published it:

WIENER, N. 1948: Cybernetics: or control and communication in the animal and the machine. Paris: Hermann et Cie & Cambridge/MA: MIT Press. 194 pp. —Preceding unsigned comment added by 141.2.253.17 (talk) 13:49, 22 July 2008 (UTC)

There is no correct form. All three are regularly used in academic papers.--Ekologkonsult (talk) 16:07, 17 September 2008 (UTC)

Data Ranges
What values of the index have been observed in nature, and what would be considered a high value (typical of an evenly diverse population)? Obviously the word high is subjective/arbitrary, but it would be nice to see some examples from the literature in this article. MATThematical (talk • contribs) 01:00, 11 May 2008

Very, very suspicious proof
The strategy of the 'proof' begins by assuming that any population can be split into two groups such that in each group every species has the same number of individuals. Erm, like how do you do that with a population of 1 giraffe, 2 lions and 4 elephants? Unless I'm being rather dim here, I think the proof needs removing. The first proper proof that springs to mind uses a Lagrange multiplier. The Wilfster (talk) 18:32, 10 June 2008 (UTC)
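For reference, the Lagrange-multiplier argument mentioned above can be sketched in a few lines (a sketch, not a rigorous proof): maximise $$H^\prime = -\sum_{i=1}^S p_i \ln p_i$$ subject to the constraint $$\sum_{i=1}^S p_i = 1$$ using

 * $$\mathcal{L} = -\sum_{i=1}^S p_i \ln p_i - \lambda \left( \sum_{i=1}^S p_i - 1 \right)$$

Setting $${\partial \mathcal{L} \over \partial p_k} = -\ln p_k - 1 - \lambda = 0$$ gives $$p_k = e^{-1-\lambda}$$ for every k, so all the p_k are equal; the constraint then forces p_k = 1/S, and the maximum is $$H^\prime = \ln S$$.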

An alternative way to show that
 * $${\partial H^\prime \over \partial n_k} = 0$$

given
 * $$H^\prime = -\sum_{i=1}^S p_i \ln p_i$$

can be obtained by differentiation (with p_i = n_i / N and N the total number of individuals), noting that:
 * $${\partial p_j \over \partial n_k} = - {p_j \over N}$$

and
 * $${\partial p_k \over \partial n_k} = {1 \over N}- {p_k\over N}$$

so
 * $${\partial H^\prime \over \partial n_k} = \sum_{i=1,i \ne k}^S \left({p_i \over N} \ln p_i + {p_i \over N}\right) - \left[\left({1 \over N} - {p_k \over N}\right) \ln p_k + \left({1 \over N} - {p_k \over N}\right)\right]$$

and rearranging we obtain a set of k=1,S equations:
 * $${\partial H^\prime \over \partial n_k} = \left[\sum_{i=1}^S \left({p_i \over N} \ln p_i + {p_i \over N}\right)\right] - {1 \over N} \ln p_k - {1 \over N} = 0$$

Since the term in square brackets is the same for all k equations, the p_k must also be the same for all k. John lartuen (talk) 22:37, 22 December 2009 (UTC)
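The conclusion, that the uniform distribution maximises H', can be spot-checked numerically. A quick Python sketch (illustrative only):

```python
import math
import random

def shannon(p):
    """Shannon index for a probability vector p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

S = 5
h_max = shannon([1.0 / S] * S)  # uniform distribution; equals ln(S)

# No randomly generated distribution over S species should exceed ln(S).
random.seed(0)
for _ in range(1000):
    xs = [random.random() for _ in range(S)]
    total = sum(xs)
    p = [x / total for x in xs]
    assert shannon(p) <= h_max + 1e-9

print(h_max, math.log(S))
```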

Normal Values for Shannon-Wiener Index:
'In natural systems, the value of Hprime has been found to range from 1.5 for systems with low species richness and evenness to 3.5 for systems with high species evenness and richness'

McDonald, Glen (2003) Biogeography: Space, Time and Life, John Wiley & Sons Inc., p. 409.

melissa —Preceding unsigned comment added by 24.84.60.98 (talk) 21:15, 1 August 2008 (UTC)


 * This information should be added to the article - the lack of any typical range of results does nothing to help a lay person faced with "This site has a Shannon index of X" who comes to WP looking to understand what that statement means. For what it's worth, I came here for a reality check after reading a report on a site with an index of 0.5! The Yowser (talk) 13:48, 21 September 2010 (UTC)

This range of values needs to be removed! It is entirely contingent on the number of species in the community. The range of values one might interpret as being "high" or "low" diversity varies wildly with species richness. Presenting this range (1.5 to 3.5) out of context is highly misleading in the absence of species richness information. —Preceding unsigned comment added by 129.82.97.232 (talk) 03:16, 12 November 2010 (UTC)
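The dependence on richness follows directly from the fact that the maximum attainable value for S equally abundant species is ln(S). A short Python check (illustrative):

```python
import math

# Maximum Shannon index for S perfectly even species is ln(S), so a
# "typical range" means little without knowing the species richness S.
for S in (2, 5, 33, 100):
    print(S, round(math.log(S), 3))

# A perfectly even community of 4-5 species already tops out near 1.5,
# while reaching 3.5 requires roughly 33 equally abundant species,
# since e**3.5 is about 33.
print(math.exp(1.5), math.exp(3.5))
```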

Shannon Biodiversity Index can never be Conclusive
A high value of the Shannon index does not indicate high diversity alone: it can result from either high diversity or high uniformity (both extreme cases). So this measure does not, from a mathematical point of view, characterise biodiversity unambiguously on its own. —Preceding unsigned comment added by Tuhinsubhrakonar (talk • contribs) 16:54, 15 June 2010 (UTC)


 * Agreed, the article should make clear that it is unwise to rely on a single measure of diversity, especially as uniformity in species numbers is the exception rather than the rule. The Yowser (talk) 13:45, 21 September 2010 (UTC)

Notation and merging
Hello, I'd like to move forward with merging this article with entropy_(ecology). Also, I'm considering changing the notation for the shannon index to H. H seems preferable to H' for many reasons. If anyone has reasons to not merge (with redirect), or reasons to keep H' notation, please speak up. Otherwise, I will make these modifications over the next week. Thanks, SemanticMantis (talk) 22:26, 3 March 2011 (UTC)
 * Using H retains consistency with Entropy_(information_theory), which is really the same thing, just applied in a different context.
 * I cannot find any good rationale for the H' notation.
 * H' notation is potentially confusing, because it raises the question "Is H' the derivative of some other quantity H?" The answer is (to my knowledge) "no, this is in fact the same quantity that's called simply H elsewhere".

Another merging?
I find that this article fails to explain the essential meaning and information content of the Shannon index. At least the proof should be removed, because it is very long and complicated (and I am not sure if it is even correct) even though the same thing could be explained in two sentences. I would really prefer to merge this article with the Diversity index article. That one includes an explanation of what the Shannon index means in practice, and also places it in context with the other diversity indices that are used in ecology. Does someone have objections to this? Cricetus (talk) 19:59, 4 January 2012 (UTC)