Wikipedia:Reference desk/Archives/Humanities/2014 April 23

= April 23 =

Polk's Declaration of War
Could someone list all the congressmen who made up the Immortal Fourteen who voted against the bill declaring war against Mexico in 1846? Also, can someone find me an online copy of this bill with its controversial preamble? Also, did Calhoun abstain from the vote, and if so, who else abstained?--170.140.105.10 (talk) 00:30, 23 April 2014 (UTC)
 * "Immortal Fourteen"? ←Baseball Bugs What's up, Doc? carrots→ 00:45, 23 April 2014 (UTC)
 * The fourteen are named in this source in footnote 12. It doesn't mention who abstained, though. OttawaAC (talk) 00:48, 23 April 2014 (UTC)
 * The full text of the Act and the preamble are here (scroll down to where it says Chapter 16). OttawaAC (talk) 01:17, 23 April 2014 (UTC)

That's exactly why I am asking the question. I can't find a source that mentions all the congressmen who opposed the bill, only the most outspoken ones like Giddings, Ashmun, and Adams. --170.140.105.10 (talk) 02:19, 23 April 2014 (UTC)


 * By the way, Lincoln was very strongly opposed to the war (nicknamed "Spotty Lincoln" by some for insisting that the spot where the war began, between the Nueces and Rio Grande rivers, was not indisputably part of Texas), but he specifically did not adopt a policy of voting against military appropriations (as it's mentioned in the linked paper that Joshua Giddings did). Of course, Lincoln's term in the house didn't begin until almost a year after that vote... AnonMoos (talk) 14:38, 23 April 2014 (UTC)


 * Found it! Here you go.  The complete voting record on the passage of HR 145, the declaration of war against Mexico.  Final tally 174-14.  You can read the names of all 14 Nay voters there.  Here is a photograph of the actual bill as passed.  This document has the actual text of the declaration; page down to page 81.  -- Jayron  32  15:32, 23 April 2014 (UTC)


 * And just because I had the time to kill, here's the list of 14 Nay votes:


 * 1) John Quincy Adams
 * 2) George Ashmun
 * 3) Henry Cranston
 * 4) Erastus Culver
 * 5) Columbus Delano
 * 6) Joshua Giddings
 * 7) Joseph Grinnell
 * 8) Charles Hudson
 * 9) Daniel P. King
 * 10) Joseph Root
 * 11) Luther Severance
 * 12) John Strohm
 * 13) Daniel Tilden
 * 14) Joseph Vance
 * There you go. -- Jayron  32  22:27, 23 April 2014 (UTC)

Which is more commonly used: Myanmar or Burma?
Isn't Myanmar more commonly used in daily talk when talking about that Southeast Asian country? 112.198.90.36 (talk) 03:38, 23 April 2014 (UTC)


 * I usually hear "Burma". One reason may be that most people have little idea how they're supposed to pronounce "Myanmar", another may be opposition to the govt.  — kwami (talk) 06:14, 23 April 2014 (UTC)


 * I also usually hear Burma, in "real life" (though I don't often hear the country discussed at all). Pronunciation confidence may play a part, but I think it has more to do with repetition. Burmese cats and Burmese pythons have been in the English lexicon for a while. A Google Autocomplete for "Myanma" (I even had to look up that adjective) brings up nothing that rings a bell, or looks like a common term. InedibleHulk (talk) 18:47, 23 April 2014 (UTC)


 * See Names of Burma. — Rhododendrites talk | 19:09, 23 April 2014 (UTC)


 * Burma was a British colony for a long time, which either helped or hindered its development, depending on your point of view. (One might equally consider whether the USA was helped or hindered by European colonists arriving on what are now its territories.) Since almost all of us are native English speakers, and most English literature about Burma uses historic terms for the country, and most of us know relatively little of Burma, and media and publications in Burma are not exactly free at present, and so on, it's not necessarily meaningful to ask this question of this audience. What do people in a randomly selected village in Burma think or pronounce? We really don't know. --Demiurge1000 (talk) 19:23, 23 April 2014 (UTC)

A straight Google search on each of the two names shows "Myanmar" as considerably more common:
{| border="1"
|-
| Myanmar || About 76,300,000 results
|-
| Burma || About 26,800,000 results
|}
But in Google Books, the reverse is true:
{| border="1"
|-
| Burma || About 8,550,000 results
|-
| Myanmar || About 4,350,000 results
|}
Most of the text searched by Google Books is content that has been professionally written and edited; but there also is likely to be a good deal more older writing than on a general web search. I think the second point probably accounts for the apparent discrepancy. In any case Google hit counts should be considered as very approximate; I think what these searches really show is that both forms are widely used. --50.100.193.30 (talk) 21:50, 23 April 2014 (UTC)
 * Can't trust Google's guessing, anyway. It says 76 million, but if you go to the last page of results, the actual number will be closer to a thousand. And of those thousand, many will be duplicates. It's one of the few things with more uncertainty than opinion polls. Personally (another Google problem), I get 427 for "Myanmar", before it asks to include omissions. After that, 743. That's not even slightly approximate. InedibleHulk (talk) 10:12, 24 April 2014 (UTC)
 * This is true, but you know there must be more than a few hundred web pages that mention the country. (For one thing, consider postal addresses.)  Because Google's policy is to return no more than 1,000 hits for a given search, I think there must be some sort of initial step (in their confidential algorithm) that selects chunks of the database that ought to produce at most about 1,000 hits, and then when you ask for the actual hits, you only see the ones in those chunks.  So while we can't verify the large numbers shown as the initial estimates, the small numbers you get if you look at the actual hits aren't necessarily meaningful either.  I choose to believe that the estimates mean something even if they can't be taken as anything like exact. --50.100.193.30 (talk) 23:51, 24 April 2014 (UTC)
 * I think they mean the vast majority of the Internet (literally 99.999% here) is inaccessible through Google. Or maybe that's a misleading statistic. InedibleHulk (talk) 00:52, 25 April 2014 (UTC)
 * It would be interesting to see those same stats for Hellenic Republic vs. Greece. ←Baseball Bugs What's up, Doc? carrots→ 01:11, 24 April 2014 (UTC)
 * Straight Google search (the quotes make it a phrase search; I included the version without quotes just for interest):
{| border="1"
|-
| Greece || About 148,000,000 results
|-
| "Hellenic Republic" || About 1,160,000 results
|-
| Hellenic Republic || About 2,250,000 results
|}
Google Books search:
{| border="1"
|-
| Greece || About 60,700,000 results
|-
| "Hellenic Republic" || About 46,300 results
|-
| Hellenic Republic || About 47,200 results
|}
 * Incidentally, the first hit I got for "Hellenic Republic" was for a Greek restaurant in Melbourne, Australia. --50.100.193.30 (talk) 23:51, 24 April 2014 (UTC)
 * I got History of the Hellenic Republic. The restaurant is third, behind Greece. The 834th and final result is the Hellenic Parliament's official site. InedibleHulk (talk) 01:02, 25 April 2014 (UTC)


 * See BBC News - Burma: What's in a name?. Alansplodge (talk) 07:36, 24 April 2014 (UTC)


 * I distinctly remember a wave of transition to Myanmar after the official renaming in 1989, which continued for some years, as simply being the technical recognition of the new name. However, at some point substantially later on, there was a wave of reversion to Burma in association with an international sentiment that the ruling regime was illegitimate.  What I don't recall is what specific event triggered the switch back, but in my recollection it was nearly as abrupt as the 1989 transition. Wnt (talk) 02:54, 25 April 2014 (UTC)

Just to round out the discussion: Burma refers to the Burman majority, the lowland-living people, while Myanmar is supposed to be more inclusive of the ethnic tribes living at higher elevations. However, under both names, the military thugs have been at war with ethnic minorities since at least the 1962 Ne Win coup d'état. DOR (HK) (talk) 07:30, 25 April 2014 (UTC)

Uncertainties in opinion polls
When a public opinion poll gives a "margin of error" or a "sampling error", are these the same thing? Occasionally a poll will say they have 95% confidence in their results. Does this mean that the error is the range for p = 0.05 rather than for the standard deviation? Or does it just mean that it passes their criterion of p = 0.05? Are opinion polls ever published with σ? (I'm speaking of pollsters in the US, if that makes any difference.) — kwami (talk) 06:12, 23 April 2014 (UTC)
 * Does the Wikipedia article titled Confidence interval help? -- Jayron  32  12:53, 23 April 2014 (UTC)


 * It helps, but I'm still left guessing. If opinion polls are always published with a p = 0.05 confidence interval unless they state otherwise, then we're good to go – all our polls say either nothing or 95% confidence.  But if "sampling errors" and "margins of error" typically use different conventions (such as p-values and standard deviations), then we have a problem.  I've seen enough discussion of p-values being misused that I'm afraid to just assume what the reported errors represent, or that they're the same thing in all the polls.  — kwami (talk) 18:31, 23 April 2014 (UTC)


 * Polls typically use discrete metrics ("Are you pro or con?", "Do you like A, B, or C?") rather than continuous metrics ("How tall are you?"). Standard deviation (e.g. assuming a Gaussian distribution) is therefore a rather poor metric to use to describe the process. Instead of setting up the problem so that there's a population average of some continuous value, with individuals' intrinsic values distributed continuously around that average (which is how an approach using standard deviation would typically be structured), polling questions usually assume that each individual has a discrete preference of either A or B (not some sort of fractional mix). What the poll is trying to assess is the fraction of the population which prefers A, rather than the average value of "A preference". When you set it up that way, you effectively perform a series of discrete Bernoulli trials by asking the question of multiple different people. From these trials you attempt to back out the intrinsic fraction of people who prefer A. You can then set up a confidence interval on that estimate of the intrinsic (population) fraction based on the number of trials you did. It's this confidence interval that's typically reported with polls when they say things like "with a margin of error of 3%". Note that when you compute the confidence interval there's a meta-parameter for how stringent you want the interval to be (the p-value you use on the confidence interval). My understanding is that this is typically set at 0.05 for most polls (so the numbers reported are 95% confidence intervals), although I'm not entirely sure of that, and the value is rarely mentioned. You may want to take this over to the Mathematics desk for more details. -- 160.129.138.186 (talk) 23:50, 23 April 2014 (UTC)
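The Bernoulli-trial setup described above leads to the standard margin-of-error formula under the normal approximation. A minimal sketch, not drawn from any actual poll – the sample size and observed share are made-up illustrative values, and z ≈ 1.96 is the usual 95% two-sided critical value:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Half-width of the approximate 95% confidence interval for a
    proportion estimated from n Bernoulli trials (normal approximation)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# A poll of roughly 1,000 respondents with an even split gives the
# familiar "margin of error of about 3 points":
m = margin_of_error(0.5, 1067)
print(round(100 * m, 1))  # ≈ 3.0 percentage points
```

Note that the margin shrinks only with the square root of the sample size, which is why quadrupling the sample merely halves the margin.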


 * Thanks. Yes, I'm waiting on a reply there.  What I'm doing in the meantime is assuming that 0.65 of the margin is significant compared to a fixed point (such as being a majority opinion), and that 0.85 of the margin is a significant distinction between two opinions, and less than that is a statistical tie.  Assuming the margin of a poll is for p = 0.05, I *think* that the first gives us 95% confidence that a response is majority opinion (1.28σ above 50%), and that the second gives us 95% confidence that a response is plurality opinion (1.65σ above a competing response).  I want to address unjustified conclusions drawn from polls, such as a map that supposedly shows where a proposition has majority support or majority opposition, when the figures are really too close to call.  — kwami (talk) 02:50, 24 April 2014 (UTC)
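The margin-fraction arithmetic in the comment above can be cross-checked numerically (this checks only the arithmetic, not the statistical interpretation). A minimal sketch, assuming the published margin is a 95% two-sided interval, i.e. margin = 1.96σ:

```python
# Assumption: the poll's published margin of error is a two-sided
# 95% interval, so margin = 1.96 * sigma (sigma = standard error).
Z_95_TWO_SIDED = 1.96

def z_from_margin_fraction(fraction):
    """Convert a fraction of the published margin into a z-score, in sigmas."""
    return fraction * Z_95_TWO_SIDED

print(round(z_from_margin_fraction(0.65), 2))  # 1.27, close to the 1.28 quoted above
print(round(z_from_margin_fraction(0.85), 2))  # 1.67, close to the 1.65 quoted above
```

So 0.65 and 0.85 of the margin correspond to roughly 1.27σ and 1.67σ under this assumption; whether those z-values deliver the claimed one-sided 95% confidence is a separate question.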


 * You may want to read this. Might answer a few questions about unjustified conclusions. InedibleHulk (talk) 10:16, 24 April 2014 (UTC)


 * Well, yes, polls are often inaccurate, and we might not have the ability to evaluate them. What I'm trying to do here is make sure that, if we do map polling results, we at least don't misrepresent those results, such as by claiming that 52% is a majority when the sampling error is 6 points.  Whether the polls we're mapping actually tell us anything is another matter.  — kwami (talk) 17:27, 24 April 2014 (UTC)
 * I'm 100% uncertain on the whole mapping thing. Good luck! InedibleHulk (talk) 01:11, 25 April 2014 (UTC)