Wikipedia:Reference desk/Archives/Science/2013 July 13

From Wikipedia, the free encyclopedia
Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


July 13

equal pressure gas airlocks

What is the current state of the art on airlocks between gases of roughly equal pressure? Is there a reliable way to isolate each gas while allowing people and objects to pass between without using an intervening vacuum? How does it work?

I will explain the scenario in a little more detail.

A person needs to work in a hazardous gas environment, though the gases are at roughly normal atmospheric pressure. He suits up in a sealed suit and brings his own air supply. When he's done, he needs to go back to the normal air environment, but bring no gas with him. Is there an airlock-like technology that does this? If you just try to pump one kind of gas out and another in, don't you either have a fatal vacuum in the middle, or just get mixed air? Is there a trick to avoid that, another system entirely, or does it just work and I'm not thinking about it properly?

Thanks for your help. (sorry, this was me. Editing in my signature) gnfnrf (talk) 01:41, 13 July 2013 (UTC)[reply]

Normally, when working in hazardous gas atmospheres, the hazardous gas would be kept at a slight negative pressure compared to the outside air -- this way, when people need to enter or exit the hazardous atmosphere (or in the case of an accidental breach), outside air will flow in but the hazardous gas won't flow out. 24.23.196.85 (talk) 05:30, 13 July 2013 (UTC)[reply]
I can imagine a horizontal interface might be fairly stable if the gases are different weights, and especially if the lower one is also cooler - for example, an airlock between an upper inhabitable area and lower maintenance facilities in a floating base on Venus. Wnt (talk) 06:04, 13 July 2013 (UTC)[reply]
Do you have a specific gas or situation in mind? Most straightforward would be air purging in the airlock. There's a rule of thumb requiring six air exchanges to provide a 3-log reduction in airborne contamination. So it's simply blowing air through the chamber until it's diluted enough. The air goes to a scrubber or is dumped. An alternative would be a Suitport, but that seems a bit expensive. An air shower (room) is used for cleanrooms, together with an airlock.
Found a US Air Force document CONTAMINATION CONTROL AREA – AIRLOCK – TOXIC FREE AREA PROCESS ANALYSIS (BRIEFING CHARTS), but that's more about "unplanned" decontamination it seems. Ssscienccce (talk) 00:19, 15 July 2013 (UTC)[reply]
No specific scenario. I started by thinking about humans on another planet with a dense but non-oxygen-nitrogen atmosphere, but then I got to wondering about industrial processes, and was just generally curious. To clarify, though, you are saying that if you simultaneously pump 6x the volume of the room out (to be filtered or rejected) and pump clean air in, you will be left with 999 parts per 1000 of air, and 1 part per 1000 of the non-air gas?
That link looks very interesting and completely full of confusing acronyms and terminology. It'll take me a while to unpack it. Thanks. gnfnrf (talk) 20:26, 15 July 2013 (UTC)[reply]
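For what it's worth, the dilution arithmetic being asked about can be sketched numerically (a minimal illustration, assuming the chamber is well mixed and purged continuously, so the contaminant concentration decays exponentially with the number of air changes; with ideal piston-flow purging each exchange would remove far more):

```python
import math

def residual_fraction(air_changes: float) -> float:
    """Fraction of the original gas remaining after a given number of
    air changes, for a well-mixed, continuously purged chamber."""
    return math.exp(-air_changes)

# Six air changes under perfect mixing leave about 0.25% of the original
# gas; a full 3-log (1 part in 1000) reduction needs roughly 6.9 changes.
print(round(residual_fraction(6), 4))   # 0.0025
print(round(-math.log(1e-3), 1))        # 6.9
```

This is only the idealized well-mixed limit; real purge effectiveness depends on airflow geometry, which is presumably why the six-exchange figure is stated as a rule of thumb rather than an exact result.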

Glycaemic Indices of Gluten-free flours.

A wide variety of gluten-free flours are now commonly advocated for consumption by persons with coeliac disease. These flours may also be advantageous for persons with diabetes or high post-prandial blood glucose levels, depending mainly on the percentage of amylopectin in their starch, as reflected in a low glycaemic index (GI: rate of rise of blood glucose after a standard carbohydrate meal). A low dietary amylopectin burden typically correlates with a beneficial low GI, leading to a reduced development of advanced glycation end-products (AGEs), which are implicated in a variety of age-related diseases. Therefore a compilation of the GIs of a range of flours currently available and recommended for controlling coeliac disease would be helpful to persons wanting to manage their blood glucose levels in addition to any gluten sensitivity. Flour is used for baking breads around the world.

--19Grumpah42 (talk) 04:06, 13 July 2013 (UTC)[reply]

Do you have a question, or a desire for references? SemanticMantis (talk) 05:08, 13 July 2013 (UTC)[reply]
I think the OP is suggesting a new article, in which case they should ask at WP:Requested Articles. Rojomoke (talk) 07:26, 13 July 2013 (UTC)[reply]
I don't think that's at all a suitable topic for an encyclopedia article. The information may well belong somewhere on the web, but not here. Looie496 (talk) 15:00, 13 July 2013 (UTC)[reply]
But it could make a good list, say List of glycaemic indexes, or perhaps spelled List of glycemic indexes or indices. Graeme Bartlett (talk) 21:21, 14 July 2013 (UTC)[reply]

Looking for an Approachable In to Quantum Field Theory and Particle Physics (As in books)

I have a decent mathematics background (though analysis is not my strongest area) and know a decent bit of basic Quantum Theory (I've worked through a few texts). What I'd like to find is a specific type of book on the subjects mentioned. Most of the QFT and PP books I have come in three flavours: for physicists who are well acquainted with the experiments, motivations, and ideas being generalized; for mathematicians, meaning that the book becomes a book on applications of functional analysis and Lie theory; or books full of toy models with vague, shallow explanations tossed around. What I'd like is a book that is written as if it were for physicists, but assumes the background of a mathematician (meaning that it explains the physics ideas and isn't presented as theorems/proofs followed by a few lines about how particles are representations, then back to the general case). It doesn't even need to cover the depths of these subjects, just enough that I could pick up a second, more traditional book and follow it (with work, obviously). Sorry for the long question; I've spent a lot of time looking, but this is one of the few subjects I just can't find an in to. Thank you for any suggestions:-) Sorry for any poverty of clarity. Phoenixia1177 (talk) 10:13, 13 July 2013 (UTC)[reply]

CERN's education page provides several dozen free full textbooks, presentations, review papers, and websites. But let's be frank and honest. One does not "learn quantum field theory" by reading a textbook, any more than one learns to fly airplanes from the Airplane Flying Handbook. Like any advanced physics specialization, there are some prerequisites - time, money, and intellectual capacity. You will probably need one-on-one instruction, and interaction with others who are learning and using this type of physics, which means intensive study among experts in the field.
First, study and earn an undergraduate degree in physics, or mathematics with a strong concentration in physics, from an accredited university. This proves that you're capable of investing the time and money to tackle the field, and that you have the intellectual capacity for it. Next, enter into a program of graduate study in theoretical physics. After a year or two, apply for a special program, like one of the institutional exchanges or particle physics summer schools.
If you aren't able to follow that track, for any reason, then advanced particle physics probably isn't for you; if you still have interest in the topic, you might enjoy some of the material on the website I linked. Nimur (talk) 15:40, 13 July 2013 (UTC)[reply]
For an introduction to particle physics at a first-year undergraduate level I suggest The Ideas of Particle Physics by Coughlan, Dodd and Gripaios. Another good introduction that includes more cosmology is Quarks, Leptons and the Big Bang by Jonathan Allday. For an introduction to quantum field theory try Fundamental Forces of Nature - The Story of Gauge Fields by Kerson Huang. Gandalf61 (talk) 16:04, 13 July 2013 (UTC)[reply]
Thank you:-) I picked up the book by Huang this afternoon; I've read a decent bit of it, it's quite readable:-) I know I have a copy of From Operators to Path Integrals (same author) somewhere in my library (in quite a state of disarray at the moment...); looking at the excerpt on Amazon, it looks fairly readable, is this true of the rest? Do you know of anything along the lines of Quantum Field Theory: A Tourist Guide for Mathematicians, by Folland, that is slightly more elementary? That book is on the very edges of what I can follow, which makes it really hard to extract the physics out of it. I know I'm being weebly-woobly, and vague, here with my requests; for some reason this subject is a pain to navigate on my own (everything is too much physics or too little). I'm going to check out some of the links at the Cern site tonight, I haven't had a chance yet (it's easier to read on my Kindle at work).Phoenixia1177 (talk) 21:55, 13 July 2013 (UTC)[reply]
I'm not sure I understand what you are and aren't looking for, but I have a somewhat similar background and I remember enjoying Quantum Field Theory in a Nutshell by Anthony Zee (ISBN 978-0691140346, first chapter as a PDF), Diagrammatica by Martinus Veltman (ISBN 978-0521456920) and Introduction to Elementary Particles by David Griffiths (ISBN 978-0471603863). These are all undergraduate/graduate textbooks that emphasize concepts but also cover the mathematics in detail. -- BenRG (talk) 00:52, 16 July 2013 (UTC)[reply]
Thanks for the suggestions:-) I've read Griffiths; I've read several of his books, and I like him as an author. Zee I, actually, have a copy of, but my books are in such disarray! I'll have to dig it up, though. I've never heard of Diagrammatica, but will definitely grab a copy this week. As for what I'm looking for, it's unusually hard to put in words. Essentially, the mathematics part isn't a problem (most of what I read is logic/set theory; but Lie theory and path integrals aren't at all terrifying). My difficulty is that most of the books for "mathematicians" are way too abstract, which is fine mathematically, but leaves me confused about where the physics comes in. On the other hand, the books for physicists are way too much "for physicists", if that makes sense. What I'd like to find is a book that isn't shy about the mathematics, but is very gentle in linking it up to actual physics. For example, most of the math I've seen relating to QCD makes sense to me, but everything I've read about it seems like it either spends its time doing math and never gets to the physical why, or it seems like a confusing rush of physics terms and conventions, which all seem impenetrably alien. Sorry if that's long, or unhelpful, but this is the only subject (besides cutting edge Algebraic Geometry!) that just seems horribly impossible to learn. Thank you for the help:-)Phoenixia1177 (talk) 04:10, 16 July 2013 (UTC)[reply]
I recommend An Introduction To Quantum Field Theory by Michael E. Peskin and Dan V. Schroeder. Dauto (talk) 16:05, 16 July 2013 (UTC)[reply]
Bought a copy today; also grabbed Diagrammatica (I like what I've seen in it thus far:-) ) and the book by Mandl and Shaw (which also looked kind of nice). Thank you all for your help:-) If anyone has any more suggestions, I'd be willing to check them out; but I should have enough reading to occupy me for a while now:-)Phoenixia1177 (talk) 05:41, 17 July 2013 (UTC)[reply]

Lac-Mégantic derailment

In this article, Lac-Mégantic derailment, it states: "It is possible that some of the missing people were vapourized by the explosions." What would this mean exactly? Thanks. Joseph A. Spadaro (talk) 12:42, 13 July 2013 (UTC)[reply]

It would mean that it was so hot, they were basically incinerated, with no remains left. I doubt they were turned entirely into actual vapour, but certainly they could have meant "cremated". Mingmingla (talk) 15:37, 13 July 2013 (UTC)[reply]
Probably a poor translation from French. It can be the French equivalent of "going up in smoke" / "leaving/being gone without trace", which is usually not literal either. - ¡Ouch! (hurt me / more pain) 16:49, 13 July 2013 (UTC)[reply]
Oops, wikt doesn't list that old meaning either. :( — Preceding unsigned comment added by One.Ouch.Zero (talkcontribs) 16:55, 13 July 2013 (UTC)[reply]
I suspect it might have simply been a word chosen for maximum emotional impact. "Vapourize" is more powerful than "burnt up". Incinerate might be too clinical or insensitive. Mingmingla (talk) 21:14, 13 July 2013 (UTC)[reply]
Do Canadians spell the word "vapourize"as in the article, presumably following British usage, or "vaporize" like folks in the US? The article should probably follow Canadian spelling conventions. Edison (talk) 02:16, 14 July 2013 (UTC)[reply]
Yes, Canadian spelling uses -our, e.g. colour, vapour, flavour, etc. as in British spelling. הסרפד (call me Hasirpad) 02:37, 14 July 2013 (UTC)[reply]
The usual British spelling is 'vaporise' or 'vaporize'. The dropping of the U also occurs in glamour -> glamorise. AndrewWTaylor (talk) 08:01, 14 July 2013 (UTC)[reply]
According to the Language Portal of Canada, vapourize is correct. So it's unlike either British or US English. Heron (talk) 10:46, 14 July 2013 (UTC)[reply]
I'm a Canadian, and have never even thought about spelling "vaporize" with a u. No more than "colourant" (which is apparently also our proper spelling, I see). Whatever the books say, it looks weird to me. "Vapour" and "colour" are cool, though.
Regardless of what the spelling conventions say, Wikipedia policy states that BOTH British (and, by extension, Canadian) English AND American English are equally appropriate in articles, and furthermore that edit warring over the use of British vs. American English is prohibited. So let's stick with whatever the article's creator decided, OK? 24.23.196.85 (talk) 01:05, 15 July 2013 (UTC)[reply]
Yeah, proper's fine. No edit war (or even edit) intended. I just, personally, wouldn't use the U. For what it's worth, a Google News search shows only the Toronto Star uses the U. Montreal, Ottawa, Vancouver and others just the O. InedibleHulk (talk) 02:33, 16 July 2013 (UTC)[reply]
No, it is not always up to the first contributor's choice of English version. If an article's topic is closely associated with one English-speaking country, then that country's version of English can be substituted for some other version the first contributor used. See Wikipedia:Manual of Style#Strong national ties to a topic. Canadian spelling and terminology are preferable in this article, regardless of what conventions were used by the initial contributor. Edison (talk) 20:04, 16 July 2013 (UTC)[reply]
As to the question, every solid that burns is vaporized, after being liquefied. Much of smoke is vapour (though the colour we see is from small solid soot). If it's hot enough, that vapour turns to plasma, which we call flames. And yes, it is also a more powerful word than burnt, implying no solids or liquids are left (which isn't true, there will virtually always be some ash). InedibleHulk (talk) 11:12, 14 July 2013 (UTC)[reply]
And I expect crispy critters is right out. The article Cremation indicates temperatures in the 1600-1800 Fahrenheit range. That "vaporizes" the flesh and leaves just the bone. So that could be compared with whatever the temperature estimates are for this railroad disaster. ←Baseball Bugs What's up, Doc? carrots→ 14:20, 14 July 2013 (UTC)[reply]
Probably somewhere around there. Some fires burned much longer here than they do in a crematorium, though. If there was a body in one of those fires, it could be more burnt than those you find in urns. Which is about as useful as pure vapour, so far as identifying it goes. InedibleHulk (talk) 14:30, 14 July 2013 (UTC)[reply]

Natives

Are Native Americans, Hawaiians, and other indigenous peoples who went through mass epidemics due to weak immunities in historical times still seeing the effects of a weaker immune system, or are modern high rates of disease and death in these groups just the result of poverty? — Preceding unsigned comment added by KAVEBEAR (talkcontribs)

It's not that they had weak immune systems as such, but that (1) they had never been exposed to endemic diseases of the Europeans, so what the latter consider "childhood diseases" struck the entire population at once, and (2) they did not have the inherited resistance to some diseases conferred by genes that made the latter resistant to the plague and hence also partially to HIV. μηδείς (talk) 16:57, 13 July 2013 (UTC)[reply]
You might want to read "The Arrow of Disease", a Discover magazine article. Basically, it states that Europe had a high enough population density that some diseases became endemic, causing people to gradually become more resistant to them. The diseases evolved, forcing human defenses to counter - a sort of arms race. America, Hawaii and other places mostly didn't have enough inhabitants sufficiently packed together to go through this process, so the more evolved European maladies struck them down in droves. That also explains why few New World illnesses troubled the Old. If a disease was deadly enough, it essentially killed itself off by wiping out the isolated tribe or group it first infected. At least, that's the way I understand it. I'm no expert. Clarityfiend (talk) 19:53, 13 July 2013 (UTC)[reply]
Yes, that's accurate. And Afro-Eurasia is the biggest island (and most highly populated) on the planet. μηδείς (talk) 20:07, 13 July 2013 (UTC)[reply]
But by that same token, the entirety of the Americas is not small and, if you believe the current thinking on the subject, they were chock-a-block with people. If Tenochtitlan had a pre-contact population of between 500,000 and a million people [1] then what exactly is your definition of "high enough population density"? That's a lot of people living in hot, wet, crowded living conditions. A more likely difference is the lack of domesticated and farmed animals. Many old-world plagues are zoonotic (carried by animals) and this typically requires humans and animals to be living in close quarters - something that happened all the time in Eurasia and not very frequently at all in the Americas. That differential behaviour caused the Eurasians to be exposed to many more diseases historically, altering their immune system's defenses. Matt Deres (talk) 23:22, 13 July 2013 (UTC)[reply]
To continue on from a similar angle, Eurasians and Americans practiced very different kinds of agriculture. In Eurasia, ploughing was widely used in a variety of different forms. Tilling the earth in that way also digs up soil dwelling bacteria and fungi, which may also bring contact with new diseases (example). Native Americans usually didn't go in much for ploughing and felling of trees since they lacked metal tools to do the work (try hacking down a hardwood tree or cut through thick soil with a stone implement and you'll see why they turned to slash-and-burn style agriculture). Burning stuff down rather than tilling it under greatly reduces the contact between people and soil bacteria. See hygiene hypothesis for some recent work that has shown the benefits of contact with, well, filth at an early age to build up the immune system. Matt Deres (talk) 00:14, 14 July 2013 (UTC)[reply]
The Americas weren't empty, but they held a very small portion of world population, with France alone having more people than all of North America; see map. While it is widely speculated that syphilis came from the New World, the majority of disease transmission went from the wider pool in which the diseases were established to the smaller. μηδείς (talk) 01:07, 14 July 2013 (UTC)[reply]
I think it's fair to say that the Old World contained far more people than the Americas, even using the highest pre-Columbian population estimates for the latter. That said, I find your map highly dubious. I don't know how it calculated Native American populations, but these calculations are notoriously difficult given the scanty data available. Virtually every society was decimated by disease before seeing the first European. In the few regions that Europeans managed to visit early enough, nobody had the means or the interest to do a census until well after the region got depopulated. Working backwards from European records to pre-Columbian populations is difficult because you have to assume a fatality rate due to disease. If you change your assumption a tiny bit--from 93% to 96%, for example--your pre-Columbian estimate changes by a lot. Due to these difficulties different scholars arrive at wildly different estimates. See Population history of indigenous peoples of the Americas#Population overview for just a few. --Bowlhover (talk) 02:15, 14 July 2013 (UTC)[reply]
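The sensitivity described in the post above is easy to illustrate with a toy back-calculation (the survivor count here is purely hypothetical, chosen only to show how strongly the assumed fatality rate drives the result):

```python
def pre_contact_estimate(survivors: float, fatality_rate: float) -> float:
    """Back-calculate a pre-epidemic population from a post-epidemic
    count, given an assumed epidemic fatality rate."""
    return survivors / (1.0 - fatality_rate)

survivors = 1_000_000  # hypothetical post-contact census figure

low = pre_contact_estimate(survivors, 0.93)   # ~14.3 million
high = pre_contact_estimate(survivors, 0.96)  # ~25 million
print(round(low / 1e6, 1), round(high / 1e6, 1))
```

Shifting the assumed fatality rate by just three percentage points nearly doubles the resulting estimate, which is exactly why different scholars starting from the same records arrive at such different numbers.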
The highest population estimate I have ever heard for the entire Americas is 40 million, and that is far less than half of China or India, at most three or four times estimates for France. I am not quite sure what you find dubious, but I have no necessary problem with other maps if they are provided. The point is, the New World was a colonized area with very small founding populations moving through an arctic territory that would have been unlikely to bring many endemic diseases, and other than the dog, who had no domesticated animals with them to serve as disease reservoirs. The OP asked about weakened immune systems in Native Americans, and I have never heard of such a model. μηδείς (talk) 02:37, 14 July 2013 (UTC)[reply]
Population estimates run as high as 100 million for the Americas. Having read a bit about it (and heard the politicking behind the use of small numbers), I'm inclined to believe the 100 million estimate much more than the 10 million one. I've heard the immune system theory before; it's also discussed in the 1491 book here. The short version is that the crossing of Beringia created a population bottleneck and associated founder effect. One of those long term effects was a decrease in the number of HLAs. There's some advanced biochemistry involved there that's beyond me, but the numbers suggest that pre-Columbian New World populations even today are some ten times more likely to be susceptible to a given disease than an Old World population. Matt Deres (talk) 13:47, 14 July 2013 (UTC)[reply]
I am unfortunately quite familiar with people who believe the Americas were inhabited by sparse primitive tribes without civilization, but the political point is not relevant here. The total population numbers don't really matter for the immunity of the Americans, only the fact that the diseases the Europeans brought were not already endemic there because the Beringian founder populations didn't bring those diseases with them. Regardless of their size, had the Americas been founded by multiple waves out of the tropics/subtropics, then Europeans moving there would have been at much higher risk of infection than the natives; compare what happened to British colonists who went to Africa or India. μηδείς (talk) 18:23, 14 July 2013 (UTC)[reply]
You're talking in circles. You say you're "unfortunately quite familiar" with that picture of the American populations, yet you happily cited that claptrap when you thought it suited. Fewer people than in all of France - preposterous! But then you throw both arguments away in exchange for... I'm not even sure what to call your last sentence, but it's no answer to anything the OP brought up, nor anyone else. The Americas had the people and the population densities to support all kinds of diseases (as evidenced post-1492), but they had few endemic ones for the reasons I've outlined above, centering mostly on different agricultural and animal domestication strategies. Matt Deres (talk) 21:00, 14 July 2013 (UTC)[reply]
You have a strange notion I think the difference between 40 and 100 million would matter. It has nothing to do with the small, relatively disease-free founding populations from Arctic Asia. How about I stipulate 250 million Native Americans at the time of Columbus? Would that cure the accusation of clap-trap? μηδείς (talk) 00:44, 15 July 2013 (UTC)[reply]
Actually, France had a population of 15 million around the time of Columbus. North America probably had fewer people, even according to most High Counters. The article I linked says "while it is difficult to determine exactly how many Natives lived in North America before Columbus,[5] estimates range from a low of 2.1 million (Ubelaker 1976) to 7 million people (Russell Thornton) to a high of 18 million (Dobyns 1983)." Thus, even the highest non-fringe population estimate of North America is comparable to the population of contemporary France. Most state societies in the Americas, and consequently most of the population, were in Mesoamerica, Peru, and possibly the Amazon shore. --Bowlhover (talk) 01:13, 15 July 2013 (UTC)[reply]
Aside from mere population density, proximity to livestock has as much to do with this kind of immunity. The big killers were essentially agriculture-related diseases: smallpox, the plague (rats associated with grain/close quarters), measles, all the gastrointestinal water-borne illnesses, etc. This is a central point in Guns Germs and Steel. Shadowjams (talk) 06:38, 15 July 2013 (UTC)[reply]
It's worth considering that, as syphilis alone still causes 500,000 stillbirths and miscarriages a year,[2] among other things, and was once a very major cause of death for all ages, the death from New World diseases in the Old World may well have equalled the deaths caused by Old World diseases in the New after all. (It's a bit like gravity, where the tiny effect of a falling ball on the Earth is actually equal to its effect on the ball... it just doesn't seem so visible) Wnt (talk) 15:50, 15 July 2013 (UTC)[reply]

Does the human body only use glucose and ketone bodies as fuel?

I'm not entirely sure how to word my question, but the basic idea is this: does all the food we eat eventually just get converted into glucose one way or another so the body can use it as fuel? Like the proteins in a steak, are they eventually converted into glucose? ScienceApe (talk) 16:54, 13 July 2013 (UTC)[reply]

The simple answer is no (i.e. not all of the digestible nutrients in food, including protein, can be converted completely to glucose). That said, metabolism, and specifically digestion, is a really complex topic, with influences from degree of starvation and many other factors. Regarding protein digestion: after protein catabolism, amino acids are often re-used to make protein; however, under some conditions the process of amino acid catabolism can convert portions of many (not all) amino acids to intermediates like pyruvate that can be used in gluconeogenesis to generate glucose (under certain conditions). Similarly, fat digestion can convert fatty acids almost completely to acetyl-CoA. Your question about ketone bodies can be answered by reading about ketosis, but a short answer is that these compounds, which are part of starvation metabolism, are not usually used for energy by most healthy tissues. -- Scray (talk) 18:06, 13 July 2013 (UTC)[reply]
So is this the reason why if you just eat nothing but meat, you'll eventually lose weight? Because your body isn't producing much glucose and the amount of fat consumed and turned into adipose tissue doesn't outweigh the amount your body burns to produce more glucose? Also if your body only needs glucose for fuel, is it possible to have a diet consisting of only carbs and vitamin supplements? ScienceApe (talk) 18:24, 14 July 2013 (UTC)[reply]
This is a huge topic, so please understand we're simplifying things here greatly. Eating nothing but meat generally results in a high-protein, high-fat (low-carb) diet. I don't think there's a consensus on the principal mechanisms at work in ketogenic diet (i.e. extremely low-carb nutrition), but one component is probably the appetite-suppressant effects of ketosis (whether these diets are safe and effective, and for whom, is still unclear). Glucose is not adequate nutrition due to needs for essential fatty acids, minerals, vitamins, amino acids, etc - so carbs and vitamins won't do the job. -- Scray (talk) 23:01, 14 July 2013 (UTC)[reply]
How amino acids enter the citric acid cycle.
(ec)No. Aside from a small amount of energy generated by glycolysis, most of the energy generated in mammalian cells comes from the citric acid cycle. Compounds can enter the cycle through a number of different pathways. While glucose goes through glycolysis and enters the cycle as pyruvate, and fats are broken apart to acetyl-CoA directly, amino acids (what proteins are made of) enter the cycle at different points depending on their identity. They don't have to go through glucose first. That said, the body can convert amino acids to glucose through the process of gluconeogenesis, though that consumes energy rather than produces it, so is only used if the body needs glucose for non-energy reasons. -- 67.40.215.195 (talk) 18:12, 13 July 2013 (UTC)[reply]
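The different entry points described above can be summarized in a small lookup table (a simplified sketch; only a few representative amino acids are shown, and real amino-acid catabolism is considerably more branched):

```python
# Where the major fuel classes feed into the citric acid cycle,
# per the explanation above (simplified summary).
CYCLE_ENTRY_POINTS = {
    "glucose": "pyruvate (via glycolysis), then acetyl-CoA",
    "fatty acids": "acetyl-CoA (via beta-oxidation)",
    "alanine": "pyruvate",
    "glutamate": "alpha-ketoglutarate",
    "aspartate": "oxaloacetate",
}

for fuel, entry in sorted(CYCLE_ENTRY_POINTS.items()):
    print(f"{fuel} -> {entry}")
```

The point of the table is structural: proteins and fats do not have to pass through glucose at all; each fuel class has its own route into the cycle.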
We actually have an article glucogenic amino acid that lists them. Note that glucose levels are kept up because it is absolutely required for energy, especially in the brain - even brief episodes of hypoglycemia can be lethal. (The brain also can use ketone bodies for a substantial fraction of its energy according to our article... have to admit, I didn't know that) Wnt (talk) 15:57, 15 July 2013 (UTC)[reply]

Peculiar cell nuclei

What cells of the human body contain unusual numbers of nuclei? I know that muscle cells merge and pool their nuclei. I think I once knew of neurone cells that have multiple nuclei - is that right? Why do granulocytes have peculiar-shaped nuclei? --217.16.212.250 (talk) 21:36, 13 July 2013 (UTC)[reply]

Take a look here for some information about neutrophil granulocytes, under the section "Normal cells with abnormal nuclei." Lord Arador (talk) 23:43, 13 July 2013 (UTC)[reply]
Is that the same as neonatal jaundice? μηδείς (talk) 03:28, 15 July 2013 (UTC)[reply]
No, the usual cause of neonatal jaundice is breakdown of fetal red blood cells at a rate that exceeds the newborn liver's capacity to conjugate bilirubin. That's generally temporary, as our article explains. Neonatal hepatitis (that article needs work!) is rarely the cause of neonatal jaundice. -- Scray (talk) 04:23, 15 July 2013 (UTC)[reply]