Wikipedia:Reference desk/Archives/Miscellaneous/2014 January 15

= January 15 =

Advice to turn off unnecessary electrical items.
Suppose my house is heated electrically...I get advice to turn off appliances when I'm not using them...my computer, for example. But I wonder about that. Nearly all of the electricity that goes into most appliances ultimately turns into heat and is released into the room. That reduces the need for my heater to operate - which saves electricity. A TV set is (in effect) a 100% efficient electrical heater, and so is a typical resistive heater - so why not use the electricity to do something useful on its way to becoming heat, rather than "wasting" it heating up a coil of wire?

In practice, my house is heated by a reversible heat-pump, which delivers more heat than the electrical energy it consumes - a coefficient of performance (COP) greater than 1 - so this doesn't literally apply to me unless the outside temperatures get really low...but when that happens and the heat-pump's COP drops to 1, I might as well turn on every gadget I can lay my hands on, because it won't affect my electricity bill!

I ask this because where I work, the office heating system isn't good enough to warm the corner offices - and this logic caused us to decide to replace the resistive heating units with bitcoin miners, on the grounds that they are every bit as efficient as electrical heaters - but cost less to run because they are literally making money. We now have bitcoin miners in most of the offices - and a couple of them in the break-room where it gets especially cold. (For those who care, we're actually mining litecoin, which is a bit trendier than nasty old bitcoin!)

So now I wonder if the makers of resistive heaters should get into this idea and have their heaters do some useful work with the electricity instead of pointlessly making a coil of wire glow?

Aside from the capital cost of the equipment - is there any flaw in this argument?

SteveBaker (talk) 13:55, 15 January 2014 (UTC)
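The heat-pump arithmetic in the question can be put into a quick back-of-envelope sketch. This is purely illustrative - the price, COP, and heat-demand figures below are assumptions, not numbers from the question:

```python
# Sketch: cost of delivering a given amount of heat with a resistive heater
# (COP = 1) versus a heat pump (COP ~ 3). All numbers are illustrative.

def heating_cost(heat_kwh, cop, price_per_kwh):
    """Electricity cost to deliver `heat_kwh` of heat at a given COP."""
    electricity_kwh = heat_kwh / cop
    return electricity_kwh * price_per_kwh

PRICE = 0.12         # assumed $/kWh for electricity
HEAT_NEEDED = 30.0   # assumed kWh of heat wanted on a cold day

resistive = heating_cost(HEAT_NEEDED, cop=1.0, price_per_kwh=PRICE)
heat_pump = heating_cost(HEAT_NEEDED, cop=3.0, price_per_kwh=PRICE)

print(f"Resistive heater (or any appliance's waste heat): ${resistive:.2f}/day")
print(f"Heat pump at COP 3: ${heat_pump:.2f}/day")
# As the COP falls toward 1 in very cold weather, the two costs converge -
# which is exactly when running any gadget becomes "free" heating.
```

With these assumed numbers the heat pump delivers the same heat at a third of the cost, which is why the "run any gadget" argument only breaks even once the COP has fallen to 1.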


 * The only other cost that comes immediately to mind is maintenance - a resistive heater may need dusting every once in a while and that's it, but computers like to break. Heaters can also come with thermostats so they shut off if the area they're in has warmed up enough, but if you're not getting the room warmed up to that point then it won't matter. The radiant heat off of a heater coil is pleasant if you're standing near it, but the computer fans might do a better job of actually getting the heat mixed through the room. Katie R (talk) 15:02, 15 January 2014 (UTC)


 * Yeah - these appliances are never going to provide enough heat to completely heat the room - and if they did, the lack of thermostats would be a problem - but given that the heaters have to turn on some of the time anyway, we still have stable temperatures. Most devices like computers are most likely to fail when turned on or off - leaving them running continuously tends to increase their lifespans - although some components (like fans) will fail sooner. SteveBaker (talk) 15:39, 15 January 2014 (UTC)


 * Larger computers and servers are typically left running continuously, partly on the assumption that they will last longer. I have found that my PC operates better if left running continuously with just an occasional reboot (not power off). It seems to be the change in temperature at power-on that causes trouble. Just like with a lightbulb. ←Baseball Bugs What's up, Doc? carrots→ 16:38, 15 January 2014 (UTC)


 * Nothing to do with "change in temperature at power-on", it's the spike in voltage when powering on that causes trouble to electrical equipment. This is the case with lightbulbs as well.  As opposed to offering a personal anecdote, I'd suggest you take a quick look at voltage spike, uninterruptible power supply, and surge protector, all of which give some background on this kind of thing.  The Rambling Man (talk) 18:00, 15 January 2014 (UTC)


 * Also, with motors, the additional torque required to overcome static friction and to accelerate fans and disk drives up to speed wears out parts more quickly than if left running continuously - and the cycling of the temperature of connectors (and socketed components in older hardware) can cause them to slowly wiggle loose, causing eventual failure. SteveBaker (talk) 19:09, 15 January 2014 (UTC)


 * More than maintenance, there's probably a manufacturing impact. I'll bet if you compared the environmental effects of a heating element you'd keep for twenty years with those of a computer you replace every four years, the difference would be pretty stark. Even if we don't like to think about such things.


 * Mining bitcoins may make you richer, but it's not really "useful work" in the sense that it's created something useful. APL (talk) 16:08, 15 January 2014 (UTC)


 * I agree - but bitcoins are interchangeable for real money - so it does reduce the effective cost of heating by a small amount compared to normal electrical heaters - but they could equally be doing good work for various online cloud processing stuff like the SETI folks or the protein folding thing.  I'm sure there are other ways to harness electricity to earn money while generating heat. SteveBaker (talk) 19:09, 15 January 2014 (UTC)


 * The range of application for this is limited by the fact that electricity is an extremely expensive way to generate heat. Natural gas is much cheaper.  So it really only comes into play in situations where electricity is available but gas is not. Looie496 (talk) 17:20, 15 January 2014 (UTC)


 * Natural gas is only cheaper if 1) your house is already connected to the natural gas network, and 2) you don't live in one of the parts of the world where electricity is cheaper. I happen to live in a home where 1) is false and 2) is barely true; the breakeven point for converting to gas heating (not counting the cost of getting the utility company to extend their network) is several hundred years in the future.  For this and various other reasons, I heat my home using a mix of electric baseboard heating and computers running Folding at Home. --Carnildo (talk) 01:56, 16 January 2014 (UTC)


 * Using the electricity to run a heat pump is more efficient for heating than using a resistive heater. During the Three-Day Week dispute in 1970s Britain, when electricity was rationed, companies were allowed to keep their computers running and the lights on in those rooms - so DEC (now out of business) moved minicomputers into each room they wanted lit, and the machines kept the rooms warm at the same time. Dmcq (talk) 21:29, 15 January 2014 (UTC)


 * I do tend to leave electronics and lights on more in winter, precisely because the excess heat they generate is appreciated then. However, in summer, the opposite is true, as every bit of heat they generate must now be countered by the A/C.


 * As for gas heating being more cost-effective, this is true, overall. However, gas forced-air heating has one huge flaw: it does a lousy job at zone heating (no article ?). That is, you end up heating the entire house, not just the room you're using. You can fiddle with the registers each time you switch to a new room, but that's annoying and not very effective. With electronics, on the other hand, it's almost automatic. You enter a room, turn on the lights and TV and computer, and you start making heat there. Hopefully this allows you to turn down the thermostat and heat the rest of the house less. StuRat (talk) 22:01, 15 January 2014 (UTC)


 * Rather than concentrate on whether there is any flaw in your argument (because that's like asking to have a bucket of cold water thrown over an interesting avenue of inquiry), let us look at it this way: if you already heat your 'space' with electricity AND have a computer – then mining for Bitcoins is profitable as of now. However, if you (or resistive-heating-coil manufacturers) invest in computers solely for this purpose, then there is the risk that this sort of venture could collapse before the investment is recouped (a bubble that bursts). Second: when people talk about heat pumps being more than 100% efficient, they really mean that it takes about 1 kilowatt of electrical power to pump 3 kilowatts of 'low-grade' heat that 'already' exists somewhere else, so one ends up 2 kilowatts of heat better off. Lastly, something that is often overlooked: even in the coldest of winters, the water enters a home in liquid form – it contains heat. It often leaves very much warmer. Put a heat pump evaporator coil there – in the flow of the waste or grey water! For example: if one of my granddaughters is in such a rush to go out with her date on time that she can only spare a very quick 20 minutes in the shower – that's 3 kilowatt-hours of heat down the drain. Then consider the washing machine, kitchen sink, bath, etc. Loads of wasted heat that is often at a higher temperature than outside ambient. Heat-pump efficiency falls off when the temperature gradient starts to exceed about 20 degrees Celsius, and waste water holds more heat per unit volume than air. --Aspro (talk) 22:08, 15 January 2014 (UTC)
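The 3 kilowatt-hour shower figure above is easy to sanity-check with Q = m·c·ΔT. The flow rate and temperatures in this sketch are assumed values chosen for illustration, not measurements:

```python
# Sanity check of the "heat down the drain" figure using Q = m * c * dT.
# Flow rate and temperatures are assumptions, not measurements.

C_WATER = 4186  # specific heat of water, J/(kg*K)

def drain_heat_kwh(litres, mains_temp_c, drain_temp_c):
    """kWh of heat carried away by `litres` of grey water that is warmer
    than the incoming mains water (1 litre of water is about 1 kg)."""
    joules = litres * C_WATER * (drain_temp_c - mains_temp_c)
    return joules / 3.6e6  # convert J to kWh

# A 20-minute shower at 8 L/min, with mains water at 10 C and drain
# water leaving at 26 C:
print(f"{drain_heat_kwh(20 * 8, 10, 26):.1f} kWh down the drain")
```

With these assumptions the result comes out at roughly 3 kWh, consistent with the figure quoted above.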


 * I've wondered about reclaiming waste-water heat myself, but have never seen it done, so assume there's something impractical about it. StuRat (talk) 02:24, 16 January 2014 (UTC)


 * Apparently they exist. Rmhermen (talk) 03:01, 16 January 2014 (UTC)


 * We also have a not-very-good article, Water heat recycling. At the simplest level you can have the drain for the shower pass over the cold water feed (and similarly for other appliances), although regulatory requirements in many countries do, I think, require double piping or similar. Nil Einne (talk) 12:04, 16 January 2014 (UTC)


 * Yea, it seems a lot more reasonable to use shower drain water to warm the cold water going to the shower than the cold water intake on the water heater. Think of all the piping needed to do the latter, especially if every drain is set up that way. To make that approach more efficient, I suppose you could send all grey-water to an underground tank, like a septic tank, and run the cold water intake pipes through that. StuRat (talk) 00:18, 18 January 2014 (UTC)


 * I've mentioned this before when it comes to lights: it's flawed to think of all forms of heating in a house as equivalent. I'm not denying the first law of thermodynamics or anything like that, simply advising against spherical cow assumptions. Yes, all the waste heat will be in the room; that doesn't mean it will have the same effect as a well-designed heater suitable for whatever purpose or room you want to heat.
 * The example I used is an oil column heater I had which was actually very poor at heating a room compared to an equivalent wattage of a better brand. I never tested it, but I believe the wattage was accurate. I suspect the power usage with it was lower, since it likely turned off sooner/more often (although even with the thermostat at the maximum setting it was fairly poor), but I also suspect that for an equivalent power usage it was poorer at heating the room, since for whatever reason it was poorer at releasing the heat and so likely remained hot long after it was needed. (Storing heat and releasing it slowly is one of the points of an oil column heater, but not to this level.)
 * I live in NZ, specifically Auckland. Winters here would be considered mild by people from most temperate countries. But our houses still get very cold because the insulation levels are often terrible. (This is the same for most of NZ.) Electrical heating is the most common form. And as you may guess from the poor insulation, central heating is rare. So most commonly people heat their houses on demand, turning on a heater in a room when it's needed and turning it off when it's no longer needed.
 * A computer may actually not be too bad at heating since, as Katie R said, they generally have fans - particularly one high-end enough to be useful for heating. However it may still be poorly located in the room. And in fact, because of its design, a high-end computer may be worse than a fan heater in terms of noise (yet still at significantly lower power, so you'd probably need 4 of them or so to come close to a 2400 W heater). You could probably solve this by designing your computer to be a better heater, but it's not something you can buy off the shelf.
 * Other appliances are more complicated and would vary from location to location. For example, my TV is mounted a bit high for a heater without some form of forced convection. It's also, of course, not that close to where people sit in the living room. Similarly, power sockets are often poorly located, and I wonder how long a typical AC adapter holds its heat (in any case these are too insignificant to be useful).
 * Similarly with lights. As I've remarked before, in most houses they are located near the ceiling, so without some form of forced convection they would likely increase the thermal gradient. So unless you live on the ceiling, or perhaps have a second floor and are talking about lights on the ground floor, they may be worse at delivering useful heat than a well-designed heater (although I admit I think I neglected to mention the significance of infrared for incandescent lights in some of the previous discussions, which obviously helps them a fair bit).
 * In other words, the waste heat from all these may not be completely wasted, but it may also be less useful than the waste heat from a decent electric heater.
 * I would add that most of the appliance or gadget examples aren't actually doing useful work when it's a choice of whether to turn them off or not. The only thing you are saving is the effort of turning them on and off. If there's confusion about what I mean: for most people it's not a case of 'let's turn off my plasma TV even though I want to watch TV, because it's wasting power', but rather 'I'm no longer watching TV and won't for the next 30 minutes - shall I turn it off or leave it on?' Either way I'd get nothing useful from the TV other than the heat. So while it may be less important to turn off the appliance when you're heating the room electrically, since some, most or all of the 'waste heat' simply offsets the heat from the resistive heater, it's at best no worse than simply using the resistive heater. Unlike the computer scenario, it can't be better, because no one is getting something useful either way.
 * And for many of the appliances, unless you're talking about turning them on and off every few minutes or something, I wouldn't assume that leaving them on results in longer lifespans. Sure, voltage spikes, thermal cycling and other stuff that comes from turning a device on and off will negatively affect lifespan. But the increased heat and wear on components from being powered on - including non-solid electrolytic capacitors drying out, etc. - mean leaving it on will often negatively affect lifespan too. And there are often no good studies on which one comes out on top if we're talking about, say, turning it off for 8 hours every day vs leaving it on constantly. I'm not saying you should turn them off because they will last longer. Rather, except in a few cases, you most likely don't know which will be better, so there's no reason to take any action based on effects on the lifespan.
 * I know you said to ignore capital cost, but I think it's worth emphasising that this is probably one of the biggest problems with any such generalised scheme. As Aspro has said, if it's a choice of turning your existing computer off (or, more likely, just leaving it idle, which for any modern computer implies a very low power usage) vs doing Litecoin mining, perhaps it comes out in favour. Similarly with distributed computing tasks, most of which don't give you anything useful but hopefully do something useful for someone. However, for someone to buy a computer to do this, or make one for the purpose, while the energy will be free, the buyer will need to make back its cost over its normal life. For a computer, 4-6 years is probably a fair bet for how long it has to make back its cost. Even if the computer lasts longer, by the end of that period its contribution will be fairly small.
 * And in a normal household situation, it's common that you'll want a heater powerful enough that it doesn't take ages to heat the room, yet you don't want the room to be too hot, so you will turn it off after a while. So, as I mentioned, you'd likely need 3-4 high-end computers to come close to a 2400 W heater - likely at least a US$3000 or so outlay. Even under a scenario where you only have one computer and assume it generates a baseline to replace lost heat and isn't turned off when heat is needed, you may still only be running it for a few hours every day. Then you have to worry about maintenance etc.
 * By comparison, someone can rent space somewhere, install a large number of rack-mount computers, and run them all the time. With a large number, maintenance is simplified, and since they're running them all the time they get much more from their capital outlay, even though they have to pay for their energy - including cooling when the computers are generating too much heat.
 * Sure, you could also run these heater-computers all the time, but this means the free energy is only coming in a small part of the time, so your cost-effectiveness compared to the dedicated scenario starts to fall apart. This includes the fact that an individual, or a bunch of individuals, can't provide the level of reliability, security etc. that a dedicated service can provide. (Remember that when doing useful work, most of the time the operator will need to do some more work - sometimes perhaps even the complete work - to verify the results. I believe many distributed computing projects do some verification, since they have no way of knowing when submitted results are wrong, whether maliciously or accidentally. This is much less needed for someone operating a dedicated centre.)
 * A stark reminder of this: IIRC there have been several attempts since back in the early 2000s to start distributed computing systems where the person doing the work is paid. (And evidently still going: ketanm.hubpages .com/hub/Earn-money-by-selling-your-CPU-Idle-time .) They have had little success, and most of them were under the assumption that people would just be using their current computers - in other words, they weren't even considering capital cost.
 * In other words, since the only cost advantage the individual has is free energy for part of the time for part of the year (and maybe a simplified cooling situation), under most scenarios the economics likely come down in favour of the normal 'cloud computing' scenario, rather than having a large number of individuals with specialised computers as your cloud, even with their occasional free energy.
 * I would add that because the capital costs are high enough, you also have to consider how you're going to work this system. Does the person pay for the heater-computer and then have to recoup the cost over time? Convincing people to do this is likely difficult, not to mention plenty of people may not have the capital (remembering that the people most likely to be willing to do it for the small amounts you're likely to make are also going to be the less well off), or at least don't think they do. Getting back to the NZ example: despite government subsidies etc., many houses don't even have good ceiling or underfloor insulation, or an air heat pump (let alone a hot-water heat pump or solar), despite the fact that many of these will pay back within 6 years (although not every source seems to agree) and then be pure savings, with a useful lifespan likely 10-15 years (probably a fair amount more for the insulation) - and at a cost not much more than your few thousand for several computer 'heater' units.
 * TL;DR: all resistive heaters may produce the same amount of heat, but that doesn't mean they are all as good at usefully heating the room. And free energy is great, but finding something where you can effectively use this free energy - available for a shortish time, for parts of the year, spread out over the country - and do so cheaply and reliably: that's not so simple. At a minimum, you have to find something where the capital cost of the device is not too high and which has a decent useful lifespan; otherwise you'll either never make the cost back within the product's lifespan, or you will but it will take too long and few will bother.
 * Nil Einne (talk) 15:26, 16 January 2014 (UTC)
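The utilisation argument above can be sketched numerically. All figures in this sketch (capital cost, power draw, heating hours, electricity price) are made-up assumptions for illustration:

```python
# Illustrative sketch of the utilisation argument: a home "heater computer"
# only gets free electricity while heat is wanted, so its capital cost is
# spread over far fewer useful hours than a datacentre machine running
# around the clock. All numbers below are assumptions.

def cost_per_compute_hour(capital, lifetime_years, hours_per_year,
                          power_kw, price_per_kwh, energy_is_free):
    """Total cost of ownership divided by total hours of useful compute."""
    hours = lifetime_years * hours_per_year
    energy_cost = 0.0 if energy_is_free else power_kw * price_per_kwh * hours
    return (capital + energy_cost) / hours

# Home heater-computer: free energy, but maybe only 1,200 heating hours/year.
home = cost_per_compute_hour(capital=1000, lifetime_years=5,
                             hours_per_year=1200, power_kw=0.6,
                             price_per_kwh=0.12, energy_is_free=True)

# Datacentre box: pays for power, but runs ~8,760 hours/year.
dc = cost_per_compute_hour(capital=1000, lifetime_years=5,
                           hours_per_year=8760, power_kw=0.6,
                           price_per_kwh=0.12, energy_is_free=False)

print(f"Home heater-computer: ${home:.3f}/compute-hour")
print(f"Datacentre machine:   ${dc:.3f}/compute-hour")
```

Under these assumed numbers the datacentre machine comes out cheaper per compute-hour even while paying for its electricity, because the capital cost dominates at low utilisation - which is the point being made above.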


 * Oil column heaters: I find these to be one of the best forms of zone heating. They have several advantages: They are silent (except for a click or two due to thermal expansion/contraction when they turn on and off), there's no burning smell as dust never hits a red-hot element, fire and burning risk is minimal (especially on the lowest setting), there's no carbon monoxide, soot, or unburned fuel vapors (as you get from kerosene heaters), and there's no cold breeze as you get from other heaters when the fan kicks on before the heater element warms up.  They do have a disadvantage that they tend to heat the ceiling more than the rest of the room.  However, if the floor above it needs heating, too, then this isn't so bad, and consider that other space heaters also tend to primarily heat the immediate area, not the far corners of the room.


 * Leaving electrical devices on: I have some devices which I don't mind wearing out. For example, I have 4 old CRT TVs, which I'd like to replace with LCDs, but I just can't justify throwing out perfectly functional TVs.  Also, I find it quite annoying to turn on and off TVs as I move from one room to another, if I'm watching Jeopardy or some other show, and don't want to miss bits.  Similarly, turning lights on and off is annoying.  And since I have all CFL lights, they tend to last a long time in continued use, but don't do so well with on/off cycles (although I only pay 50 cents a bulb, so replacing them is no big deal, in any case). StuRat (talk) 13:07, 17 January 2014 (UTC)

Historical US counties
Does anyone know how much the number of US counties has changed over time? A current list of counties and county equivalent areas is here and has 3143 entries. I am looking at some old coded data from the 1970s and 80s that has 3225 county codes. The description document simply says each county name was grouped by state, and then numbered alphabetically, but it doesn't provide a list of names so I don't have anything to check against. Could boundary changes have reduced the number of US counties by 80 in the last 30-40 years? I am also looking for geographic coordinates for the various counties (preferably as they existed 40 years ago, but modern data would be a place to start). Dragons flight (talk) 19:52, 15 January 2014 (UTC)
 * You should check the article for each state. For example, List_of_counties_in_Nevada indicates that Ormsby_County,_Nevada went away in 1969, so maybe it is included in your list.   There are probably similar cases in other states.   RudolfRed (talk) 21:07, 15 January 2014 (UTC)
 * List of former United States counties may also be useful. -- Jayron  32  00:45, 16 January 2014 (UTC)
 * NHGIS should have what you want. From their description page: The National Historical Geographic Information System (NHGIS) provides, free of charge, aggregate census data and GIS-compatible boundary files for the United States between 1790 and 2011. Jørgen (talk) 10:43, 16 January 2014 (UTC)