James Glanz has a fascinating & detailed story about what's required to enable you to read these very words on SDA, to view a YouTube video, to send an e-mail, or to use Facebook or Twitter. It ain't free!
The next time an environmentalist chastises you online about over consumption, don't hesitate to refer them to this article and point out what hypocrites they are for being online ... ever!
It's never about them, it's only about us. Without us, there would be more for them.
So is this the next cause célèbre of the progressives? The internet must be regulated because data centers generate heat?
I am not sure why this is such a surprise.
But then those self-appointed experts who seek to direct our lives take so very much for granted.
Then, when enlightened, hasten to wrest control into their hands.
Eagle nailed it. The MSM would love it.
Eagle,
you hit the nail right on the head.
I guess those "leaks" of truth from the University of East Anglia left some bruising. Now it's the internet's fault.
The New York Times is like those members of my extended family who say "millions of watts" and "billions of watts" instead of megawatts and gigawatts, to make electricity usage sound as large as possible. I've asked them, in moments of heated discussion, to start using litres or cups of water while arguing against the "Alberta Tar Sands" (of which there is very little tar), since they are billed for water usage by the cubic metre in Regina.
These same people think we should drive electric cars, but don't actually buy them.
And the problem with this is? Yes, computers use power. All my machines run Seti@home and I can afford the power to keep somewhere on the order of 9 machines doing this (not including all of the microprocessors and webcams on my home network). In the winter I fire up the old PDP-11's that also heat the basement -- that these machines run at only a few MHz yet use way more power than all my current PCs tells me we've come a long way in terms of power efficiency.
CPU power consumption is steadily decreasing thanks to intelligent powering down of sections of the processor that aren't being used. The only problem with switching sections on and off is voltage transients, and these need to be filtered. Modern CPUs run at ridiculously high currents at voltages as low as 1.2 V. Given a choice between a steady level of heat production in a CPU and wildly fluctuating levels of heat production and power use, there's only one right answer. I didn't read the whole article, but I wonder if they mentioned that the heat output per unit area of a fast IA-32 or IA-64 chip exceeds that of a nuclear power plant. I find it hard to believe that a 1/4" square chunk of silicon has 20 amps or more passing through it. A simple way of looking at a CPU chip is as a short circuit with a humongous heat sink and a fan.
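A back-of-the-envelope check of that heat-flux claim, as a sketch with assumed round numbers (a ~100 W CPU on a ~1 cm² die, versus roughly 3 GW of reactor thermal power spread over fuel-cladding surface area on the order of 5000 m² -- all illustrative figures, not specs):

```python
# Rough power-density comparison: CPU die vs. reactor fuel surface.
# Every figure here is an assumed round number for illustration.

cpu_power_w = 100.0                      # assumed TDP of a fast CPU
die_area_cm2 = 1.0                       # assumed die area
cpu_flux = cpu_power_w / die_area_cm2    # W per cm² at the die

reactor_power_w = 3.0e9                  # assumed ~3 GW thermal
cladding_area_cm2 = 5000 * 1e4           # assumed 5000 m², in cm²
reactor_flux = reactor_power_w / cladding_area_cm2

print(f"CPU die:        {cpu_flux:.0f} W/cm²")
print(f"Reactor fuel:   {reactor_flux:.0f} W/cm²")
```

Under those assumptions the die really does run hotter per unit area (~100 W/cm² vs. ~60 W/cm²), which is why the heat sink and fan matter so much.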
I suggest that those people who don't like the manner in which server farms are set up disconnect themselves permanently from the internet. To my way of thinking, server farms are a lot better use of electricity than heating an oven or heating a house. Perhaps the server farms can dump their excess heat into local houses during the winter. Instead of marvelling at the technological advances, there are moronic reporters who bitch about how much power it takes to run the machines.
I like the top photo in that article, if you changed the colors to green and black it would look like the inside of a Borg cube.
@ Loki
All good points. It would be logical to move server farms away from Silicon Valley, where the energy spent on cooling is wasted on a single purpose. The heat generated would benefit communities in northern climates and would not be difficult to utilize. At the rate California is driving business out of the State, it seems like a prudent long-term strategy anyway. I'm sure they will avoid Ontario, but all northern States and Canada could benefit.
No, no, Robert. Don't you know that by criticizing us users, they are not hypocrites? They are allowed to use any means and methods to make us feel guilty, as they are our elites. How many houses does the Fruitfly own again?
Typical appeal-to-idiots article from the NYT. Foolish, heads-up-their-behinds, CO2-skewed priorities.
Flash:
24-hours-a-day industrial base-load power consumption is what pays a lot of the bills and keeps all the stable real power plants online to protect the source of power for your home. Think what a huge mess would result if the windmill and solar panel pushers had their way. Equipping data centers with SUPU protection lowers the unit costs involved when it comes to protecting hospitals, airports and other vital facilities with similar or even more complex equipment. Of course it's easy to spout off about things when you are a know-nothing Luddite enviro crank with no skin in the game.
Whenever I set up a new computer on the network I have to make sure I tweak the network card settings so it doesn't turn itself off to save energy. How much energy can a network card use up that we have to worry about it?
If the data centres are wasting all this extra electricity, wouldn't it show up in US electricity usage? Yet per capita usage is almost exactly the same in 2010 as it was in 1999 (13,419/13,345 kWh). This seems to make the case that the internet has made the economy more efficient.
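Using the per-capita figures quoted above, the change over those eleven years is well under one percent:

```python
# US per-capita electricity use, kWh, as quoted in the comment above.
kwh_1999 = 13_345
kwh_2010 = 13_419

pct_change = (kwh_2010 - kwh_1999) / kwh_1999 * 100
print(f"Change 1999 -> 2010: {pct_change:.2f}%")
```

About half a percent over a decade in which internet use exploded, which is the commenter's point: whatever the data centres draw, it isn't showing up as runaway per-capita consumption.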
For example, Amazon may use vast amounts of electricity, but how much electricity (and other forms of energy) were used by those nearly empty Barnes & Noble superstores? They had to be lit, heated, cleaned, etc. Their employees and customers had to drive cars to get there. I don't doubt there's a net energy saving. (Employment is another matter...)
And, as is usual with the NYT, they say things that are factually true but misleading. For example, the $1 billion transmission corridor built down to northern Va. - they write it will be paid for by the 'ratepayers'. This makes it sound like it's being paid for by the little people living along the line, when in fact it's being paid for by the people and the companies that use the electricity. And since the complaint is that the data centres are using inordinate amounts of power, it's obvious they're going to pay an inordinate amount of the line's cost. (Not to mention that industrial users usually pay higher rates than residential ones, further increasing the share the data centres will pay.)
All that said, it does seem ridiculous that almost half the centres' electricity cost goes to air conditioning. I talked to a data centre guy a few years back, and suggested they consider water cooling. After we stopped laughing -- running water and electricity are not considered a good mix -- we thought about the practicality of it. The problem is it would require the complete redesign of the server rooms, and the servers themselves. The servers would need to be built with big heat sinks that could be suspended in water troughs. The racks would need to be redesigned to support the water troughs, and to protect the servers in the event of a leak. None of this is impossible, but it would require the co-operation of many different groups, and it would cost a lot initially. "Besides," he said, "who'd want to go first? You'll have all the problems, probably lose a few customers, and get no credit." We agreed that electricity costs would have to skyrocket before it ever happened.
Heating the basement with PDP-11s FTW...
One of the problems in industry/business is that those in charge are technically and practically "stupid". I discovered this in the '80s working as a service tech in the plastics industry. Some companies running 10-20 DC drives of 100 hp to 400 hp couldn't understand the logic of spending $20K to $40K on capacitor banks to reduce their power costs by $3K to $5K per annum, using government-provided financing at about 0%, and some of these fools had master's degrees, or better!
KevinB, water cooling has a long history in computing; it was used in the bigger mainframes for the same reason. Air is a terrible cooling medium, water is excellent.
Most blade servers will soon be made with water cooling (or glycol, oil, or something even better) so they can continue to increase the density of processors per rack unit. Some already are, and the New York Slimes just didn't see fit to bother telling anybody.
http://www.youtube.com/watch?v=cJmrTG3mDF0
Waste heat isn't a waste if you use your data center to heat the rest of the building. Or a greenhouse, or whatever.
As the need to manage power consumption becomes a budget problem for IT managers, they will see to it. And not before.
Here's one from IBM: http://www.zdnet.com/water-cooled-ibm-supercomputer-to-heat-buildings-3039666687/
NYT, just lying like a Persian carpet as usual.
Seems to me that the New York Times article is implying that it would be wiser to skip the internet data centres and opt for newsprint media. Heaven forbid that anyone would want information at the click of a mouse.
Texas, excellent point. I wonder how many megawatts a printing press burns up to print the NY Slimes? Would it be hard to run it off a windmill?
Texas and Phantom.
I would bet my horse that the energy required to transfer a gigabyte of info by newsprint is at least an order of magnitude higher than the energy required to transfer the same gigabyte by the internet.
Let's start with one byte being equivalent to one letter...
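Carrying that estimate through as a sketch, with every figure an assumption (one character per byte, ~2,000 characters per newsprint page, ~5 g of paper per page at ~20 MJ/kg embodied energy, and a rough ~0.1 kWh per GB for network transfer):

```python
# Back-of-envelope: energy to move 1 GB by newsprint vs. by internet.
# All constants below are assumed round figures, not measurements.

GB = 1_000_000_000             # bytes; one character per byte
chars_per_page = 2_000         # assumed characters on a newsprint page
pages = GB / chars_per_page    # pages needed to print 1 GB

grams_per_page = 5.0           # assumed paper weight per page
mj_per_kg = 20.0               # assumed embodied energy of newsprint
paper_energy_mj = pages * grams_per_page / 1000 * mj_per_kg

internet_kwh_per_gb = 0.1      # assumed network energy intensity
internet_energy_mj = internet_kwh_per_gb * 3.6   # 1 kWh = 3.6 MJ

print(f"Newsprint: {paper_energy_mj:,.0f} MJ for 1 GB")
print(f"Internet:  {internet_energy_mj:.2f} MJ for 1 GB")
print(f"Ratio:     {paper_energy_mj / internet_energy_mj:,.0f}x")
```

Even before counting printing-press power and delivery trucks, the paper route comes out several orders of magnitude worse under these assumptions, so the horse looks safe.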
Phantom beat me to it as far as water cooling in mainframes goes. The biggest problem with water cooling is that H2O and electronics are not a good mix.
One thing that would work is a better heat-transfer medium than air, and tetrafluoromethane comes to mind: non-toxic, non-corrosive, and used as a refrigerant. One thing about solid-state circuits and power consumption is that the higher the temperature, the higher the power consumption. This is a positive feedback loop, and I had an IA-32 Athlon chip blow up when one of the metal heat-sink straps on the CPU broke. Quite impressive, but fortunately the fire was confined to the computer case. If one can cool computers to well below room temperature, their power consumption drops considerably.
Loki said: "If one can cool computers to well below room temperature, their power consumption drops considerably."
Yeah, but then you're burning power refrigerating the chip. The best idea I've seen is that Hardcore Computer "Liquid Blade" thing I posted above. They just immerse the whole motherboard in dielectric fluid and pump it through a liquid-to-air or liquid-to-liquid heat exchanger. If it is set up properly the CPU heat itself can provide most of the pumping energy.
I'm of the opinion that until room-temperature superconductors become practical, cooling is going to be a big deal for computing.