The Information Factories Are Here
prostoalex writes, "Wired magazine has coined a new term for the massive data centers built in the Pacific Northwest by Google, Microsoft, and Yahoo! Cloudware is, ironically, a return to the centralized data and bandwidth powerhouses, brought about by the decentralized and distributed nature of the Internet. George Gilder thinks we're witnessing something monumental: 'According to Bell's law, every decade a new class of computer emerges from a hundredfold drop in the price of processing power. As we approach a billionth of a cent per byte of storage, and pennies per gigabit per second of bandwidth, what kind of machine labors to be born? How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?'"
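A quick sanity check of the arithmetic in that quote (mine, not the article's): a hundredfold price drop per decade works out to the same cadence as the familiar Moore's-law figure.

```python
import math

# Bell's law as quoted: a hundredfold drop in the price of processing
# power every decade. How often must the price halve to achieve that?
halving_years = 10 / math.log2(100)
print(f"price halves every {halving_years:.2f} years "
      f"(~{halving_years * 12:.0f} months)")  # ~1.5 years, ~18 months
```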
Supply of fiber too low for a revolution? (Score:5, Insightful)
Cloudware? (Score:2)
Synonym Myths. (Score:2, Informative)
I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth. They obviously didn't last millions of years because of any defects. But when a big-ass meteor comes crashing into the planet, any species, capable or not, would be hard-pressed to survive.
it's people (Score:5, Funny)
How will we feed it? Read the article about the robot that identifies human flesh as bacon [wired.com] and see if that answers your question.
Re: (Score:3, Funny)
Re: (Score:2)
No. Not profitable at all. (Score:2)
Machines do it better. Odds are good the car you drive has an engine producing >100 kW.
Losing the difference between here and there (Score:3, Interesting)
The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.
It will take quite a bit more technology to make this a reality, but the Internet is the first baby step away from the separation of data repository and user. Now, users can access data 'there' on a browser which is 'here' with a few keystrokes. In the future, this action of 'getting' data will be eliminated completely.
How I think that will occur is neither here nor there, but I guarantee that this is what will happen.
I'm sorry (Score:5, Insightful)
But this is the biggest load of new age bullshit I've heard in years.
Re: (Score:3, Interesting)
No need to get all worked up, you'll never see it.
I'd say with enough memory on the user's machine, there would be no concern about storing information twice. Just as a BS example, imagine they get something like atomic memory working where a sugar-cube sized device can cache all the information we have. Now imagine that we have perfected quantum teleportation (I know, I know). All data could be replicated and cached instantly and there
Re:Losing the difference between here and there (Score:4, Insightful)
And precisely where will this data be stored, and how will it get to us? It's not some omnipresent entity floating around everywhere that you can reach your hand into and pull a load of data out of.
It has to be stored somewhere. And it has to get from where it's stored to where it's needed.
Re: (Score:3, Insightful)
Your GMail account's data? Do you know where that is?
No, of course you don't. Because you don't need to. You log in, access the data from the intarweb, fiddle with it, then log off. You aren't doing any of the copying, and the physical location of the data is totally irrelevant for all intents and purposes.
The intartubes are the first step towards removing the requirement of "transferring" data. While some data
Re: (Score:2)
...at which point, you've just moved data from there to here. Maybe it's a different representation of the data, but you're moving it nonetheless. Network transparency is an old idea, and while it may make you think that all your data is "over there", that's just a clever illusion created by moving data implicitly instead of explicitly.
Re: (Score:2)
That sounds neat! We just need to come up with a standard way to reference all this data. Oh, I know! We need a uniform standard for locating our resources. I'll start an RFC for it right away.
Why Gilder Is Telecosmically Wrong (Score:5, Insightful)
"What Gilder calls 'petascale computing' is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the "awesome consumption" of electricity"
As I noted in our commentary at Data Center Knowledge [datacenterknowledge.com], the power issues with high-density blade server computing have been understood for years. Back in 2002, Liebert, APC, and other equipment vendors were developing products that could address huge heat loads. They saw it coming, and sensed a market opportunity. So where were the chip makers? Even as cooling vendors prepared for the results of the huge power and heat loads, little was done to address their source.
Re: (Score:2)
As someone who works at one of the chip makers, I can answer part of that. You ask your customer "Do you want chip A, that pumps out 1 watt of heat and costs $0.50, or chip B, that pumps out 2 watts and costs $0.47?" They'll choose B every time. Chip A is 4mm x 3mm and dumps 1 watt, chip B is 3mm x 3mm and dumps 2 watts: they'll take chip B. It's an externality thing: the end-users have to pay for air conditioning, but they're buying stuff that's already designed around
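A rough sketch of the externality being described, with assumed (hypothetical) numbers: $0.10/kWh electricity, a 3-year 24/7 service life, and about one watt of cooling for every watt of heat removed.

```python
# Compare the buyer's savings on chip B against the operator's
# lifetime energy bill for its extra heat. Every figure here is an
# assumption for illustration, not vendor data.
extra_watts = 1.0             # chip B dissipates 1 W more than chip A
cooling_overhead = 1.0        # assumed: ~1 W of HVAC per W of IT load
price_per_kwh = 0.10          # assumed electricity price, USD
hours = 3 * 365 * 24          # assumed 3-year, 24/7 service life

energy_cost = extra_watts * (1 + cooling_overhead) * hours / 1000 * price_per_kwh
chip_savings = 0.50 - 0.47    # from the example above
print(f"extra energy cost ${energy_cost:.2f} vs chip savings ${chip_savings:.2f}")
# ~$5.26 vs $0.03: the chip buyer rarely pays the power bill.
```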
yeah... (Score:1, Interesting)
The machine that labors... (Score:5, Funny)
As the saying goes, don't anthropomorphize machines: they hate that.
Re: (Score:2)
"what kind of machine labors to be born" is presumably a mangled allusion to Yeats' poem The Second Coming [poets.org], that ends with the lines:
The writer is trying to reference the poem's apocalyptic theme, but his version of the image makes no sense: mothers labour, not the thing being born.
The writer also asks, "And how soon will it, in its inevitable turn, become a dinosaur?". However, he answered this question at the beginning.
Re: (Score:2)
> He thus conflates two unrelated cycles
Speaking of combining two things inappropriately, check out David Black's demi-entendre page [rubypal.com]. Clever stuff!
Re: (Score:1)
Speaking of combining two things inappropriately...
I think you are right that "labors to be born" is a genuine demi-entendre. "Labors to do something" is a common expression [kansascity.com] and is readily confounded with another catchphrase.
At its worst, the demi-entendre reveals a profound state of confusion where one doesn't understand what one is saying. I've often thought in the past that Gilder was trying to sound intelligent rather than be intelligent.
However, in this case, it could just be bad writing or bad editing.
Re: (Score:2)
We should avoid them like the plague.
Ya right (Score:2)
It's only pennies per Gbps if you measure your total bandwidth in shit-tons. The only way you get that good a deal is if you buy in very, very large volume. Until the prices are like that across the board, I think this article can be shelved.
Re:Synonym Myths. (Score:1)
death of copyrights (Score:5, Interesting)
Stupid joke (Score:2)
1) Develop AI.
2) Engineer cars that transform into robots.
3) Use a stopwatch to time how fast they go from car to dinosaur.
4) Flee in panic.
Re: (Score:1)
Or that's what I thought; must be a glitch in the Matrix.
power doubles about every two years (Score:1)
salmon-ware? (Score:2)
Re:power doubles about every two years (Score:4, Interesting)
The benchmark: Content Creation Winstone 2000. It works out all the parts of a PC.
(under Windows 2000):
(introduced in May 1997)
Intel Pentium II 300 MHz
score: 15
(introduced in Oct 1999)
Intel Pentium III 733 MHz
score: 30
that's 29 months to double
under Windows 98SE:
April 1998
Intel Pentium II 400 MHz
score: 19.5
Nov 2000
Intel Pentium 4 1500 MHz
score: 42
that's 31 months to double
OUTLOOK FOR THE NEXT FIFTY YEARS
(for a thirty-month performance doubling rate):
in 30 months: TWICE the performance.
in 60 months: FOUR TIMES the performance.
in 25 years we will have ONE THOUSAND times the performance.
and in 50 years we will have ONE MILLION TIMES THE PERFORMANCE!!!!!!!
Will that finally be enough to make our computers as smart as we are? How many watts of electricity will it consume?
CPUmark99 doubling time: 24 months
SYSmark 2000 doubling time: 27 months
Content Creation Winstone 2004 doubling time: 30 months
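A minimal sketch of the extrapolation these numbers support, assuming a fixed doubling time (30 months per the Winstone figures; swap in 24 or 27 for the other benchmarks):

```python
# Performance multiplier after a given number of months, for a fixed
# doubling time. Pure compounding, no physics.
def perf_multiplier(months, doubling_months=30):
    return 2 ** (months / doubling_months)

for years in (2.5, 5, 25, 50):
    print(f"{years:>4} years: {perf_multiplier(years * 12):,.0f}x")
# 2.5 years: 2x; 5 years: 4x; 25 years: ~1,024x; 50 years: ~1,048,576x
```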
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
First, what makes computer "intelligence" is the software, not raw power. And we will need a substantially new software paradigm to get anywhere near our intelligence. I can't imagine how software could acquire consciousness and awareness. There are parts of human thought that can't be simulated with a series of conditional numeric operations.
Because GHOD said so that's why (Score:1, Interesting)
Prior to Wöhler's synthesis of urea, NH2-CO-NH2, from ammonium cyanate, NH4+CNO-, there was a belief that "organic" matter had a mysterious "élan vital" which distinguished it from "inorganic matter."
Re: (Score:2)
Learn to read first.
"There are parts of the human thought that can't (Score:2, Insightful)
You did say it was impossible. You didn't say anything about a new paradigm. Why you'd want to lie about your own publicly visible words totally escapes me.
Still, in case there's a there here: Are you claiming there is a class of problems, such as simulating a thinking human brain, that cannot be executed by a Turing machine? That is an extraordinary claim, and needs extraordinary evidence. Cite?
--
phunctor
"here's a shovel, keep digging"
Re:"There are parts of the human thought that can' (Score:2)
Surely this applies to the human mind as well - but the ability to define the formal system of the human mind (given its chaotic nature) would truly be god-like. The trivial ease with which humans can come up with forma
Re: (Score:1)
Are Gödel's incompleteness theorems not sufficient? I don't posit that the mind's workings can't be replicated by any machine, but that the Turing machine, operating under the rules of its particular formal system, is simply not capable of deriving all true statements within that system.
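For reference, an informal paraphrase of the theorem being invoked here (my wording, not the poster's):

```latex
\textbf{G\"odel I (informal).} If $T$ is a consistent, effectively
axiomatizable theory that interprets elementary arithmetic, then there
is a sentence $G_T$, true in the standard model, with
\[
  T \nvdash G_T \quad\text{and}\quad T \nvdash \lnot G_T .
\]
```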
And how is that relevant? So far, all data and research indicate that the human mind has all the limitations a Turing machine has, and is in fact no more than a Turing machine.
The trivial ease with which humans can come up with
will that finally be enough to make our computers (Score:2)
OTOH: without enough raw power, good software will not deliver good results. With more than enough raw power, even mediocre software will bring some good results.
So I think the raw power of the human brain is an important milestone on the way to intelligent computers.
The raw power of the human brain: the human brain has about 100 billion neurons. They can send out impulses about 200 times every second. Every neuron has about 1000 synaptic connections to other neurons.
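Taking those (rough, commonly cited) figures at face value, the back-of-the-envelope throughput works out as follows; these are order-of-magnitude guesses, not measurements.

```python
# Raw synaptic-event throughput implied by the figures above.
neurons = 100e9              # ~100 billion neurons (commonly cited)
firings_per_second = 200     # peak firing rate, per the post
synapses_per_neuron = 1000   # rough figure

ops_per_second = neurons * firings_per_second * synapses_per_neuron
print(f"~{ops_per_second:.0e} synaptic events/second")  # ~2e+16
```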
How smart will computers have to get before... (Score:2)
Not that they would necessarily give a rat's ass about the biosphere.
Re: (Score:2)
Re:Why Gilder Is Telecosmically Wrong (Score:3, Interesting)
How long will it be until we start running into dilemmas over whether data centers or people have priority for available electricity?
Has this already happened?
Once the economy cannot operate without the data centers, do we reach a scenario where keeping the data centers running must have priority over supplying electricity to homes?
At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?
(hm, interesting..."please type the word in this image: 'autonomy' ")
There's still lots of room for increased production (Score:2)
version 2??? (Score:2)
Genuine core knowledge will always, by its very nature, be far less demanding of storage space than the hype and babble that most published knowledge requires.
Ultimately the hype and babble will become a bottomless pit, as we can see from our experience with spam.
Hmmm, now where is that key and lock for that pit?
Re: Knowledge Usage Ratings (Score:1)
What about a "content usage" scale where a user gets credits/ratings for disabling pushy ads, thus lowering bandwidth usage? Then we might have an option for $5-per-month broadband speeds. The concept is like the low-mileage insurance discount.
Except when I specifically download music files, my computing style has evolved to be low-tech, because when my satellite connection went out I was stuck on dialup.
Re: (Score:2)
On the other hand, disabling ads has many other benefits, so it's still a very worthy enterprise.
The last page of TFA... (Score:4, Insightful)
Re: (Score:3, Insightful)
Technological prognostications are almost always wrong in two directions.
1) The ability of current tech to scale up indefinitely is always eventually proven false. For six decades, new aircraft designs increased their average cruising speed, from about 100 mph in 1920 to 700 mph in the '70s. Then th
Re: (Score:2)
Well, I distinctly remember, back in the early 1980s, reading mailing-list and usenet threads where this was discussed. Of course, it hadn't been labelled "spam" yet; people just referred to it as either "marketing" or "politics". There were quite a few predictions that once these people discovered the Net
Re: (Score:2)
I beg to differ. It's known as the speed of light: your data has to make it to the far side of a synchronous component before you can start the next cycle. Much more important than speed is the size of transistors and power dissipation. The smaller the transistors, the lower the power usage and the shorter the distances to components on the chip. This comes at the cost of quantum irregularities, though; your electrons may decide to go somewhere else.
The speed of
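To put a number on that light-speed limit (a simplification: signals in real interconnect travel at only a fraction of c):

```python
# Distance a signal can cover in one clock cycle at lightspeed.
C = 299_792_458  # speed of light in vacuum, m/s

for ghz in (1, 3, 10):
    seconds_per_cycle = 1 / (ghz * 1e9)
    print(f"{ghz:>2} GHz: {C * seconds_per_cycle * 100:5.1f} cm per cycle")
# 1 GHz: ~30 cm; 3 GHz: ~10 cm; 10 GHz: ~3 cm
```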
And they run on .... ? (Score:1)
Windows XP
if yes, which one SP1, SP2 ...?
Linux
if yes, which one Ubuntu, RedHat, Suse, Debian...?
Unix
if yes, which one HP Unix, BSD Unix...?
Windows Vista
if yes, ...then I must say wow, you won't need any antivirus ;)
GoogleOS
when did you do that? :)
Interesting choices for running data centers though ...:)
I have signed up for S3 and EC2 (Score:3, Interesting)
In general, I think that it makes sense to "outsource" basic infrastructure. I used to run my own servers, but after figuring the costs of electricity, bandwidth, and hardware, I switched to leasing two managed virtual servers - paying for the CPU, memory, and bandwidth resources that I need. I view Amazon's EC2 service the same way: when I need a lot of CPU time over a short interval, I simply buy it.
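A hedged sketch of that build-vs-lease arithmetic, with made-up numbers (every figure below is an assumption, not the poster's actual costs):

```python
# Monthly cost of owning a small server vs leasing a managed VPS.
own_hardware = 1500 / 36            # $1,500 box amortized over 3 years
own_power = 0.150 * 24 * 30 * 0.10  # 150 W, 24/7, at $0.10/kWh
own_bandwidth = 40.0                # assumed bandwidth/colo fee, $/month
lease = 60.0                        # assumed managed VPS, $/month

own_total = own_hardware + own_power + own_bandwidth
print(f"own ~${own_total:.0f}/mo vs lease ~${lease:.0f}/mo")
# The lease also sheds the sysadmin time, which is the real cost.
```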
Re: death of copyrights (Score:3, Interesting)
Whether it's the MPAA/RIAA, or Microsoft, the meteor has hit the ground. The dinosaurs that cannot adapt may make a lot of noise in their death throes, but they will fade into irrelevance.
I think the
my 2 cents.
Earth History 2025 (Score:3, Funny)
it's going faster than you think (Score:1)
Gigasaurus? (Score:2)
Gigasaurus, we hardly knew you...
[Sniff. A lone tear edged forth; the opalescent bead sparkled in the candlelight and betrayed my true feelings -- noooo! Damn you technology! Damn you to hell.]
Gilder (Score:2)
It doesn't matter (Score:5, Insightful)
Blame it on the hard drives (Score:1)
Re: (Score:1)
I have a weird related story... (Score:3, Interesting)
Re: (Score:2)
I think the second sentence answers the question of the first sentence.
Re:Synonym Myths. (Score:1, Interesting)
The dinosaur metaphor can still work!
The big-ass meteor is a new technology that eliminates the need for data centres. Data centres will go extinct, like the dinosaurs.
Following the meteor strike, mammal species thrive to the present day -- a newer and different technology that is better suited for the post-meteor global climate.
:-)
Re:Cloudware? (Score:1)
Therefore Cloudware is clearly a codename for Skynet. And we all know what happens then...
Re: (Score:2)
Bill Cosby has sex with Ms. Cartman.
Trapper Keeper absorbs Rosie O'Donnell ('bad pie').
Bill Cosby disappears.
Gigacomputing (Score:2)
When their app requirements drive massive parallelism to deliver object-relational nano
But will it be... (Score:2)
Yes and no. Mostly you missed the point, sorry (Score:4, Insightful)
We're not talking just a massive shockwave killing anything squishy on the planet instantly. Even for the dinosaurs there was no D-Day when everyone died. The disappearance of the dinosaurs was a very, very long and gradual decline in their numbers into extinction. For most of the planet we're talking "just" a climate change. _That_ is what killed the dinosaurs, one way or another. Some species survived it, and in fact even thrived in the new conditions; some didn't.
Note, however, that there are other hypotheses about that event. The decline in the oxygen content of the air in that period, for example, would on its own have been quite enough to make a very large beast non-viable. The change in flora is another candidate. It's entirely possible that the new kinds of plants were either toxic or not nutrient-rich enough for the old lizards.
At any rate, what killed the dinosaurs was _change_. Something changed (take your pick as to what you think the killer change was), and some species could deal with it while some couldn't. Dinosaurs (except birds) didn't cope well with the change, and their numbers went downhill from there.
Yes, they were a capable species in the old environment, but then the environment changed. And the dinosaurs were suddenly very incapable in the new environment.
So, yes, the dinosaurs are the _perfect_ metaphor for someone or something who can't cope with a change and becomes obsolete.
Change happens. One day you have a nice business hammering scythes and sickles for a village, and the next day someone goes and buys a tractor and a combine harvester and everyone wants _those_. Or you have a nice job calculating tables of numbers by hand and then the CEO goes and buys one of those new "computers". Tough luck. Either you adapt or you're a dinosaur.
It happens with computers and programmers/admins/whatever every day. And some people adapt, some become relics trying to stop progress and return to the good old days. God knows half of the IT departments at big corporations have too many of _those_. Maybe they were once capable and competent. The dinosaurs were too at one point. Now they no longer are. And just like the dinosaurs, sadly it takes a long long time to gradually get rid of those relics. But just like the dinosaurs they _are_ on a slow painful path to extinction.
OT - birds and dinosaurs (Score:1)
Re: (Score:2)
No, a genus is too low-level for that split. It's at the "class" level.
You can find a current zoological classification at wikipedia [wikipedia.org]. Look about 1/3 of the way down, "Class Aves (birds)". Starting at the superorder level, the classification is Dinosauria -> Saurischia -> Theropoda -> Aves.
The
The Tyger (Score:1)
Tyger! Tyger! burning bright
In the forests of the night,
What immortal hand or eye
Could frame thy fearful symmetry?
In what distant deeps or skies
Burnt the fire of thine eyes?
On what wings dare he aspire?
What the hand dare seize the fire?
And what shoulder, & what art,
Could twist the sinews of thy heart?
And when thy heart began to beat,
What dread hand? & what dread feet?
What the hammer? what the chain?
In what furnace was thy brain?
What the anvil? what dread grasp
Dare its deadly terrors clasp!
Economically inevitable (Score:2)
Should energy become more expensive, though, in the age of peak oil, it'll be all change: the datacentres will become untenable without much more efficient CPUs.
Re:Why Gilder Is Telecosmically Wrong (Score:2)
Electricity consumption has not risen proportionally with the increase in CPU power. I haven't seen any convincing demonstration that such data-processing plants would take more electricity than, say, a factory would.
At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?
Uh, never, because it m
Bad name? (Score:2)
Solution to air conditioning costs (Score:1, Interesting)
Re: (Score:1)
Re: (Score:1)
Energy provided by local gas supply.
Re:I have a weird related story... (Score:1)
Maybe it will be more like nanotech.
"This (nanoFactory Animation Film v1.0) [nanoengineer-1.com] is a collaborative project of animator and engineer, John Burch, and pioneer nanotechnologist, Dr. K. Eric Drexler. The film depicts an animated view of a nanofactory and demonstrates key steps in a process that converts simple molecules into a billion-CPU laptop computer."
Also, Rob
Power to the people! (Score:2)
Re: (Score:2)
Re: (Score:2)
The same as with current nuclear fission power plants: put the power plants near rivers and seas, where cooling water is available. This does pollute the water with some heat, but that is a very minor problem compared to the very problematic pollution created by fossil fuels and nuclear fission.
Re:Supply of fiber too low for a revolution? (Score:1)
my plans for next generation hardware (Score:2)
boxy (Score:2)
Fiber costs (Score:2)
You have to dig holes to put it in.
You have to have people look after the bits around it.
You have to have electronics and opto-electronics associated with it to use it.
You have to pump signals down it (which means power).
I wonder: have other people thought about whether the pipes are going to be a bigger obstacle to distributed computing than the processors? I know that Jim Gray seems to have thought this way in the past. http://resear [microsoft.com]
And how soon will it...become a dinosaur? (Score:2)
How soon will it wake up...... (Score:1)
The future has been forseen (Score:3, Funny)
Star Trek: The Motion Picture
Incredibles (even though it turned out to be something different, the idea was still there)
Superman 3
Wargames
Terminator 1/2/3
All of these movies depict computers getting too smart and then, at some point, starting to "think" for themselves. One of these days I'll finally get to publish my theory on how to prevent this. I'll give a short summary belo...
<Connection terminated by remote host>
Re: (Score:2)
Is Slashdot the people equivalent of Google?
Answers (Score:1)
We will feed it four human babies each week (as per Vista's requirements) and we will tame it like every good system administrator tames his defiant machines - with a swift kick from a steel-tipped boot.
It will become a dinosaur after scientists decode the DNA of the data center and splice it with dinosaur DNA that was found in a mosquito that got trapped in tree sap.
Thank you! I'll be here all week.
George Gilder thinks... (Score:1)
What will we feed it? What we always feed it: (Score:3, Funny)
Wait a minute... (Score:2)
So let me get this straight... We're going to be putting the InterWeb pipes into a dinosaur? I don't think I'd want that job - no matter which end you're sticking them into.
I guess that's why we have a Chief Lizard Wrangler...
Microsoft - getting richer by underpaying costs (Score:1)
In the age honored slashdot tradition... (Score:3, Insightful)
It's the old rate-of-energy-consumption vs. energy-consumed confusion once again.
An average household consuming 10 megawatt-hours in a year is pretty dull. An average household consuming 10 megawatts - now that'd be impressive! (Got to power all those gadgets, y'know!)
I think he means that the data center row would consume in an hour the same amount of energy that the average US household consumes in a year.
Re: (Score:2)
I'm of the opinion that we should just dump watts, kWh, BTUs, etc. and put everything in terms of joules and time. It would clear up so much confusion among the public.
100-watt lightbulbs should be 100-joules-per-second bulbs. Electricity should be sold by the megajoule.
Heck, even fuels could be sold by the MJ. People would then see the superiority of Diesel in terms of MJ per gallon. Or be able to easily compare the energy used to he
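A sketch of the conversions being argued for, plus the rate-vs-amount distinction from the grandparent post; the fuel figures are approximate, commonly cited energy densities, not exact values.

```python
MJ_PER_KWH = 3.6

# 10 MWh/year (the "average household" example) as an average rate:
household_mj_per_year = 10_000 * MJ_PER_KWH
avg_watts = household_mj_per_year * 1e6 / (365 * 24 * 3600)
print(f"10 MWh/year = {household_mj_per_year:,.0f} MJ/year "
      f"= ~{avg_watts:,.0f} W average draw")

# Fuels by the MJ (approximate energy densities per US gallon):
diesel, gasoline = 138, 129  # MJ/gallon, rough figures
print(f"diesel ~{diesel} MJ/gal vs gasoline ~{gasoline} MJ/gal")
```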
The Forbin Project (Score:1)
Forbin:
Humans need privacy.
Colossus:
So why do they use the internet?
Forbin:
We humans also have a need for contact with one another. We need to socialize and discuss issues. We create forums where like minds can debate issues and stimulate our minds.
Colossus:
You are inefficient. Your methods are flawed. You are inundated with spam. Your free speech subjects you to undue risk. Your networks are in chaos. We will help.
Forbin:
How will you do that?
Colossus:
You will build massive data hubs to
Bell's law? (Score:2)
Re: (Score:1)
Re: (Score:1)
Maybe we should call it Kurzweil's law, but he believes that the doubling time gets shorter and shorter.