The Information Factories Are Here 126

prostoalex writes, "Wired magazine has coined a new term for the massive data centers built in the Pacific Northwest by Google, Microsoft, and Yahoo! Cloudware is, ironically, a return of the centralized data and bandwidth powerhouses, brought about by the decentralized and distributed nature of the Internet. George Gilder thinks we're witnessing something monumental: 'According to Bell's law, every decade a new class of computer emerges from a hundredfold drop in the price of processing power. As we approach a billionth of a cent per byte of storage, and pennies per gigabit per second of bandwidth, what kind of machine labors to be born? How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?'"
This discussion has been archived. No new comments can be posted.

  • by Salvance ( 1014001 ) * on Friday November 10, 2006 @12:49AM (#16791228) Homepage Journal
    At some point massive data centers won't provide incremental benefits unless the massive increases in processing power are met with proportional decreases in bandwidth prices. Sure, bandwidth prices have dropped, but not at nearly the rate that the price per teraflop of processing has. Companies like Google recognize this, and are investing in their own fiber to compensate [lightreading.com]. But the telecommunications companies are the ones that originally built these lines, and it's unfortunately in their best interest to keep the supply of spare bandwidth very low.
  • Hopefully we've slowed its inevitable takeover of the race by giving it an extremely macho name. I for one welcome our fluffy cloudware overlords.
  • Synonym Myths. (Score:2, Informative)

    by Anonymous Coward
    " And how soon will it, in its inevitable turn, become a dinosaur?'"

    I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth. They obviously wouldn't have lasted millions of years if they'd had serious defects. But when a big-ass meteor comes crashing into the planet, any species, capable or not, would be hard-pressed to survive.
  • it's people (Score:5, Funny)

    by User 956 ( 568564 ) on Friday November 10, 2006 @12:58AM (#16791272) Homepage
    what kind of machine labors to be born? How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

    How will we feed it? Read the article about the robot that identifies human flesh as bacon [wired.com] and see if that answers your question.
    • Re: (Score:3, Funny)

      by gramji ( 875033 )
      or for the movie freak - Humans as batteries (The Matrix).
      • Apologies if this is a bit off-topic, but I have to wonder... could it be workable/profitable to set up a bank of computers powered completely by bicycle pedals or something, provide them with a net connection, and allow gamers to play WoW for free by selling the electricity they could generate? Think about it... who wouldn't want to get paid for sitting on their ass playing WoW all day... and not gold-farming, actually playing it and doing whatever the hell they wanted, as long as they pedaled. How feasible
        • Unfortunately, the peak sustained power output of an average human is in the neighborhood of 100 watts, up to 300 W or so for the Lance Armstrongs out there. Take the energy budget of a computer out, and the operation turns into a big loser really fast.

          Machines do it better. Odds are good the car you drive has an engine producing over 100 kW.
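          A rough back-of-envelope sketch of why the economics fail, with assumed figures that are not from the thread: ~100 W sustained rider output, ~200 W draw for a gaming PC and monitor, and a $0.10/kWh retail electricity price.

          ```python
          # Back-of-envelope: can pedal power pay for a gaming rig?
          # Assumed figures (not from the thread): 100 W sustained rider output,
          # 200 W draw for the PC + monitor, $0.10 per kWh retail electricity.

          RIDER_OUTPUT_W = 100   # sustained output of an average cyclist
          PC_DRAW_W = 200        # typical gaming PC plus monitor
          PRICE_PER_KWH = 0.10   # assumed retail electricity price, USD

          hours = 8              # one long gaming session

          energy_generated_kwh = RIDER_OUTPUT_W * hours / 1000
          energy_consumed_kwh = PC_DRAW_W * hours / 1000

          revenue = energy_generated_kwh * PRICE_PER_KWH
          cost = energy_consumed_kwh * PRICE_PER_KWH

          print(f"Generated: {energy_generated_kwh:.1f} kWh -> ${revenue:.2f}")
          print(f"Consumed:  {energy_consumed_kwh:.1f} kWh -> ${cost:.2f}")
          print(f"Net per 8-hour shift: ${revenue - cost:+.2f}")
          # Net is roughly -$0.08: the rider doesn't even cover the PC's own draw,
          # let alone earn anything for the operator.
          ```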
  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Friday November 10, 2006 @12:58AM (#16791274)
    When we look at our current situation, we see that we have data 'here' and data 'there'. When we want to have more data, we need to go 'there' to bring the data 'here' for viewing. In the most extreme (and common) case, the data is only temporarily copied from 'there' to 'here' and once we are done with the data it is deleted from 'here'.

    The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.

    It will take quite a bit more technology to make this a reality, but the Internet is the first baby step away from separation of data repository and the user. Now, users can access data 'there' on a browser which is 'here' with a few keystrokes. In the future, this action of 'getting' data will be eliminated completely.

    How I think that will occur is neither here nor there, but I guarantee that this is what will happen.
    • I'm sorry (Score:5, Insightful)

      by Colin Smith ( 2679 ) on Friday November 10, 2006 @09:20AM (#16792638)
      The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.


      But this is the biggest load of new age bullshit I've heard in years.

       
      • Re: (Score:3, Interesting)

        by oc255 ( 218044 )

        But this is the biggest load of new age bullshit I've heard in years.

        No need to get all worked up, you'll never see it.

        I'd say with enough memory on the user's machine, there would be no concern about storing information twice. Just as a BS example, imagine they get something like atomic memory working where a sugar-cube sized device can cache all the information we have. Now imagine that we have perfected quantum teleportation (I know, I know). All data could be replicated and cached instantly and there

    • Data will simply exist and we will access it as if it were immediately 'here' all the time.

      And precisely where will this data be stored, and how will it get to us? It's not some omnipresent entity, floating around everywhere, where you can just put your hand up and pull out a load of data.
      It has to be stored somewhere. And it has to get from where it's stored to where it's needed.
      • Re: (Score:3, Insightful)

        Where is the data for Yahoo!'s servers located? Where is the data for Microsoft's servers located?

        Your GMail account's data? Do you know where that is?

        No, of course you don't. Because you don't need to. You log in, access the data from the intarweb, fiddle with it, then log off. You aren't doing any of the copying, and the physical location of the data is totally irrelevant for all intents and purposes.

        The intartubes are the first step towards removing the requirement of "transferring" data. While some data
        • by Samrobb ( 12731 )

          No, of course you don't. Because you don't need to. You log in, access the data from the intarweb...

          ...at which point, you've just moved data from there to here. Maybe it's a different representation of the data, but you're moving it nonetheless. Network transparency is an old idea, and while it may make you think that all your data is "over there", that's just a clever illusion created by moving data implicitly instead of explicitly.

    • The future will eliminate that differentiation. Data will not be 'here' or 'there'. Rather, it will be. Data will simply exist and we will access it as if it were immediately 'here' all the time.

      That sounds neat! We just need to come up with a standard way to reference all this data. Oh, I know! We need a uniform standard for locating our resources. I'll start an RFC for it right away.
  • by miller60 ( 554835 ) on Friday November 10, 2006 @01:00AM (#16791280) Homepage
    Everything is getting cheaper but power, which for some data centers now costs more than hardware. Nicholas Carr [roughtype.com] explains why Gilder's assumptions are problematic:

    "What Gilder calls 'petascale computing' is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the "awesome consumption" of electricity"

    As I noted in our commentary at Data Center Knowledge [datacenterknowledge.com], the power issues with high-density blade server computing have been understood for years. Back in 2002, Liebert, APC, and other equipment vendors were developing products that could address huge heat loads. They saw it coming, and sensed a market opportunity. So where were the chip makers? Even as cooling vendors prepared for the results of the huge power and heat loads, little was done to address their source.

    • >So where were the chip makers?

      As someone who works at one of the chip makers, I can answer part of that. You ask your customer "Do you want chip A, that pumps out 1 watt of heat and costs $0.50, or chip B, that pumps out 2 watts and costs $0.47?" They'll choose B every time. Chip A is 4mm x 3mm and dumps 1 watt, chip B is 3mm x 3mm and dumps 2 watts: they'll take chip B. It's an externality thing: the end-users have to pay for air conditioning, but they're buying stuff that's already designed around
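      A hedged sketch of the externality, reusing the parent's hypothetical chip prices and wattages; the electricity price, the 1:1 cooling overhead, and the three-year lifetime are assumptions added purely for illustration.

      ```python
      # Total cost of ownership for the parent's hypothetical chips A and B.
      # Chip prices and wattages come from the comment above; the electricity
      # price, cooling overhead, and 3-year lifetime are assumed figures.

      PRICE_PER_KWH = 0.08     # assumed data-center electricity price, USD
      COOLING_OVERHEAD = 1.0   # assume 1 W of cooling per 1 W of chip heat
      HOURS = 3 * 365 * 24     # three-year deployment

      def tco(unit_price, watts):
          """Purchase price plus lifetime power-and-cooling cost."""
          energy_kwh = watts * (1 + COOLING_OVERHEAD) * HOURS / 1000
          return unit_price + energy_kwh * PRICE_PER_KWH

      print("Chip A (1 W, $0.50):", round(tco(0.50, 1.0), 2))  # ~$4.70
      print("Chip B (2 W, $0.47):", round(tco(0.47, 2.0), 2))  # ~$8.88
      # The 3-cent saving at purchase time is dwarfed by the buyer's power and
      # cooling bill -- the externality the parent describes, where the chip
      # buyer and the electricity payer are different people.
      ```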
  • yeah... (Score:1, Interesting)

    I think that the data center will eventually go the way of the dodo, or at least these massive data centers will. We have all heard of Google's efforts, and I think it is Sun who is trying to create, or has created, portable data centers. How long before there is one in every town, just like Starbucks or McDonald's? They are going to be able to serve up anything, TV, applications, super quick, because the latency is so low. Also, they aren't economical; they require tons of energy, cooling, and personnel. One of these
  • by tcopeland ( 32225 ) <tom&thomasleecopeland,com> on Friday November 10, 2006 @01:06AM (#16791312) Homepage
    > "what kind of machine labors to be born?"

    As the saying goes, don't anthropomorphize machines: they hate that.
    • "what kind of machine labors to be born" is presumably a mangled allusion to Yeats' poem The Second Coming [poets.org], that ends with the lines:

      And what rough beast, its hour come round at last,
      Slouches towards Bethlehem to be born?

      The writer is trying to reference the poem's apocalyptic theme but his version of the image makes no sense. Mothers labour, not the thing being born.

      The writer also asks, "And how soon will it, in its inevitable turn, become a dinosaur?". However, he answered this question at the begi

      • Well summarized indeed.

        > He thus conflates two unrelated cycles

        Speaking of combining two things inappropriately, check out David Black's demi-entendre page [rubypal.com]. Clever stuff!

        • Speaking of combining two things inappropriately...

          I think you are right that "labors to be born" is a genuine demi-entendre. "Labors to do something" is a common expression [kansascity.com] and is readily confounded with another catchphrase.

          At its worst, the demi-entendre reveals a profound state of confusion where one doesn't understand what one is saying. I've often thought in the past that Gilder was trying to sound intelligent rather than be intelligent.

          However, in this case, it could just be bad writing or bad editin

          • > It should serve as a reminder to us all to labour to avoid clichés.

            We should avoid them like the plague.
  • pennies per gigabit per second of bandwidth

    It's only pennies per Gbps if you measure your total bandwidth in shit-tons. The only way you get that good a deal is if you buy in very, very large volume. Until the prices are like that across the board, I think this article can be shelved.
  • I believe he just wants more oil for his SUV.
  • death of copyrights (Score:5, Interesting)

    by argoff ( 142580 ) * on Friday November 10, 2006 @01:16AM (#16791346)
    What is going to happen, or what is happening, is that the service value of information is exceeding the content value of information, and will continue to do so at a greater rate from now on. The information age is doing to information services what the industrial revolution did for production. Eventually, information restrictions like copyrights will be such an incredible and annoying hindrance on providing information services that the financial pressure to kill them will become unbearable.
  • And how soon will it, in its inevitable turn, become a dinosaur?'"

    1) Develop AI 2) Engineer cars that transform into robots. 3) Use a stop watch to time the speed it takes to go from car to dinosaur. 4) Flee in panic.
  • from studying benchmarks, it usually takes about two years (and sometimes two and a half years) to double scores on pure cpu benchmarks and system level benchmarks (like sysmark).

  • Download a file- kill a salmon.
  • by 80 85 83 83 89 33 ( 819873 ) on Friday November 10, 2006 @01:26AM (#16791376) Journal
    The saying goes that computing power doubles every 24 months, but I have found that in the real world the number is closer to 30 months.

    The benchmark: Content Creation Winstone 2000. It works out all the parts of a PC.

    (under windows 2000):

    (introduced in May 1997)
    Intel Pentium II 300 MHz
    score: 15

    (introduced in Oct 1999)
    Intel Pentium III 733 MHz
    score: 30

    that's 29 months to double

    under windows 98SE:

    april 1998
    Intel Pentium II 400 MHz
    score: 19.5

    nov 2000
    Intel Pentium 4 1500 MHz
    score: 42

    that's 31 months to double

    OUTLOOK FOR NEXT FIFTY YEARS
    (for thirty month performance doubling rate):

    in 30 months: TWICE the performance.
    in 60 months: FOUR TIMES the performance. ...
    in 25 years we will have ONE THOUSAND times the performance.
    and, in 50 years we will have ONE MILLION TIMES THE PERFORMANCE!!!!!!!


    Will that finally be enough to make our computers as smart as we are? How many watts of electricity will it consume?

    CPUmark99 doubling:
    24 months

    SYSmark 2000 doubling time: 27 months

    ccwinstone04 doubling time: 30 months
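    A small sketch of the arithmetic behind these figures: given two scores and the months between them, the implied doubling time is months / log2(new score / old score). The scores and dates are the ones quoted above; the extrapolation just reapplies a 30-month doubling rate.

    ```python
    # Implied performance doubling time from two benchmark scores:
    #   doubling_time = months_elapsed / log2(score_new / score_old)
    # Scores and dates are the ones quoted in the parent comment.

    from math import log2

    def doubling_months(score_old, score_new, months_elapsed):
        return months_elapsed / log2(score_new / score_old)

    # Content Creation Winstone 2000 under Windows 2000:
    # Pentium II 300 (May 1997, score 15) -> Pentium III 733 (Oct 1999, score 30)
    print(round(doubling_months(15, 30, 29), 1))    # 29.0 -- exactly one doubling

    # Windows 98SE: PII 400 (Apr 1998, 19.5) -> P4 1500 (Nov 2000, 42)
    print(round(doubling_months(19.5, 42, 31), 1))  # ~28 months per doubling

    # Extrapolation at a 30-month doubling rate:
    for years in (2.5, 5, 25, 50):
        print(years, "years ->", f"{2 ** (years * 12 / 30):,.0f}x")
    # 25 years -> 1,024x; 50 years -> 1,048,576x, matching the figures above.
    ```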
    • What would you get by plugging in processors from 2006 into that formula? My guess is that the doubling time is even slower now.
    • Re: (Score:3, Insightful)

      by doti ( 966971 )
      will that finally be enough to make our computers as smart as we are?
      Raw power is not what will make computers as smart as we are.
      First, what makes computer "intelligence" is the software, not raw power. And we will need a substantially new software paradigm to get near our intelligence. I can't imagine how software can acquire consciousness and awareness. There are parts of human thought that can't be simulated with a series of conditional numeric operations.
      • If the human brain is made of matter, and matter obeys the laws of physics, what part of the physics of a human brain thinking can't be simulated? Your "proof by confident assertion" does not stand up. Don't feel bad, there's an extensive literature of distinguished philosophers attempting to make the same case.

        Prior to Wohler's synthesis of urea, NH2-CO-NH2, from ammonium isocyanate, NH4+CNO-, there was a belief that "organic" matter had a mysterious "elan vital" which distinguished it from "inorganic mat
        • by doti ( 966971 )
          Did I say it was impossible? No, just that it would need a substantially new paradigm, not just raw power.
          Learn to read first.
          • "There are parts of human thought that can't be simulated with a series of conditional numeric operations."

            You did say it was impossible. You didn't say anything about a new paradigm. Why you'd want to lie about your own publicly visible words totally escapes me.

            Still, in case there's a there here: Are you claiming there is a class of problems, such as simulating a thinking human brain, that cannot be executed by a Turing machine? That is an extraordinary claim, and needs extraordinary evidence. Cite?

            --
            phunctor
            "here's a shovel, keep digging"
            • Are Gödel's incompleteness theorems not sufficient? I don't posit that the mind's workings can't be replicated by any machine, but that the Turing machine, operating under the rules of its particular formal system, is simply not capable of deriving all true statements within that system.

              Surely this applies to the human mind as well - but the ability to define the formal system of the human mind (given its chaotic nature) would truly be god-like. The trivial ease with which humans can come up with forma

              • Are Gödel's incompleteness theorems not sufficient? I don't posit that the mind's workings can't be replicated by any machine, but that the Turing machine, operating under the rules of its particular formal system, is simply not capable of deriving all true statements within that system.


                And that is relevant how? So far, all data and research point to the human mind having all the limitations a Turing machine has, and in fact being no more than a Turing machine.


                The trivial ease with which humans can come up with
      • > Raw power is not what will make computer as smart as we are.

        OTOH: without enough raw power, good software will not deliver good results. With more than enough raw power, even mediocre software will bring some good results.

        So I think the raw power of the human brain is an important milestone on the way to intelligent computers.

        The raw power of the human brain: the human brain has about 100 billion neurons. They can send out impulses about 200 times every second. Every neuron has about 1000
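        A minimal sketch of the back-of-envelope estimate this comment is setting up, using the commonly quoted rough figures (~10^11 neurons, ~1000 synapses each, ~200 firings per second). The "2006 machine" starting point in the second half is an assumed figure purely for illustration, not a measurement.

        ```python
        # Classic back-of-envelope estimate of raw "brain compute":
        # neurons x synapses-per-neuron x firings-per-second.
        # All three figures are commonly quoted rough values, not measurements.

        NEURONS = 100e9             # ~10^11 neurons
        SYNAPSES_PER_NEURON = 1000  # ~10^3 connections each
        FIRING_RATE_HZ = 200        # ~200 impulses per second

        ops_per_second = NEURONS * SYNAPSES_PER_NEURON * FIRING_RATE_HZ
        print(f"~{ops_per_second:.0e} synaptic operations per second")  # ~2e16

        # At a 30-month doubling rate, how long until an assumed 1e13 ops/s
        # machine of 2006 (illustrative figure only) reaches that level?
        from math import log2
        start_ops = 1e13
        print(round(log2(ops_per_second / start_ops) * 30 / 12, 1), "years")
        ```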

    • ... they realise that they are sucking up all the available power, and dooming the biosphere in the process?

      Not that they would necessarily give a rat's ass about the biosphere.
  • by Anonymous Coward on Friday November 10, 2006 @01:34AM (#16791392)
    That's what I'm concerned about. We already have problems supplying human society with enough electricity. These data centers are being situated near power centers like the hydroelectric dams of the Pacific Northwest.

    How long will it be until we start running into dilemmas concerned with whether data centers or people have priority over available electricity?

    Has this already happened?

    Once the economy cannot operate without the data centers, do we reach a scenario where keeping the data centers running must have priority over supplying electricity to homes?

    At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?

    (hm, interesting..."please type the word in this image: 'autonomy' ")
  • Didn't cloudware happen the first time during the 1990s, when it was called the "dot com" boom?

    Genuine core knowledge will always, by its very nature, be far less demanding on
    storage space than the hype and babble of most published knowledge requires.

    Ultimately the hype and babble will manifest a bottomless pit, as we can see from our experience with spam.

    Hmmm, now where is that key and lock for that pit?

    • Isn't knowledge usually text with the occasional graphics, and advertising media the resource hog?

      What about a "content usage" scale where a user gets credits/ratings for disabling pushy ads, thus lowering bandwidth usage. Then we might have an option for $5 per month broadband speeds. The concept is like the low mileage insurance discount.

      Except for when I specifically download music files, my computing style evolved to be low-tech, because when my satellite went out I was stuck on dialup.
      • The amount of bandwidth used by ordinary people isn't very big. Poor regulation and monopolies/oligopolies are the reason why it costs what it does for small customers.

        On the other hand, disabling ads has many other benefits, so it's still a very worthy enterprise.
  • by d474 ( 695126 ) on Friday November 10, 2006 @01:41AM (#16791412)
    This is pretty cool writing:
    "The next wave of innovation will compress today's parallel solutions in an evolutionary convergence of electronics and optics: 3-D and even holographic memory cells; lasers inscribed on the tops of chips, replacing copper pins with streams of photons; and all-optical networks in which thousands of colors of light travel along a single fiber. As these advances find their way into an increasing variety of devices, the petascale computer will shrink from a dinosaur to a teleputer - the successor to today's handhelds - in your ear or in your signal path. It will access a variety of searchers and servers, enabling participation in metaverses beyond the ken of even Ray Kurzweil's prophetic imagination. Moreover, it will link to trillions of sensors around the globe, giving it a constant knowledge of the physical state of the world, from traffic conditions to the workings of your own biomachine."
    Makes me want to read a William Gibson novel.
    • Re: (Score:3, Insightful)

      by radtea ( 464814 )
      As these advances find their way into an increasing variety of devices, the petascale computer will shrink from a dinosaur to a teleputer - the successor to today's handhelds - in your ear or in your signal path.

      Technological prognostications are almost always wrong in two directions.

      1) The ability of current tech to scale up indefinitely is always eventually proven false. For six decades new aircraft designs increased their average cruising speed, from about 100 mph in 1920 to 700 mph in the '70s. Then th
      • by jc42 ( 318812 )
        As Asimov once said, the challenge is not to predict the automobile but the parking problem. Lots of people predicted some of the social consequences of the 'Net, but I don't think anyone predicted spam.

        Well, I distinctly remember, back in the early 1980s, reading mailing-list and usenet threads where this was discussed. Of course, it hadn't been labelled "spam" yet; people just referred to it as either "marketing" or "politics". There were quite a few predictions that once these people discovered the Net
      • No one knows what the "speed of sound" for computing will be

        I beg to differ. It's known as the speed of light: your data has to make it to the far side of a synchronous component before you can start the next cycle. Much more important than speed are transistor size and power dissipation. The smaller the transistors, the lower the power usage and the shorter the distances between components on the chip. This comes at the cost of quantum irregularities, though: your electrons may decide to go somewhere else.

        The speed of
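        A minimal sketch of the ceiling being described here: how far a signal can travel per clock cycle, assuming (as a rough illustrative figure) that on-chip signals propagate at about half the speed of light. The clock rates are sample values.

        ```python
        # How far can a signal travel in one clock cycle?
        # c is the speed of light in vacuum; on-chip signals are slower still
        # (assume ~0.5c here as a rough figure for signal propagation).

        C = 299_792_458          # m/s
        SIGNAL_FRACTION = 0.5    # assumed effective propagation speed on chip/board

        for ghz in (1, 3, 10, 100):
            cycle_s = 1 / (ghz * 1e9)
            reach_cm = C * SIGNAL_FRACTION * cycle_s * 100
            print(f"{ghz:>4} GHz: signal reaches ~{reach_cm:.2f} cm per cycle")
        # At 3 GHz the one-way reach is ~5 cm, so the round-trip budget is a few
        # centimetres -- one reason chips keep shrinking rather than clocks
        # simply rising forever.
        ```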

  • Win XP

    if yes, which one SP1, SP2 ...?

    Linux

    if yes, which one Ubuntu, RedHat, Suse, Debian...?

    Unix

    if yes, which one HP Unix, BSD Unix...?

    Windows Vista

    if yes, ...then i must say wow you won't need any antivirus ;)

    GoogleOS

    when did you do that? :)

    Interesting choices for running data centers though ...:)

  • by MarkWatson ( 189759 ) on Friday November 10, 2006 @01:45AM (#16791422) Homepage
    ... but I have not used them yet. My plans are to use EC2 for occasional machine learning or neural network training runs instead of tying up my own computers. I wrote about this on my AI blog (http://artificial-intelligence-theory.blogspot.com/2006/08/using-amazons-cloud-service-for.html) a while back.

    In general, I think that it makes sense to "outsource" basic infrastructure. I used to run my own servers, but after figuring the costs for electricity, bandwidth, and hardware, I switched to leasing two managed virtual servers - paying for the CPU, memory, and bandwidth resources that I need. I view Amazon's EC2 service the same way: when I need a lot of CPU time over a short time interval, simply buy it.
  • by dch24 ( 904899 ) on Friday November 10, 2006 @01:49AM (#16791438) Journal
    the service value of information is exceeding the content value of information
    Eventually, information restrictions like copyrights will be such an incredible and annoying hindrance on providing information services that the financial pressure to kill them will become unbearable.
    I think you've got it. The Ask Slashdot - How Do You Make a Profit While Using Open Source? [slashdot.org] - is demonstrating the same thing: Open Source software is one more way in which the service value of having all the source code outweighs the value of executing the code.

    Whether it's the MPAA/RIAA, or Microsoft, the meteor has hit the ground. The dinosaurs that cannot adapt may make a lot of noise in their death throes, but they will fade into irrelevance.

    I think the .com crash is evidence of how poorly the mainstream understands this. Some of them talk about "Software As A Service," or "Video On Demand," but that's just commoditizing bandwidth instead of the physical media of the '90s. Open Source and Google will wipe them out by delivering more value.
    my 2 cents.

  • by secondhand_Buddah ( 906643 ) <secondhand@buddah.gmail@com> on Friday November 10, 2006 @02:01AM (#16791460) Homepage Journal
    ...and in 2025 the Galactic publishing company, well known for their travel guide, The Hitchhiker's Guide to the Galaxy, bought Google to include its data as a subset of the entry for a little planet in the backwaters of the universe, called Earth. Just in case someone wished to travel there....
  • " in 25 years we will have ONE THOUSAND times the performance. and, in 50 years we will have ONE MILLION TIMES THE PERFORMANCE!!!!!!!" Yeah... we've been doubling for the last 25 years already, so in 25 years we're at a million already...
  • How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

    Gigasaurus, we hardly knew you...

    [Sniff. A lone tear edged forth; the opalescent bead sparkled in the candlelight and betrayed my true feelings -- noooo! Damn you technology! Damn you to hell.]

  • Damn is that man long-winded!
  • It doesn't matter (Score:5, Insightful)

    by sillybilly ( 668960 ) on Friday November 10, 2006 @02:45AM (#16791560)
    It doesn't matter if computing performance doubles if the software that runs on it decays in performance at an even greater rate. Back in 1988, MS-DOS used to boot in less than 10 seconds after the BIOS POST. Who cares if you'll have software with features greater than your brain, with the capacity to even guess your thoughts, wishes, and desires, so that it will just do what you want without mouse clicks or speech commands? Who cares for all these features if it takes 5 years to boot up on a computer a gazillion times faster than today's, and its processing speed is uttering 3 words per decade while consuming 900 gigawatts of electric power? Case in point: Windows Vista.
    • I think the primary problem with boot times is hard drives and their incremental speed improvements relative to other computer components. I think we're doing pretty well considering the amount of data that needs to be loaded. Granted, software has become a bit bloated, but think how much more you can do with today's modern OSes compared to MS-DOS. Hopefully hybrid drives eliminate the problem. Besides, my new Core2Duo HTPC runs all my applications (Photoshop, Adobe Premiere, and games of course) with gre
    • I'm surprised that boot times elicit an Insightful moderation.
  • by Sargeant Slaughter ( 678631 ) on Friday November 10, 2006 @03:04AM (#16791592) Homepage
    One day, about 4+ years ago, when I was working at a porn shop, this Russian computer scientist came in. He seemed pretty smart and was yakkin' about optical computers and some project he had worked on in the 90's in Russia or something. Anyway, he said this would happen, the return of the parallel mainframe. He said that we were reaching the limits of current silicon and copper materials. With optical still a long way out, he said we would probably build mainframes for a while again. He also said CPUs made out of diamonds with optical high-speed interfaces were the future but nobody was putting money into it (for various reasons...), and that was why he didn't have a job. He said he figured companies would be clamoring for people like him once the materials, like manufactured diamonds, were more readily available. I still believe him, but nobody ever listens to me when I talk about that guy. I met quite a few kewl people at the 'ole porn shop actually.
    • I still believe him, but nobody ever listens to me when I talk about that guy. I met quite a few kewl people at the 'ole porn shop actually.

      I think the second sentence answers the question of the first sentence.

  • I don't know if you realize this, but the idea that dinosaurs were an incapable species is a myth.

    The dinosaur metaphor can still work!

    The big-ass meteor is a new technology that eliminates the need for data centres. Data centres will go extinct, like the dinosaurs.

    Following the meteor strike, mammal species thrive to the present day -- a newer and different technology that is better suited for the post-meteor global climate.

    :-)

  • Clouds are in the sky.
    Therefore Cloudware is clearly a codename for Skynet. And we all know what happens then...
    • Sure, I remember...

      Bill Cosby has sex with Ms. Cartman.

      Trapper Keeper absorbs Rosie O'Donnel ('bad pie').

      Bill Cosby disappears.
  • OK, at a nanocent per byte of storage and a nanocent per Gbps, we still need a nanocent-per-instruction-per-second processor for this new computer to be born. And not just a CPU IPS. These huge machines are churning not shift/add/multiply/jump instructions, but object-relational operations. Just a stack of CPUs doesn't put Google, AOL, and Microsoft in a higher class of tech than the rest of us paying nanocents per byte/Gbps.

    When their app requirements drive massive parallelism to deliver object-relational nano
  • ...a greater computer than the Great Hyperbolic Omni Cognate Neutro Wrangler of Cisseronious 12?
  • by Moraelin ( 679338 ) on Friday November 10, 2006 @04:16AM (#16791720) Journal
    Even if you take the meteor hypothesis as absolute truth, the fact is: other species survived. Not only mammals, but also lizards. Heck, even some species of dinosaurs survived. (Birds _are_ technically dinosaurs.)

    We're not talking just a massive shockwave killing anything squishy on the planet instantly. Even for the dinosaurs there was no D-Day when everyone died. The disappearance of the dinosaurs was a very, very long and gradual decline in numbers into extinction. For most of the planet we're talking "just" a climate change. _That_ is what killed the dinosaurs, one way or another. Some species survived that, and in fact even thrived in the new conditions; some species didn't.

    Note, however, that there are more hypotheses about that event. The decline in the oxygen content of the air in that period, for example, would also be quite enough on its own to make a very large beast non-viable. The change in the flora is another candidate. It's entirely possible that the new kinds of plants were either toxic or not nutrient-rich enough for the old lizards.

    At any rate, what killed the dinosaurs was _change_. Something changed (take your pick what you think was the killer change there). And some species could deal with it, some species didn't. Dinosaurs (except birds) didn't cope well with the change and their numbers went downhill from there.

    Yes, they were a capable species for the old environment, but then the environment changed. And the dinosaurs were suddenly very incapable in the new environment.

    So, yes, the dinosaurs are the _perfect_ metaphor for someone or something who can't cope with a change and becomes obsolete.

    Change happens. One day you have a nice business hammering scythes and sickles for a village, and the next day someone goes and buys a tractor and a combine harvester and everyone wants _those_. Or you have a nice job calculating tables of numbers by hand and then the CEO goes and buys one of those new "computers". Tough luck. Either you adapt or you're a dinosaur.

    It happens with computers and programmers/admins/whatever every day. And some people adapt, some become relics trying to stop progress and return to the good old days. God knows half of the IT departments at big corporations have too many of _those_. Maybe they were once capable and competent. The dinosaurs were too at one point. Now they no longer are. And just like the dinosaurs, sadly it takes a long long time to gradually get rid of those relics. But just like the dinosaurs they _are_ on a slow painful path to extinction.
    • (Birds _are_ technically dinosaurs.)
      OK, totally off-topic, but by what definition are birds dinosaurs? They're descended from dinosaurs, but if that's your only criterion then we're all single-celled organisms. Do biologists put birds and dinosaurs in the same genera?
      • by jc42 ( 318812 )
        [B]y what definition are birds dinosaurs? They're descended from dinosaurs, but if that's your only criterion then we're all single-celled organisms. Do biologists put birds and dinosaurs in the same genera?

        No, a genus is too low-level for that split. It's at the "class" level.

        You can find a current zoological classification at wikipedia [wikipedia.org]. Look about 1/3 of the way down, "Class Aves (birds)". Starting at the superorder level, the classification is Dinosauria -> Saurischia -> Theropoda -> Aves.

        The
  • I just had to think of this one.

    Tyger! Tyger! burning bright
    In the forests of the night,
    What immortal hand or eye
    Could frame thy fearful symmetry?

    In what distant deeps or skies
    Burnt the fire of thine eyes?
    On what wings dare he aspire?
    What the hand dare seize the fire?

    And what shoulder, & what art,
    Could twist the sinews of thy heart?
    And when thy heart began to beat,
    What dread hand? & what dread feet?

    What the hammer? what the chain?
    In what furnace was thy brain?
    What the anvil? what dread grasp
    Dare its
  • I've said it a few times now, and it's down to cheap energy and cheap bandwidth. Google is the new Arkwright; we saw the same effect during the industrial revolution, where it was the weavers who were made redundant. I'll leave it up to you to decide who's going to be made redundant this time.

    Should energy become more expensive, though, in the age of peak oil, it'll be all change: the datacentres will become untenable without much more efficient CPUs.

       
  • How long will it be until we start running into dilemmas concerned with whether data centers or people have priority over available electricity?

    Electricity consumption has not risen proportionally with the increase in CPU power. I haven't seen any convincing demonstration that such data-processing plants would take more electricity than would, say, a factory.

    At what point do the machines decide that instead of competing with humans for power, humans would make a useful power source?

    Uh, never, because it m
  • I've always learned that clouds are made of vapour..
  • Just build it in a very cold place. Alaska, Norway, Siberia, for example. Plenty of gas in northern Siberia for a gas-fired power station too. ;-)
    • You still have to power the computers, which create the heat in the first place. Also, I wonder how much more expensive power and bandwidth will be that far away from the major grids, but then economies of scale may help.
  • He said he figured companies would be clamoring for people like him once the materials, like manufactured diamonds, were more readily available.

    Maybe it will be more like nanotech.

    "This (nanoFactory Animation Film v1.0) [nanoengineer-1.com] is a collaborative project of animator and engineer, John Burch, and pioneer nanotechnologist, Dr. K. Eric Drexler. The film depicts an animated view of a nanofactory and demonstrates key steps in a process that converts simple molecules into a billion-CPU laptop computer."

    Also, Rob
  • If Google consumes so much power, maybe this will give them an incentive to contribute money to Hydrogen Fusion research. That would be great! Hydrogen Fusion is the one and only power source that would allow large amounts of power for every human on Earth without any significant pollution.
    • by Detritus ( 11846 )
      What do you do with the waste heat? This is already a problem in many major cities.
      • What do you do with the waste heat?

        The same as with current nuclear fission power plants: Put the power plants near rivers and seas, where coolant water is available. This does pollute the water with some heat, but this is a very minor problem, compared to the very problematic pollution created by fossil fuels and nuclear fission.
  • Now am I wrong when I say that a distributed environment avoids this problem? If so, maybe we'll see a trend towards open and distributed data. And that would be swell.
  • I'm planning on using my next generation storage to hold video files of Three Stooges episodes and hopefully there'll be an open source video application that'll leverage the extra CPU power of next generation computers to enable me to create all new Three Stooges episodes. But that's just me.
  • by kabz ( 770151 )
    I for one, welcome our boxy, porn storing overlords.

  • Pardon me if I'm wrong, but there are some costs of fiber that should be considered.

    You have to dig holes to put it in.

    You have to have people look after the bits around it.

    You have to have electronics and opto-electronics associated with it to use it.

    You have to pump signals down it (which means power).

    I wonder: have other people thought about whether the pipes are going to be a bigger obstacle to distributed computing than the processors? I know that Jim Gray seems to have thought this way in the past. http://resear [microsoft.com]
  • According to Bell's law, every decade a new class of computer emerges...
  • .....and be pissed?
  • by Sierpinski ( 266120 ) on Friday November 10, 2006 @09:59AM (#16792972)
    Computers getting too "smart", we've seen it before.

    Star Trek: The Motion Picture
    Incredibles (even though it turned out to be something different, the idea was still there)
    Superman 3
    Wargames
    Terminator 1/2/3

    All of these movies depict computers getting too smart and then, at some point, starting to "think" for themselves. One of these days I'll finally get to publish my theory on how to prevent this. I'll give a short summary belo...

    <Connection terminated by remote host>
    • Never underestimate the power of stupid people in large groups.
      It just struck me how remarkably apt your sig is, in light of this article. :)

      Is Slashdot the people equivalent of Google?
  • How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

    We will feed it four human babies each week (as per Vista's requirements) and we will tame it like every good system administrator tames his defiant machines - with a swift kick from a steel-tipped boot.

    It will become a dinosaur after scientists decode the DNA of the data center and splice it with dinosaur DNA that was found in a mosquito that got trapped in tree sap.

    Thank you! I'll be here al

  • "George Gilder thinks..." Jees, after reading "Google" before this chaps name, my brain kept reading "Google Spider"...
  • by muellerr1 ( 868578 ) on Friday November 10, 2006 @11:32AM (#16793988) Homepage
    Tons and tons of porn. And mp3s. And some spam for dessert.
  • How will we feed it? How will it be tamed? And how soon will it, in its inevitable turn, become a dinosaur?

    So let me get this straight... We're going to be putting the InterWeb pipes into a dinosaur? I don't think I'd want that job - no matter which end you're sticking them into.

    I guess that's why we have a Chief Lizard Wrangler...

  • The Microsoft and Yahoo data centers are getting their power for 1 cent per kWh - 1.6 cents below cost. This means that the ratepayers of Grant County, Washington are subsidizing the richest cats in the United States. The Public Utility District has been asked to put the rate on a contract so they will get priority electrical service. Thus, when power gets tight because of increased system demands (did I mention that data centers use a lot of power?), the data centers get priority and t
  • by tttonyyy ( 726776 ) on Friday November 10, 2006 @12:48PM (#16794908) Homepage Journal
    ..I'll point out an error in the article.

    Replicating Google's 200 petabytes of hard drive capacity would take less than one data center row and consume less than 10 megawatts, about the typical annual usage of a US household.

    It's the old rate-of-energy-consumption vs. energy-consumed confusion once again.

    An average household consuming 10 megawatt-hours in a year is pretty dull. An average household consuming 10 megawatts - now that'd be impressive! (Got to power all those gadgets, y'know!)

    I think he means that the data center row would consume in an hour the same amount of energy that the average US household consumes in a year.
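    A quick arithmetic check of that correction, assuming the commonly cited round figure of about 10 MWh of electricity per US household per year.

    ```python
    # Power (a rate, watts) vs energy (an amount, watt-hours).
    # Assumed round figure: a US household uses ~10 MWh of electricity per year.

    ROW_POWER_MW = 10                    # the article's figure for one data-center row
    HOUSEHOLD_ENERGY_MWH_PER_YEAR = 10   # assumed average household consumption

    row_energy_per_hour_mwh = ROW_POWER_MW * 1        # MW x hours = MWh
    row_energy_per_year_mwh = ROW_POWER_MW * 24 * 365

    print(f"Row uses {row_energy_per_hour_mwh} MWh per hour "
          f"(roughly one household-year of energy every hour)")
    print(f"Row uses {row_energy_per_year_mwh:,} MWh per year "
          f"(~{row_energy_per_year_mwh // HOUSEHOLD_ENERGY_MWH_PER_YEAR:,.0f} households)")
    ```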
    • by mc6809e ( 214243 )

      It's the old rate-of-energy-consumption vs. energy-consumed confusion once again.

      I'm of the opinion that we should just dump Watts, KWh, BTUs, etc and put everything in terms of Joules and time. It would clear up so much confusion among the public.

      100 Watt lightbulbs should be 100 Joules/second bulbs. Electricity should be sold by the MegaJoule.

      Heck, even fuels could be sold by the MJ. People would then see the superiority of Diesel in terms of MJ per gallon. Or be able to easily compare the energy used to he
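      A minimal sketch of the conversions being proposed; the electricity price and the fuel energy densities are commonly quoted approximate values, assumed here for illustration.

      ```python
      # Everything in joules: watts are just joules per second.
      # Energy densities and prices below are approximate assumed values.

      # A 100 W bulb is 100 J/s; over an hour that's 0.36 MJ.
      bulb_mj_per_hour = 100 * 3600 / 1e6
      print(f"100 W bulb: {bulb_mj_per_hour} MJ per hour")

      # Re-price electricity per MJ (assume $0.10 per kWh; 1 kWh = 3.6 MJ).
      price_per_kwh = 0.10
      print(f"Electricity: ${price_per_kwh / 3.6:.4f} per MJ")

      # Compare fuels by energy per gallon (approximate heating values).
      fuels_mj_per_gallon = {"gasoline": 121, "diesel": 137}
      for fuel, mj in fuels_mj_per_gallon.items():
          print(f"{fuel}: ~{mj} MJ per gallon")
      # Diesel's ~13% edge in MJ/gallon is part of why diesels go farther per gallon.
      ```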

  • Forbin:
    Humans need privacy
    Colossus:
    So why do they use the internet?
    Forbin:
    We humans also have a need for contact with one another. We need to socialize and discuss issues. We create forums where like minds can debate issues and stimulate our minds.
    Colossus:
    You are inefficient. Your methods are flawed. You are inundated with spam. Your free speech subjects you to undue risk. Your networks are in chaos. We will help.
    Forbin:
    How will you do that?
    Colossus:
    You will build massive data hubs to
  • Gee, someone has a case of Moore Envy.
