
Gates: Hardware, Not Software, Will Be Free 993

orthogonal writes "That's small-'f', not capital-'F' free: according to Bill Gates, "Ten years out, in terms of actual hardware costs you can almost think of hardware as being free -- I'm not saying it will be absolutely free --...." Gates expects this almost free hardware to support two of the longest awaited breakthroughs in computing: real speech and handwriting recognition. He further predicts -- ugh! -- that software will not be written but visually designed."
This discussion has been archived. No new comments can be posted.

  • Visual design (Score:5, Insightful)

    by SlashDread ( 38969 ) on Tuesday March 30, 2004 @10:01AM (#8713172)

    but who will visually debug the visual designer?

    • Re:Visual design (Score:5, Insightful)

      by The One KEA ( 707661 ) on Tuesday March 30, 2004 @10:03AM (#8713189) Journal
      Indeed - and how likely is it that a visually-designed program will be even worse than a text program, considering that most programs will end up "looking pretty" in the program editor but act positively horrible for the end user...
      • considering that most programs will end up "looking pretty" in the program editor but act positively horrible for the end user...
        Sounds like Windows
        • Re:Visual design (Score:5, Insightful)

          by MooCows ( 718367 ) on Tuesday March 30, 2004 @10:19AM (#8713382)
          Not that I've seen the source code, but something tells me Windows does not "look pretty" on Microsoft's side.
      • Re:Visual design (Score:5, Insightful)

        by Dashing Leech ( 688077 ) on Tuesday March 30, 2004 @10:33AM (#8713537)
        ...how likely is it that a visually-designed program will be even worse than a text program...

        Depends on how it is done. There are some well-designed visual modeling & simulation development tools for electronics (Simulink, PSpice, etc.) and mechanical systems (finite element, etc.). These do a relatively good job of simulating "systems". Software processes are not that different from physical processes in electronics and mechanical systems. Software rules (e.g., syntax) are analogous to physical laws.

        I actually think this is a good idea, if done properly (i.e., not by Microsoft). I'd be a little surprised if this hasn't already been done; I guess nobody has done it well yet.

        Perhaps a good open source project. In fact, it could be a big stepping stone for open source. If visual programming (no, not as in Visual C/C++, Basic, etc.) makes programming easier and faster, think of how many more people (like me) could get involved in open source projects. I actually really like this idea.

        • Re:Visual design (Score:5, Interesting)

          by Anonymous Coward on Tuesday March 30, 2004 @10:49AM (#8713687)
          Has been done both for Smalltalk (Parts) and for C++ etc. (VisualAge). Both Parts programs and VA programs tend to become an unwieldy mesh of colored lines going from buttons to functions to data and back to UI fields, etc.

          Just try to visualize (pun intended) a fairly simple event driven program with lines connecting all events, triggers, functions, data and UI components and you get the idea.

        • by Anonymous Coward on Tuesday March 30, 2004 @10:56AM (#8713766)
          Depends on how it is done. There are some well-designed visual modeling & simulation development tools for electronics (Simulink, PSpice, etc.) and mechanical systems (finite element, etc.). These do a relatively good job of simulating "systems". Software processes are not that different from physical processes in electronics and mechanical systems. Software rules (e.g., syntax) are analogous to physical laws.

          There are some caveats to this statement, however. First, software systems have discrete, digital states rather than analog behavior. That makes them quite susceptible to error behavior in boundary cases. And the state space for software is extremely large. Universal use of components developed in either an object-oriented or functional way could divide this state space up into manageable components. But one issue that is often overlooked by methodology enthusiasts is that this only increases the size of the building blocks and decreases the number of blocks used for a particular size of project. It does not eliminate the problem that bigger programs are made from a larger number of component parts. The complexity of a program grows as a function of the complexity of the underlying problem. You can change the function with different tools, but the relationship will still exist.
        • Re:Visual design (Score:5, Interesting)

          by prell ( 584580 ) on Tuesday March 30, 2004 @11:21AM (#8714042) Homepage
          Just a general note, since it looks like it came up a couple times: I don't think Gates meant RAD or anything RAD-like. Note: "He further predicts -- ugh! -- that software will not be written but visually designed."

          Software is written because software is a set of instructions. Software is a set of scripts that respond to events. If software were spatial and totally right-brain (and analogous to engineering or construction), AI would work, and software would probably rely on the immutable laws of physics and chemistry, rather than homespun rules. When I write software, it is frequently because I am taking a "break" from other totally creative pursuits.

          The only visual constructions relating to software engineering (SE) that I consider appropriate, are those that relate a large system in terms of its data, logic, and interfaces. This is not necessarily the Rational Unified Process with UML -- indeed, I tend to think people take that too far (eXtreme Programming seems to take a nice perspective on SE in this regard). People also like to relate Classes to real-world objects, usually real-world objects that relate to "parts" of the project. This is tempting but is, I feel, usually inappropriate! A good compromise is a balance between the format of the data (with appropriate, thin, "agnostic bridges"/Classes) and an easy access point for real logic (the Model, of the MVC pattern). I would also recommend a sort of laid-back attitude when developing software: don't live your life by a paradigm or methodology, especially in an immature field (SE) that has a lifetime of problems to solve. You know what problems need to be solved. You also know that not once did you wish you could draw a picture instead of write code. I mean, what the hell? Someone take Johnny Mnemonic away from Gates.

          If the software you write, however, is modular enough that you can arrange the pieces/modules/methods like components in a circuit, then go for it. However, this level of widespread code reuse is frankly fantasy; reuse will remain, I believe, as it has: generic libraries used in a custom fashion, i.e., not suitable to be "visually" "dropped-in." Code generation is nice, but it's only appropriate for certain large-scale applications (like large database-driven applications).

          If one is to believe Gates on this issue, one is also compelled to believe that Microsoft's research and development department has created software practices at the forefront of software engineering (and indeed computer science. Remember computer science?). I do not believe this to be the case, and I'd make the indictment that this "release" by Gates is purely world's-fair in nature, and is for the hoi polloi.
        • LabVIEW (Score:4, Informative)

          by ojQj ( 657924 ) on Tuesday March 30, 2004 @12:52PM (#8715185)
          I would add National Instruments' LabVIEW to your list of visual languages.

          If you are trying to do detailed logic rather than just bring already written libraries together, a visual language may not be worse than something like Java. It may also not be better. I do think it makes a nice programming model for bringing together existing modules of code though. (as in LabVIEW Express)

          Of course, as in any other kind of choice between programming languages, it all depends on the specific problem domain.

      • Precisely. And, for example: given how long friggin' HTML has been around, plus the simplicity of that markup "language", we still don't have perfect (or even good) WYSIWYG editors for it.

        How likely is it we'll get "visual editors" for complex systems (C/C++/et al., in combination with various other languages, frameworks, data formats/databases, etc)?

        • by The Desert Palooka ( 311888 ) on Tuesday March 30, 2004 @11:02AM (#8713824)
          But the visual aspects of pure "compatible" HTML (as in not CSS and divs, which many design shops still stay away from) are hacks. So you have these editors trying to visually do something that HTML was never intended to do. Dreamweaver, the best of these editors, was oft called "the moody woman" at one shop I worked at, as you had to know just how to coddle it or it wouldn't do what you wanted, or even what it was supposed to. Hand-writing the code was still superior for these hacks...

          Then CSS/layers became totally (mostly) supported. Now WYSIWYG editors work QUITE well... (Even some non-editors generate perfect code. Photoshop's ImageReady generates some very nice code.)

          Anyway, point being, when something is designed to be designed visually it can be visually designed much easier. *grin*
        • by zapp ( 201236 ) on Tuesday March 30, 2004 @11:17AM (#8714002)
          HTML is hard to make a visual designer for because it's so non-standardized, and very, very sloppy.

          Ever build an SQL query with Access? Pretty simple if you ask me. How about an Excel spreadsheet formula?

          Ever use a tool like Together or Rational Rose to build a UML class diagram and have it generate the skeletal source code (class definitions, method names, variable declarations, etc.)?

          Look up Jackson Structured Programming (JSP); it's not popular here in the US, but it's a way to visually design the flow of a method and have your editor spit out code in any one of many languages.
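          For illustration, the skeleton such a UML tool emits looks roughly like this. This is a hypothetical Python rendering (real tools of the era mostly targeted Java or C++), and the class and method names are made up:

```python
# Hypothetical skeleton a UML tool might generate from a class diagram:
# class names, attributes, and method stubs come from the diagram;
# the method bodies still have to be written by hand.

class Account:
    def __init__(self):
        self.balance = 0.0  # attribute declared in the diagram

    def deposit(self, amount):
        raise NotImplementedError  # stub -- logic is hand-written later

    def withdraw(self, amount):
        raise NotImplementedError  # stub
```

          The visual tool buys you the declarations and their consistency with the diagram; the actual behavior is still code.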

          Also, expecting to get such an editor for C/C++ is silly. Not only will the tools evolve, but also the languages.

          And on general principle, the doubters usually turn out to be wrong. We made it to the moon, we have a computer in every house, etc.
          • by micromoog ( 206608 ) on Tuesday March 30, 2004 @11:27AM (#8714109)
            And on general principle, the doubters usually turn out to be wrong. We made it to the moon, we have a computer in every house, etc.

            Oh, and the marketeers tend to be right? Sorry, but Bill Gates is not known for being a technology visionary.

          • by WNight ( 23683 ) on Tuesday March 30, 2004 @02:53PM (#8716967) Homepage
            Sure, you can visually link a few tools together, piping output from one into another, and you can click a few boxes and generate a basic SQL query. Wow.

            That's great for hello-world level tasks, like calculating the Fibonacci series, or defining a data model. Sure, you could essentially write a 'Notepad' equivalent with twenty clicks because it's mainly one big text-entry dialog with a file and edit menu, all of which use standard functions and know how to interact with the text dialog.

            Now write the grammar-checker. Or, write a program that generates a 3d-model from a list of surface descriptions in XML format. Write a 'bot' that navigates through the 3d-world described while considering tactical and strategic concerns.

            At some point all of the trivial clickable stuff is done and you need to do the heavy lifting - things for which no standard dialogs are written. And you always reach this point, if you try to go at all off the beaten path (you know, innovate). For the bot example you could 'click and drag' some inputs to customize an already-written bot AI if it was exposed as an API, but you couldn't make it do anything truly new.

            And the fallacy in assuming we (the doubters) will be proven wrong is that there's a difference between doubting we'll ever reach the moon and doubting that we'll reach it with method X. I don't doubt that programming simple things will become easier; I already see this, in fact. I merely doubt that it'll happen in a drag-and-drop interface and that this data modelling will ever be on the cutting edge.

            It'll come along and handle all the trivial stuff, like letting users script application usage, or define 'macros' in programs like Photoshop where you drag the output of a filter onto another filter, into a loop of filter and sharpen till a certain point, to a resize function, etc.

            We'll get to the moon, but your hot-air balloons won't be how - not that we won't have hot-air balloons, but it's painfully obvious to someone in the aerospace field that hot-air balloons are of limited use in travel between planetary bodies (though inflatable balloons did function well as a landing mechanism), much like clickable interfaces might be used as part of many systems, but not as the core.
      • Re:Visual design (Score:5, Insightful)

        by pavon ( 30274 ) on Tuesday March 30, 2004 @12:16PM (#8714704)
        I have programmed medium to large sized systems using LabVIEW, a graphical programming language designed for non-CS engineer types to implement test and measurement systems (think automating rack-mounted supplies and meters for QA, etc.). So I thought I'd share my experience with y'all. For perspective, most of the work I had done before was in C or C++, with various toolkits.

        The basic unit of code in LabVIEW is called a VI (Virtual Instrument - think function). When creating a VI you have two parts - the Front Panel (interface) and the Block Diagram (implementation). On the front panel you create a bunch of widgets which serve as the input and output of the VI. Each control has its own data type: for example, numeric controls and sliders are int or float; buttons, switches and LEDs are binary; text fields are string; pulldowns are enum; etc. You have array controls and cluster (think struct) controls which can contain other controls. You also have a few high-level controls like a graph for the waveform type, some abstract types for standard error handling, and references for open instrument objects, ActiveX objects, etc. You should also draw an icon for the VI, which will be its representation when called from other VIs. So basically every function you write automatically has a user interface, which doubles as its signature declaration. This comes in handy when doing black-box testing.

        Now in the Block Diagram these controls show up as input and output terminals, which you wire to other things. For example, you can call other VIs by wiring data to the inputs on the left of the VI icon and the outputs on the right-hand side. The types on both ends of the wires must match, and the wires are drawn with different colors to indicate their type (derived from whatever their input is - you don't have to explicitly specify wire type). There are no variables (well, there are globals, but you don't use them much); data just flows from the input terminals to the output terminals, with the runtime system executing whatever happens to be in the way and taking care of memory management.

        You have all the standard flow control constructs. A switch statement is a box with a special terminal that you wire for the conditional, and then a pulldown box at the top that lets you enumerate and switch between all the different cases. You can wire just about any type into the conditional terminal. The simplest example would have a boolean input wire and only one case - true - i.e., an if statement. You have foreach loops which iterate through all the elements of an array you wire in, and while loops (technically a do-while) which are another box with an internal terminal for the conditional. And so on.

        One of the interesting things about this language is that because execution order is determined by data flow, not program text, it is inherently parallel. If you draw two loops on the same diagram, and one isn't dependent on the other for data, then they will operate concurrently.
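        As a rough analogy in textual code (a sketch, not LabVIEW itself): dataflow execution means any two computations with no data dependency between them are free to run in parallel, and a downstream node fires only once all of its inputs have arrived. The two "loops" below are stand-ins for independent diagram nodes:

```python
# Sketch of dataflow-style scheduling: loop_a and loop_b share no data,
# so the runtime may execute them concurrently. The final sum acts as a
# downstream "node" that fires only once both inputs have arrived.
from concurrent.futures import ThreadPoolExecutor

def loop_a():
    return sum(range(1000))   # one independent loop

def loop_b():
    return max(range(1000))   # another loop, no dependency on loop_a

with ThreadPoolExecutor() as pool:
    fa = pool.submit(loop_a)  # both scheduled at once
    fb = pool.submit(loop_b)
    total = fa.result() + fb.result()  # waits for both inputs
```

        In LabVIEW this concurrency falls out of the diagram for free; in textual languages you have to ask for it explicitly, as above.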

        Okay, enough explaining the interesting parts of the language; on to the trashing. Do not believe what NI (the makers of LabVIEW) tell you about increased productivity. It is true that you save some time due to the fact that this is a high-level language, and it comes with a nice set of libraries. However, this is offset by the fact that it takes so much longer to draw code than it does to type it. A picture may be worth a thousand words, but an icon is worth exactly one. It only takes slightly longer to wire up a function or draw a loop than it does to type it. But where the real killer comes in is that you now have the added complexity of having to think about how to lay out all these elements, and predict how much space you will need for them. If you predict wrong you will be constantly resizing boxes and rerouting wires. As you can imagine, refactoring is a huge pain, so you better have a perfect design when you start, and we all know that we never have bugs in the design, right? And we never want to modify our program to do things in the
    • by spektr ( 466069 ) on Tuesday March 30, 2004 @10:03AM (#8713195)
      but who will visually debug the visual designer?

      First person shooter. Kill the bugs, capture the features...
      • by SlashDread ( 38969 ) on Tuesday March 30, 2004 @10:11AM (#8713284)
        O'Reilly, the werewolf book: "Managing systems with the DOOM shell", subtitle: "How to kill -9 a zombie with your BFG."

      • Re:Visual design (Score:5, Interesting)

        by grub ( 11606 ) <slashdot@grub.net> on Tuesday March 30, 2004 @10:13AM (#8713314) Homepage Journal

        First person shooter.

        This reminds me of a cool hack that uses Doom as a "process manager" [unm.edu]. Killing a Doom baddie basically "kill -9"s the process.
      • by Dark Lord Seth ( 584963 ) on Tuesday March 30, 2004 @10:45AM (#8713643) Journal
        Console Log, starting at 11:23, 30-04-2004
        Maps: DM-Slashdot_ext, DM-AOL_HQ, DM-Whitehouse ( cycling )
        Mods: ( none )
        Game type: Team Deathmatch

        Loading graphics... Done.
        Loading config files... Done.
        Loading map ( DM-Slashdot_ext ) ... Done.

        Loading players:

        NOTICE: Player "SlashDread" entered the level for team RED
        NOTICE: Player "Spektr" entered the level for team RED
        NOTICE: Player "Dark_Lord_Seth" entered the level for team RED

        Loading bots:

        NOTICE: Player "TYPE_MISMATCH_233" entered the level for team BLUE
        NOTICE: Player "BUFFER_OVERFLOW_12" entered the level for team BLUE
        NOTICE: Player "BUFFER_OVERFLOW_13" entered the level for team BLUE
        NOTICE: Player "TYPE_MISMATCH_234" entered the level for team BLUE
        NOTICE: Player "ARRAY_OUT_OF_BOUNDS_298" entered the level for team BLUE
        NOTICE: Player "UNDECLARED_POINTER_34" entered the level for team BLUE
        NOTICE: Player "UNDEFINED_MACRO_65" entered the level for team BLUE
        NOTICE: Player "ENDLESS_LOOP_43" entered the level for team BLUE

        INFO: Game starts!

        Say :: Global ( Spektr ) "TEAMS!!!"
        Say :: Global ( SlashDread ) "AAARGH!!!"
        Say :: Global ( Dark_Lord_Seth ) "FF = on!!!"
        Print :: Global "Spektr riddled BUFFER_OVERFLOW_13 full of holes with his gatling cannon!"
        Say :: Global ( Spektr ) "Ownage!"
        Print :: Global "SlashDread firmly planted a 40mm anti-tank round in UNDECLARED_POINTER_34's gut!"
        Print :: Global "Spektr introduced ENDLESS_LOOP_43 to a shrapnel grenade!"
        Print :: Global "TYPE_MISMATCH_234 had a close encounter with a tungsten slug from Dark_Lord_Seth's railgun!"
        Print :: Global "SlashDread had a close encounter with a tungsten slug from Dark_Lord_Seth's railgun!"
        Say :: Global ( SlashDread ) "TEAMKILLER!!!"
        Say :: Global ( Dark_Lord_Seth ) "Sorry!"
        Print :: Global "ARRAY_OUT_OF_BOUNDS_298 slaughtered Dark_Lord_Seth with the TacNuke!"
        Print :: Global "ARRAY_OUT_OF_BOUNDS_298 slaughtered Spektr with the TacNuke!"

        INFO: Game ends!

        INFO: Team BLUE wins the match!
    • Alert: new jargon entry --> "visually design == code"

      Thank you netizens, you may return to your regular visual designing jobs...
    • by UserGoogol ( 623581 ) on Tuesday March 30, 2004 @10:14AM (#8713327)
      It's turtles all the way down, my dear.
    • Re:Visual design (Score:5, Informative)

      by 1781 ( 728831 ) on Tuesday March 30, 2004 @10:19AM (#8713378)
      Ah, yes. Jest about it, but the UML people have been working on visual programming for years. Perhaps there is a mutual interest... UMSL?
    • but who will visually debug the visual designer?

      It's like saying "all software will be written in high-level, garbage-collected languages like Java, C#, python, perl, et al".

      Rebuttals that "yeah, but what is the Java runtime written in?" or "the OS kernel has to be written in C" are true, but miss the point - these activities are niches, so the original statement is over-general but mostly true. Most application software will be written at a higher level.

      ... but visual design? You can only do so much by

  • Yeah, right (Score:5, Insightful)

    by michaelwb ( 612222 ) on Tuesday March 30, 2004 @10:01AM (#8713174)
    Is this kind of like in the 50s when some expert said that nuclear power was going to make electricity free?
  • by quinkin ( 601839 ) on Tuesday March 30, 2004 @10:01AM (#8713175)
    Somehow I don't think I will be taking Bill's word for it...


  • Free (Score:5, Insightful)

    by n9uxu8 ( 729360 ) on Tuesday March 30, 2004 @10:02AM (#8713182) Homepage
    Heck... if I made Bill's salary, I'd already think of hardware as free. In any case, if I were running a company and had global influence, what better model could there be than to dictate that the hardware required to run my product should be (virtually) free, but that my product is too valuable to be expected to be given away? Dave
    • Re:Free (Score:5, Insightful)

      by jtwJGuevara ( 749094 ) on Tuesday March 30, 2004 @10:07AM (#8713230)
      This might be what he is getting at here. I'm still a youngster and didn't get into computing until the mid-90's, but from what I know the idea used to be the opposite - that software came free (little f) or at very little cost to benefit the very highly priced hardware components that were needed. Apparently Bill is going the reciprocal route and wants the hardware to come free or at a very inexpensive cost to support his high-priced software. This would only make sense in his vision, since such a scenario would result in better bottom-line numbers for Microsoft, and the evil organization potentially has enough power over the long term to do such a thing.
      • Re:Free (Score:5, Insightful)

        by Golias ( 176380 ) on Tuesday March 30, 2004 @10:34AM (#8713550)
        The problem is, why would I put a $150 OS and $600 Office suite on my free computer, when Linux + Open Office would allow me a totally free system?

        $750 of Microsoft software for a $2500 computer didn't seem like all that much to most people back in the 1990s, but the times, they are a-changin'.

      • Re:Free (Score:5, Interesting)

        by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Tuesday March 30, 2004 @11:09AM (#8713910) Homepage Journal
        Okay. Let's put aside the silly "Microsoft is Evil" stuff for a minute, and look at where the industry in general has gone over the past 15 years.

        The price of the average "IBM" PC sold has dropped to roughly a quarter of what it was when I first bought one in 1989. At the same time, processor speed on these average machines has increased by 50,000%. If this trend continues, and I see no reason for it not to, the average computer in 15 years will have a 10 THz processor and cost $125.
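        The price side of that extrapolation can be sketched out. Only the roughly-4x-per-15-years price drop comes from the post; the 1989 starting price below is a made-up round number for illustration:

```python
# Extrapolating the post's 15-year price factor. The starting price is
# hypothetical; the factor-of-4 drop per 15 years is the post's claim.
price_1989 = 2000            # assumed average PC price in 1989 (USD)
price_factor = 4             # price drops to ~1/4 every 15 years

price_2004 = price_1989 / price_factor   # ~$500
price_2019 = price_2004 / price_factor   # ~$125, the post's prediction
```

        Under those assumptions the second 15-year step lands exactly on the $125 figure the post predicts.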

        Now, while the cost of hardware continues to go down, the cost of software continues to go up. The number of people needed to build the massive applications that make use of 10 THz will be huge. Somebody's got to pay the damn programmers, right? So the price of software will continue to go up. Even if OSS succeeds and the operating system and incidental programs are free, the CUSTOM programs will be expensive.

        Therefore, it makes sense to give the hardware as an added bonus with the software. The same way you have cell phones given away with calling plans today. This isn't a Microsoft thing...this could easily be an IBM thing or an Adobe thing, etc.
  • Please Bill.. (Score:5, Insightful)

    by grub ( 11606 ) <slashdot@grub.net> on Tuesday March 30, 2004 @10:03AM (#8713188) Homepage Journal

    Nice try, Bill.

    He's saying the tangible parts of the system (the hardware) will be virtually free while the freely duplicated software will not be. Fabrication plants cost millions, each chip has a real cost, each resistor has a real cost. Software, once written, can be copied countless times..

    You'd think Bill had a vested interest in all this..
    • Re:Please Bill.. (Score:5, Insightful)

      by Alomex ( 148003 ) on Tuesday March 30, 2004 @10:08AM (#8713242) Homepage

      Yet, hardware has gone down in price from where it was in the mid 80's while software has gone up.

      • by ForestGrump ( 644805 ) on Tuesday March 30, 2004 @10:14AM (#8713319) Homepage Journal
        So what Billy boy is saying is that the price of his software will be so high, that hardware will appear to be free. Damn you DRM....

      • Re:Please Bill.. (Score:5, Insightful)

        by Ubergrendle ( 531719 ) on Tuesday March 30, 2004 @10:27AM (#8713465) Journal
        Hardware is measurable in a physical sense: so many transistors per mm, how many units a factory can produce in 'n' period of time, benchmarks against some algorithms...

        But for software, it is much more difficult to measure improvement in a quantitative sense. It can be done, but not easily if the vendor wants to muddy the waters. I believe feature creep & bloat in Windows is to prevent direct comparisons with previous iterations of the product.

        Although hardware costs have come down, it's the result of competition in the free market, easily understood and measured as a physical good. Microsoft OSes? They've become more expensive, with less value added in each iteration.
    • Re:Please Bill.. (Score:4, Interesting)

      by RoLi ( 141856 ) on Tuesday March 30, 2004 @10:17AM (#8713346)
      He's saying the tangible parts of the system (the hardware) will be virtually free while the freely duplicated software will not be. Fabrication plants cost millions, each chip has a real cost, each resistor has a real cost. Software, once written, can be copied countless times..

      Yeah, I also thought this.

      But before the Linux era, Billy was actually correct: in DOS times, computers cost about $5000, while DOS itself was less than $100 (full version) IIRC. Today computers typically cost less than $1000, but Windows XP costs $200 (full version, crippled) or $300 (full version, uncrippled).
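      Put as a quick ratio (using the post's rough, from-memory list prices):

```python
# Share of total system price going to the OS, per the post's figures.
dos_ratio = 100 / 5000    # DOS era: ~$100 OS on a ~$5000 PC -> 2%
xp_ratio = 200 / 1000     # XP era: ~$200 OS on a ~$1000 PC -> 20%
```

      By these numbers the OS's share of the system price grew roughly tenfold between the two eras.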

      On Windows servers the ratio of the total system price going to Microsoft is even higher.

      Also, Microsoft is doing much more against piracy these days (WPA, BSA audits, etc.) than 20 years ago, which de facto translates into yet another price increase.

      Even though Bill Gates seems to have the delusion that this can go on like nothing happened, he is wrong: On servers, Microsoft already feels the heat from Linux and the desktop domination already shows some cracks.

    • Re:Please Bill.. (Score:3, Insightful)

      by Anonymous Coward
      The problem every ignorant OSS-only kid and zealot tends to forget: Software Has A Design Cost.

      Yes, once it is developed, it can be copied 'freely', in quotes because I do not want to put out the same connotation as others may want to read into this.

      To get to that point, you have thousands of man-hours put into the development. After you get to that point, thousands more man-hours are put into maintaining it. Where does the money come from? Are we just going to change our mottos to Off The People For The P
  • by S3D ( 745318 ) on Tuesday March 30, 2004 @10:03AM (#8713190)
    I have a suspicion that some of Microsoft's software is already visually designed rather than written, considering its quality.
  • Hrmmmm.... (Score:5, Insightful)

    by BWJones ( 18351 ) * on Tuesday March 30, 2004 @10:03AM (#8713191) Homepage Journal
    Dell (and other box manufacturers) cannot be happy about such a statement. After all, their entire business model is dependent upon making a profit assembling wrappers for different flavors of Windows. So, even though they tried with Linux to diversify somewhat and protect themselves some time ago (only to be spanked back by Microsoft), their fortunes are irrevocably tied to the success (or failure) of Microsoft.

    I suppose that this could be construed as the ultimate embrace and extend (then smother) approach though, right? Get a huge number of companies to support your position and build your company and then overnight, take all of their business revenues over in one way or another.

    As for Gates' predilection for predictions... I would like to see fewer grandiose predictions (although speech recognition and tablets and visual programming are decidedly not grandiose, and are in fact products shipping and under development by a number of companies) and more fundamental focus on making Microsoft products suck less.

    • Re:Hrmmmm.... (Score:4, Informative)

      by gkuz ( 706134 ) on Tuesday March 30, 2004 @10:36AM (#8713566)
      At current list prices, the software is already more expensive than the hardware in the server space. Microsoft Windows 2003 Enterprise Server lists for $4k with 25 Client Access Licenses (CALs). Each additional 20 CALs costs $799. So an approximately 100-user server will run you over $7k (at list) for MS software licensing. Dell or HP will sell you quite a nice server for less than $7k.
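      The licensing arithmetic, spelled out (prices as quoted in the comment; rounding up to whole 20-CAL packs is an assumption about how the licenses are sold):

```python
import math

base = 4000          # Windows 2003 Enterprise Server with 25 CALs (list)
included_cals = 25
pack_price = 799     # each additional pack of 20 CALs
pack_size = 20
users = 100

# 75 more CALs needed, sold in packs of 20 -> 4 packs
extra_packs = math.ceil((users - included_cals) / pack_size)
total = base + extra_packs * pack_price  # $7196, i.e. "over $7k"
```

      Which indeed comes out just over the $7k quoted for the software alone.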

      Spare me the obligatory replies about how much cheaper you can do all this with white-box hardware and Linux -- I'm not talking about that, I'm trying to add context to BillG's pronunciamento.

  • by neoform ( 551705 ) <djneoform@gmail.com> on Tuesday March 30, 2004 @10:03AM (#8713194) Homepage
    Of course, Microsoft isn't in the hardware market, so they can say whatever they want.
    • Actually ... (Score:3, Interesting)

      by willtsmith ( 466546 )

      Microsoft DOES sell the X-Box AND Human Interface Devices. They're certainly not giving THOSE away. Though if Microsoft could get enough royalties off games, I could see them giving the X-Box away.

      In the future, my desktop will cost $20 and my Intellimouse will cost $200. Go figure ;-)

  • by de_boer_man ( 459797 ) on Tuesday March 30, 2004 @10:03AM (#8713196)
    Hmmm... This sounds vaguely familiar:

    "Remember, quality is our top priority." [dilbert.com]
  • I hope not (Score:3, Interesting)

    by krumms ( 613921 ) on Tuesday March 30, 2004 @10:04AM (#8713199) Journal
    I really hope he's wrong. If software development becomes too much more "point-and-click", I'll have devoted my life so far to obsolescence.
    • Re:I hope not (Score:5, Insightful)

      by Eagle5596 ( 575899 ) <`gro.6955' `ta' `resUhsals'> on Tuesday March 30, 2004 @10:17AM (#8713351)
      This is why you need to study computer science, rather than "programming". Programming is a skill that can be useful, but is, by its very nature, transient. Remember, at one point in time, auto mechanics were considered a very skilled white collar position.

      Computer Science on the other hand, is a mathematical discipline which involves working out how to do things better, faster, and with less energy. It's about algorithm design, and ways in which to make a computer most efficiently process mathematical representations. It'll be useful far beyond the use of general "coding".

      Coding itself is becoming more and more prevalent. I have many friends who aren't even scientists who know how to code, and were even required to for their humanities classes (from English, to History, to Foreign Language). This is a good thing, IMHO. Coding is a great general purpose skill.

      Don't devote your life to the practice of programming, devote it to understanding why certain things work better, and how to further refine our techniques of computation. Work on understanding the hardware/software interface, and you open up all kinds of new fields, from embedded engineering, to robotics.

      Take the hint from the majority of good Universities who teach computer science, where you are simply expected to pick up a language in your spare time, because that aspect is secondary to the theory, and the easier of the two.
      • Re:I hope not (Score:4, Interesting)

        by groomed ( 202061 ) on Tuesday March 30, 2004 @10:55AM (#8713754)
        Remember, at one point in time, auto mechanics were considered a very skilled white collar position.

        I don't think this was ever true.

        Computer Science on the other hand, is a mathematical discipline which involves working out how to do things better, faster, and with less energy. It's about algorithm design, and ways in which to make a computer most efficiently process mathematical representations.

        Certainly, certainly, but how is this different from programming? Programmers work out how to do things better, faster, and with less energy. Programmers design algorithms. Programmers design ways to make a computer most efficiently process mathematical representations. And not just mathematical representations, either. All kinds of representations, in fact.

        I won't dispute your central point. I think it's vital to make a distinction between hard programming and soft programming. But the gap between the theory and practice just isn't as clear cut with computer science as with other disciplines. There is a big difference between designing an engine and building one. The difference is much less pronounced in software, because at some point the design or description becomes a program in its own right.
  • by CharAznable ( 702598 ) on Tuesday March 30, 2004 @10:07AM (#8713225)
    Even if software development becomes putting lego blocks together, it's not going to make specifying algorithms, keeping track of data structures and debugging any easier.
    Billy should know better.
  • Enough (Score:5, Funny)

    by n9uxu8 ( 729360 ) on Tuesday March 30, 2004 @10:07AM (#8713226) Homepage
    "Many of the holy grails of computing that have been worked on over the last 30 years will be solved within this 10-year period, with speech being in every device and having a device that's like a tablet that you just carry around,"

    For the last time, Bill...I still don't want a tablet pc!!!!!

    • Re:Enough (Score:4, Funny)

      by Anonymous Coward on Tuesday March 30, 2004 @10:08AM (#8713244)
      I'll have your tablet then. I love it.

      tab tab tab tab,
      tab tab tab tab
      tablet pc! tablet pc!
  • Ugh? (Score:3, Interesting)

    by Dwonis ( 52652 ) * on Tuesday March 30, 2004 @10:07AM (#8713227)
    Ugh? Why ugh? I can see why visual programming might not be all that practical, but if someone did manage to develop a visual programming system, why would it be so bad?

    It's no different than using scripting languages, really; it'll have its own set of trade-offs.

  • by dupper ( 470576 ) * <adamlouis@gmail.com> on Tuesday March 30, 2004 @10:07AM (#8713229) Journal
    Where your Pong program runs through 14 levels of OLE and runs at 3 FPS.
  • I can imagine how (Score:5, Insightful)

    by Sumocide ( 114549 ) on Tuesday March 30, 2004 @10:08AM (#8713239)
    With the help of Trusted Computing/Palladium. Like portable phones today, which may have a SIM lock and can only be used with a certain provider.

    You'll get free TCPA enabled hardware but it'll only let you run software by a certain company, software you'll have to pay for.

  • by superdan2k ( 135614 ) on Tuesday March 30, 2004 @10:08AM (#8713241) Homepage Journal
    And Bill Gates frequently talks out of his ass. I seem to recall that the Web wasn't important (and then we got IE a year later), that MS Bob was going to make computers usable by everyone, and that no one would need more than 640K of RAM.
  • by Black Parrot ( 19622 ) on Tuesday March 30, 2004 @10:08AM (#8713246)

    > He further predicts -- ugh! -- that software will not be written but visually designed.

    "Let's start with a blue background that fills the whole screen..."

  • Of course (Score:5, Funny)

    by Wizard of OS ( 111213 ) on Tuesday March 30, 2004 @10:08AM (#8713247)
    Of course, if your stock is worth a few billion dollars, the cost of hardware is 'almost free' :)
  • by brejc8 ( 223089 ) * on Tuesday March 30, 2004 @10:08AM (#8713250) Homepage Journal
    Strange that the only area where we originally had visual design has now almost completely moved to writing. I am thinking of hardware design CAD, where the entire industry now uses VHDL/Verilog instead of schematics.

    The reason is that it's easier to CVS/grep/replace...

  • by willtsmith ( 466546 ) on Tuesday March 30, 2004 @10:08AM (#8713253) Journal

    Wow another great prediction from the anti-psychic Bill Gates.

    Sorry Bill, but software is far more replicable than hardware. It's the SOFTWARE that is becoming more free as we go along.

    As far as visual goes, I don't think that's correct. He's envisioning a workflow-type application for controlling logic. Diagramming most code is far more difficult than simply writing it. 4GL is a pipe dream.

    I DO believe that future programmers will be more like carpenters. High levels of modularity will make custom software construction as practical as cutting and nailing/gluing/screwing together the components down at Home Depot. Programs that ARE sold will be far more extensible (plugin enabled) with managed code.

    The future of software is changing. As usual, Gates doesn't have a clue. He was right about ONE thing 30 years ago. He swindled the owners of Q-DOS and IBM. He's been riding that ever since.

  • by akiaki007 ( 148804 ) <aa316&nyu,edu> on Tuesday March 30, 2004 @10:11AM (#8713281)
    Simply because sometimes you can't control what runs through your mind. Say one day you're bored and you start thinking about games, your ex (perhaps games with your ex), about the conversation you had last night with your friend, or about the stupid things you did when you got drunk last night, and the next thing you know, you've got yourself with your ex in some crazy sex position on the screen or perhaps a picture of you hanging onto the wall relieving yourself because you forgot to go at the bar before going home...and your boss walks by. "But I was just doing work....Please don't fire me!"

    Yeah, I'm all for visual designing :-D I come up with some great software. As always, the porn industry will be the first industry to embrace this new technology.
  • by gilesjuk ( 604902 ) <giles.jones@NoSpAM.zen.co.uk> on Tuesday March 30, 2004 @10:11AM (#8713285)
    Just as Gillette virtually gives away its razor handles and printers cost next to nothing, they're moving towards making PCs like games consoles.

    What is worrying is that you can only succeed if you make your product unable to be used for anything else. So for games consoles you have to make it near impossible for anyone else to write software (especially free software) for the device. For printers you need to make sure that nobody else can supply ink.

    There's no such thing as a free lunch, you pay one way or another. If the hardware is next to free then the software will be subsidising it. The problem is for this to work for Microsoft they need a PC platform that can't run Linux, so I can see that their inroads into the BIOS, DRM etc... (see XBox for the beginnings of an implementation) are quite worrying.

    Of course, there will always be an x86 platform that can run Linux; it is too popular in Japan, India and China.
    • >>What is worrying is you can only succeed if you make you product unable to be used for anything else.
      Exactly. But at the same time your product must remain interoperable with others on a higher, meta level: your game console should be able to connect to the Internet, your printer should be able to connect to various computers, etc.
      So the result of these two trends will be a world of highly specialized, nearly disposable devices you can plug in and out as needed.
      Tinkering will be reduced to meta-
  • by phillymjs ( 234426 ) <slashdot AT stango DOT org> on Tuesday March 30, 2004 @10:13AM (#8713313) Homepage Journal
    And what onerous restrictions will I have to agree to to receive and use said free hardware?

    How many laws will be purchased by the large companies so that Cuecat-esque hardware EULAs will actually have teeth and be enforceable?

  • Puhleeeez...... (Score:5, Insightful)

    by El Cubano ( 631386 ) on Tuesday March 30, 2004 @10:14AM (#8713320)

    Hardware costs will fall sharply within a decade to the point where widespread computing with speech and handwriting won't be limited by expensive technology, Microsoft Corp. (NasdaqNM:MSFT - news) Chairman Bill Gates (news - web sites) said on Monday.

    This looks like a quote from 10 years ago talking about today. In '93, an "entry level" PC cost upwards of $2000. Today, an entry-level machine that is far more capable costs only 10% of that. Not to mention that the $200 price tag represents a now minuscule fraction of most people's income.

    I would say that hardware is already "free" when compared to software. This is because you can buy a $200 machine (real tangible manufacturing cost per unit) and put a $200 copy of Windows (with no real production cost) on it. I am sure that hardware prices can go lower, but hardware is already a commodity. Software has yet to become a true commodity.

  • by JetScootr ( 319545 ) on Tuesday March 30, 2004 @10:15AM (#8713332) Journal
    Several years ago (more than I care to admit), where I work, the mainframe manufacturer offered free hardware if we would continue to pay the software licenses. Free hardware meant an entirely new mainframe, ten years younger than what we already were running on.
    Now we're running on Unix, and saving money. Bill's just blowing smoke, telling us his dreams.
  • by psycho_tinman ( 313601 ) on Tuesday March 30, 2004 @10:17AM (#8713356) Journal

    For Microsoft and for a lot of other companies, I think the realization has dawned that concentrating on hardware is a losing proposition .. (Hello, Sun ? are you listening ? Maybe you know better than these guys). As a counterpoint, though, I'd like to offer Apple and their iPod/iTunes strategy. Offer software on the cheap to push out the hardware..

    You may upgrade your machine once every 6 months to a year.. However, your software would be service oriented, so you'd be bled dry as updates/small missing features and patches were charged for. A constant stream of revenue, with margins that can't be squeezed out due to competing manufacturers and improving manufacturing processes. A steadier way of earning revenue, if you will. This is what I would imagine Microsoft wants.

    Here's the problem, though. The free software genie has been let out of the bottle. Just like the lowered price on the XBox made several people (myself included) think about buying one for a low cost machine and installing Linux on it, if there is a free software alternative that will run on this free hardware, you will get people using it. Ultimately, this will just lead to stronger protection against "illegal" modifications to the software.. For example, if you get a PC free, you must run Windows on it, and never format it to install Linux.. something along those lines. He wants it. I personally do not. Cheaper hardware is good, but I want choice in what software I use and I don't think being locked into one company will offer me this.

    I agree with his point about visual software though. VB was tremendously popular for that reason. Because it let people quickly design interfaces and software that sort of worked. For folks who don't do programming for a living (and maybe a few who do), the thought of whipping out something that they can actually use on their own computer is a tremendously appealing notion. More than anything else, Visual Basic helped a whole new bunch of people (who might otherwise have not programmed at all) get into the software industry. The problem is: who will write the server side software ? Who will perform the tweaks ? Who will administer and optimize and tune things ? The need for programmers and for code crunching won't go away overnight, and I doubt it will go away at all. There are advantages to textual representations (as opposed to visual ones) in existing tool support, and there are also advantages in that textual means of representing a problem work on many different paradigms (not just client interfaces).

  • MS Labview? (Score:3, Interesting)

    by tony_gardner ( 533494 ) on Tuesday March 30, 2004 @10:21AM (#8713394) Homepage
    Maybe he's talking about something like LabVIEW, where new programs are made mostly by linking together little boxes on the screen. Each box contains components which are either prespecified or can be filled in.

    I've used LabVIEW just for writing programs to link to IEEE hardware, and it certainly is much easier to
    deal with a large number of modules when they're visually represented, and very fast to kludge together a quick fix.

    The only thing is that the debugging and maintenance is a nightmare because, unlike normal C/Fortran code, not all of the program is visible at once (it's in a thousand tiny blocks), so looking at several related bits of code is very time consuming. So much so that we recently rewrote some LabVIEW code in C, just to improve the clarity and maintainability.
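
    The "little boxes linked together" model behind LabVIEW-style tools is essentially a dataflow graph; a toy sketch (purely illustrative, not LabVIEW's actual semantics):

```python
# Toy sketch of a dataflow graph: each "box" is a function wired to
# upstream boxes, and running a box pulls values through its inputs.
from typing import Callable

class Box:
    """A node: a function plus wires to the boxes feeding it."""
    def __init__(self, fn: Callable, *inputs: "Box"):
        self.fn, self.inputs = fn, inputs

    def run(self):
        # Evaluate upstream boxes first, then this one (no caching; toy only).
        return self.fn(*(b.run() for b in self.inputs))

const5 = Box(lambda: 5)
const3 = Box(lambda: 3)
summed = Box(lambda a, b: a + b, const5, const3)
doubled = Box(lambda x: 2 * x, summed)
print(doubled.run())  # 16
```

    The maintenance complaint above maps directly onto this structure: the whole program is the wiring between many tiny boxes, so no single box shows you much of the logic at once.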
  • by javatips ( 66293 ) on Tuesday March 30, 2004 @10:23AM (#8713417) Homepage
    I'm working on my second project using the MDA (Model Driven Architecture) approach. With MDA, we are able to generate most, if not all, of the infrastructure code. The only thing developers need to do is write business code.

    Designers create the proper UML diagrams to represent the structure and some dynamic aspects of the application in a platform-independent model. Then we apply code generation templates to generate the code for a targeted architecture.

    If we go a little further with the code generation, we could actually implement most of the business logic structure based on sequence diagrams.

    For the front end, while it would be hard to generate a really nice interface, we can generate what needs to be put on a screen. Then it's a matter of applying a CSS or using a visual editor to reposition the components on the screen.

    I can see that in 10 years, most business code will be written that way... Note that one of the premises of this happening is that proper analysis and design must be done. For that we must change the mindset of a lot of people.

    As for people fearing for their current developer status... these people will have to grow up and start doing real development instead of the "use the force" approach. And for really good developers/architects, there will always be a need for someone to define an architecture and create/maintain the templates required to translate the visual design into real code. And there will also be a need for good developers to write the code that implements the complex algorithms required by some applications.

    Anyway, writing most business-related code is boring and repetitive, so why not generate it!
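
    The template step in that workflow can be sketched generically (the model dict, field names and template below are invented for illustration, not any real MDA tool's format):

```python
# Minimal sketch of template-driven code generation from a model,
# in the spirit of the MDA workflow described above. The model and
# template formats here are invented examples.
from string import Template

# A platform-independent "model": one class with typed attributes.
model = {"name": "Customer", "fields": [("id", "int"), ("email", "str")]}

CLASS_TMPL = Template(
    "class $name:\n"
    "    def __init__(self, $args):\n"
    "$assigns"
)

def generate(model: dict) -> str:
    """Render the infrastructure code for one model class."""
    args = ", ".join(f"{f}: {t}" for f, t in model["fields"])
    assigns = "".join(f"        self.{f} = {f}\n" for f, _ in model["fields"])
    return CLASS_TMPL.substitute(name=model["name"], args=args, assigns=assigns)

print(generate(model))
```

    The "targeted architecture" is then just a different set of templates applied to the same model.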
  • by silverbax ( 452214 ) on Tuesday March 30, 2004 @10:24AM (#8713434)
    Step 1: Predict everything that can possibly happen. Nothing is too wild. Some examples:

    * In the future, every home will have a robot that carries a machine gun.
    * Cars will not only drive themselves, they'll demand equal rights.
    * Computers will be made only of light and sound.
    * Computers will learn to upgrade themselves - not because of initial programming, but as a survival mechanism to prevent obsolescence.
    * IT outsourcing will be controlled in some sectors by organized crime and gangs. This will start in Las Vegas and move outward.
    * Email will be beamed directly into your brain. You will be able to type an answer in your head.

    Step 2: Wait for at least one prediction to come true ( even slightly true ) and be declared a prophet.
  • by HiredMan ( 5546 ) on Tuesday March 30, 2004 @10:25AM (#8713443) Journal
    Hardware requires factories to make components, people to assemble them, trucks to ship them and people to sell them. It can never be free.

    The only way for the computer to be "free" is the way cell phones are now "free". When you sign up for 3 years of M$ OS/Office suite subscription, MSN broadband with obligatory Passport support for on-line shopping, and agree to transfer $500 to that account, you will get a "free" PC. Ignore the fact that you will be paying $100/month to M$ for that "free" PC.

    This fits in well with the M$ philosophy of business - they don't really care what "product" they sell as long as it comes with a M$ EULA and license. Check my journal for a more detailed look at the M$ business plan.

  • by Jerf ( 17166 ) on Tuesday March 30, 2004 @10:29AM (#8713485) Journal
    Visual programming is one of the canonical examples of "Gee, I have no clue how it works but wouldn't it be really cool if...". Nobody has a clue how to do significant programming in it; it's never even had a decent prototype, let alone any reason to think it will work in general.

    Sure, there are isolated instances of it being useful, mostly in drawing flow diagrams for signal processing, but that's far from the general case.

    Other than that, though, it's been a miserable failure. Software doesn't look like anything in real life, and real-life metaphors are effectively useless for manipulating it. Ever tried to use a multi-level UML diagram, where each box contains boxes that contain boxes? That's what visual programming looks like: a confusing, ultra-hyper-dimensional object, where every detail is critical (even the ones you can't see), and where understanding a system requires hundreds of little abstract entities on the screen.

    Software has more "moving parts", by orders of magnitude, than any other human endeavor; the largest software projects dwarf the complexity (in part count) of even the Space Shuttle. (We get away with it because we use effectively 100% reliable parts, whereas the Space Shuttle does not; the problems that causes and the solutions they require mean the Space Shuttle is still IMHO a superior engineering work to an office suite. Nevertheless, don't make the mistake of underestimating the complexity of software; even the smallest program can dwarf a small car in complexity.)

    With a clearer understanding of what is being asked for, it is easy to see that visual programming has been a disaster for fundamental reasons, not ones that can be abstracted away. Imagine the Mozilla source code. It contains megabytes upon megabytes of code. Each and every line must be represented to understand the whole correctly (although no one person may need to understand the whole.) One way or another each line must be represented on the screen; if you're trying to do it "visually", then you're hosed. You can't abstract "(cutcrn*)DO_LOAD((void *)nm_mungl, andlefle->getLumpiness(MAX_LOAD_LUMP_COUNT, (int)uniQuad), USER_MACROS(LOAD));" visually, because you'll either lose critical information, or have an unusably cluttered screen.

    There's just no way around it.

    "But what if I design special modules that can be hooked together cleanly?" Then you'll have special modules that can be hooked together cleanly, as long as they do exactly what you need, which they won't. We also have tons of experience with such special modules, and they never work completely in general. You can build a DSP out of such things and that's about it... and even then, that's just compositing the existing DSPs together; I wouldn't want to build the insides in a visual language in the general case. (You could get some mileage out of it, but you'd still be shelling out to text code.)

    If you think I'm wrong, if you think you have some clever way to reduce the amount of necessary information on the screen without throwing away something the user needs, show me the code. To date, nobody has managed that, despite a lot of trying by smart and dedicated people, and given that we clearly don't need faster computers to do "visual programming", I think you ought to consider that a damned big clue before you punch the "Reply" button and make vague, hand-wavy gestures to the effect that I'm wrong.

    Consider the source: I think there's a reason you're hearing this from Bill Gates, who probably hasn't coded significantly in decades, and not the .Net team, who probably are also cringing and shaking their heads privately as well.
  • Gillette model? (Score:5, Insightful)

    by Creepy ( 93888 ) on Tuesday March 30, 2004 @10:37AM (#8713569) Journal
    Although he claims it will be falling prices, somehow I see the Gillette model creeping in (give away the razor, sell the blades at a premium) - mainly because the hardware will never be "free," as there is always manufacturing cost involved.

    Basically, what he's saying is that hardware prices will drop to the point where they can charge for software and give the hardware away for free. I find this quite ironic because it used to be the other way around - sell the hardware at a premium and toss the software in for free.

    If I had my way, hardware prices would drop to nothing as Bill proposed, and I'd create free software for it, making it a free-for-all... nah, it'll never happen, but wishful thinking :)
  • by johnlcallaway ( 165670 ) * on Tuesday March 30, 2004 @10:43AM (#8713615)
    • Chip makers will continue to create advancements and will want their R&D dollars back, just like Mr. Gates. This is why software is expensive; it is cheap to burn a CD but time consuming to develop.
    • Two words .. advertising costs.
    • Chip makers delay the release of new chip sets if they have significant inventory of other models. This keeps the prices of current chips artificially high until the manufacturers feel they can't milk any more out of consumers. Chip makers will be sure not to release new products until demand is there and they recover R&D costs for older chips.
    • CPU and memory chips account for less than half the cost of a PC; disk drives, monitors, DVD/CD drives, cases and motherboards make up the rest. These items have too many mechanical/structural parts to realize significant savings from improved chip manufacturing techniques. Even if the memory and CPU were free, systems will still cost a few hundred dollars.
    • Some people will always want/need advanced features, and computer systems and chip makers will always charge a premium for those items.
    • Chips contain software (on-board video, BIOS, etc.). I doubt if the makers of those software components will start giving it away. But, if open-source alternatives became available, those items would realize additional savings. I would not be surprised if more software wound its way into hardware as the cost of updating firmware becomes cheaper. Hardware video drivers can be a lot more effective than OS video drivers.
    Until chip manufacturers stop releasing new products every few months (reduces R&D), stop advertising, and create an entire system on a chip, including structural components, external interfaces (wireless??), storage, and displays, computer systems will never be 'almost free'.
  • by Junks Jerzey ( 54586 ) on Tuesday March 30, 2004 @11:08AM (#8713907)
    Look at it this way: the 48K Apple II was introduced in the US at $1795. Now, a typical bottom-end cell phone has much more computing power. You could put the entire Apple II on a $20 FPGA, or make it an ASIC and the price would be $1 or less in quantity.

    I bought a 333MHz Pentium II based PC in 1998. For software development and everything else I did, it was fine for the following five years. I finally upgraded to a 3GHz P4, just because it was cheaper than upgrading the OS and various parts individually. In my timings, this PC is roughly 15x faster than the old one, plus the video card is at least 10x faster than the one I bought in 2000. This is a lot of power, and it's the least I've ever spent on a PC.

    Or consider game consoles. A $150 game system is more powerful, in all ways except memory, than a computer from 5-6 years ago. Video-wise, they're much more powerful. Next generation consoles are going to outrun current desktops...for $200.

    The short version is that computers get more powerful, then they get cheap. At some point power ceases to matter, especially if you have a GPU or video compression chip to offload lots of work to. Imagine if a 2 or 3GHz chip could be made to run at 10 watts of power and cost $5. For a 65nm PowerPC, this is reasonable. What's needed is economy of scale. An alternate approach is that "low end" processors in cell phones and digital cameras get to where they're fast enough to usurp a desktop. Then put a video compression chip in there, or other custom hardware to do the bulk of the work. At $20 for a complete system, that's a big deal.

    Or even consider alternate, custom CPUs. An x86 desktop CPU is expensive because it includes all sorts of junk, like MMX support and 16-bit mode and legacy instructions and SSE2 and all this other marginalized stuff. And still they're too general purpose. C++ doesn't matter any more. Well, it matters because it's "fast," but not because people really like it. C++ doesn't make you happy the way Haskell or Python or Smalltalk do. Take a minimal instruction set designed to support one of those languages, then implement a simulator for it, then an FPGA, then an ASIC. Keep it simple, keep it fast. You could easily have a 20MHz part pacing high-end desktop processors for most tasks. Again, combine this with an ASIC for doing heavy lifting like graphics and compression.
  • I don't think so (Score:5, Insightful)

    by hak1du ( 761835 ) on Tuesday March 30, 2004 @11:09AM (#8713920) Journal
    Gates is wrong. Hardware prices don't depend so much on technology as on what people are willing to pay. A PC costs $1000 because that's what people are willing to pay for it, and they happen to get as much hardware and software for that as they can.

    I'm sure Gates would like the entire $1000 to go to Microsoft, but that's not going to happen. It's not going to happen because Microsoft isn't going to produce $900 worth of software that is capable of running on whatever $100 buys you in hardware. That's not a problem with hardware design, it's a problem with the kind of software that Microsoft develops: big and resource intensive.

    On the other hand, you will probably be able to get a really cheap computer that runs Linux and runs it well. We are already beginning to see this with Mini-ITX and Nano-ITX systems: they run Linux so much better than Windows. For $200, you get a full desktop system capable of pretty much everything that a home user needs.

    What really helps Linux is that it doesn't have to push an agenda or "innovate" constantly. If a 1995 word processor written in C runs fine on $1000 1995 hardware, it will run really well on a $100 2005 Mini-ITX system, with a few bug fixes and feature enhancements. Microsoft's new .NET-based office suite using COM, DCOM, SOAP, DHTML, and whatnot, on the other hand, won't. But Microsoft has to keep changing things in order to get people to buy and pay them more money.
  • by ch-chuck ( 9622 ) on Tuesday March 30, 2004 @11:27AM (#8714104) Homepage
    In hardware design, the trend in the past 20 years has been just the opposite, going from large blueprints of gates and circuits to a Hardware Description Language [nist.gov] (HDL, like Verilog or VHDL) which is very similar to a programming language like C or Pascal!

    Methinks the emperor has simply announced he wants a change of fashion, and all the trendy loyal subjects in the kingdom have to change their style to fit in.
  • by pandrijeczko ( 588093 ) on Tuesday March 30, 2004 @12:04PM (#8714567)
    I think Bill needs to go take a look around him a little, especially on eBay, where hardware costs "next to nothing" now!

    You can pick up a Pentium II PC for a few dollars/pounds/euros. Put in 256MB of memory and it'll run Windows 9x or 2000 with an office package perfectly happily... I've got several friends and relatives who have benefited from a lot of my old hardware, have PCs now with 300-500 MHz CPUs that they're perfectly happy with, and I've done my bit for the environment by recycling old hardware.

    I believe Mr Gates is under the illusion that because he locks his user base into his software now, that in 10 years time people will still be willing to part with hard earned cash for software which, let's face it, is hardly innovative anymore because all of the features anyone can think of implementing have just about been implemented.

    If anything software innovation is becoming stale (though who cares because "if it ain't broke, don't fix it") and it's in the realms of hardware, particularly miniaturisation that the innovation is taking place currently.

    I hate to dampen Bill Gates' fireworks but if Linux makes as much an advance over the next 10 years as it has done over the last 10 years, then I think he'll have a few other things on his mind in a decade than just pondering the price of hardware...

  • by davids-world.com ( 551216 ) on Tuesday March 30, 2004 @12:16PM (#8714720) Homepage
    Gates is talking the same kind of BS that we've been hearing from 'visionary' "scientists" for around three decades now, and exactly what makes life hard for me and colleagues who try to get computers to do something useful (or fun) with natural language.

    Gates and other marketing experts are managing expectations in the wrong direction. They promise something that they cannot realize. What common people understand when Gates talks about "real speech recognition" is a computer that will analyze your input in a noisy environment (where it matters most: out on the street!), contextualize it with what you've said before and with what's on the screen and with all the things that we call 'common sense', and then react accordingly.

    A lot of these things are possible in very limited, well-modelled domains. But not in applications for 'real users' that deal with a variety of information. And it won't be there in ten years. There are many hard problems to solve, both in defining what is actually linguistically the case or how to learn it from a corpus, and how to implement processes that happen in parallel in our brains on sequential machines.

    It doesn't help if Gates and co promise the world and hope that their scientists will deliver.
