Gates: Hardware, Not Software, Will Be Free 993
orthogonal writes "That's small-'f', not capital-'F' free:
according to Bill Gates, "Ten years out, in terms of actual hardware costs you can almost think of hardware as being free -- I'm not saying it will be absolutely free --...." Gates expects this almost free hardware to support two of the longest awaited breakthroughs in computing: real speech and handwriting recognition. He further predicts -- ugh! -- that software will not be written but visually designed."
Visual design (Score:5, Insightful)
but who will visually debug the visual designer?
"/Dread"
Re:Visual design (Score:5, Insightful)
Re:Visual design (Score:3, Funny)
Re:Visual design (Score:5, Insightful)
Re:Visual design (Score:5, Insightful)
Re:Visual design (Score:5, Insightful)
Starting around the time of Windows NT4, Windows got reasonably stable by itself. Windows 2000 took it a good deal further and there was less wrong with the OS than ever before (excluding security issues). Windows XP and 2003 Server have certainly raised the bar quite a bit. So why do we get all these stories about the OS "crashing" all the time? I'll tell you why:
1. Poorly written/Buggy application or server software (Office Suites, Web Servers, Mail Servers, etc...)
2. Misconfigured application or server software
3. Misconfigured OS settings by people who don't really know what they are doing despite their certs
4. Underpowered hardware (overclocked CPUs or just plain slow/older machines, not enough RAM, etc.)
5. Inappropriate hardware (using a Gateway-brand desktop PC as a Domain Controller, non-ECC RAM, etc.)
6. Malfunctioning hardware (bad RAM, motherboard, CPU, cooling problems, etc.)
In many instances, the people responsible for these machines either don't know HOW to fix the problem, or CAN'T (proprietary software) until their vendor puts out a fix. These people may not know how to figure out where the problem is. Is it hardware? Is it software? Where in the chain does it exist? If anything, most techs' troubleshooting skills are pretty poor. But the ever-present pressure from clueless suits to "make it work!" leads to workarounds or... the ubiquitous scheduled nightly reboot. This is NOT the fault of the OS.
Don't get me wrong, I'm not saying that Windows is a reliable OS compared to other OSes. I'm saying it's more likely that the applications and services people are running on their machines are the cause of crashes or of forcing the nightly reboot. Windows has plenty of issues at the server and the desktop that ARE Micro$oft's fault. But seriously, people... put the blame where it belongs the other 50% of the time.
Re:Visual design (Score:5, Insightful)
Re:Visual design (Score:5, Insightful)
Re:Visual design (Score:5, Insightful)
Depends on how it is done. There are some well-designed visual modeling & simulation development tools for electronics (Simulink, PSpice, etc.) and mechanical systems (finite element, etc.). These do a relatively good job of simulating "systems". Software processes are not that different from physical processes in electronics and mechanical systems. Software rules (e.g., syntax) are analogous to physical laws.
I actually think this is a good idea, if done properly (i.e., not by Microsoft). I'd be a little surprised if this hasn't already been done; I guess nobody has done it well yet.
Perhaps a good open source project. In fact, it could be a big stepping stone for open source. If visual programming (no, not as in Visual C/C++, Basic, etc.) makes programming easier and faster, think of how many more people (like me) could get involved in open source projects. I actually really like this idea.
Re:Visual design (Score:5, Interesting)
Just try to visualize (pun intended) a fairly simple event driven program with lines connecting all events, triggers, functions, data and UI components and you get the idea.
Why visual design will help, but is no panacea (Score:5, Insightful)
There are some caveats to this statement, however. First, software systems have discrete, digital states rather than analog behavior. That makes them quite susceptible to error behavior in boundary cases. And the state space for software is extremely large. Universal use of components developed in either an object-oriented or functional way could divide this state space up into manageable components. But one issue that is often overlooked by methodology enthusiasts is that this only increases the size of the building blocks and decreases the number of blocks used for a particular size of project. It does not eliminate the problem that bigger programs are made from a larger number of component parts. The complexity of a program grows as a function of the complexity of the underlying problem. You can change the function with different tools, but the relationship will still exist.
Re:Visual design (Score:5, Interesting)
Software is written because software is a set of instructions. Software is a set of scripts that respond to events. If software were spatial and totally right-brain (and analogous to engineering or construction), AI would work, and software would probably rely on the immutable laws of physics and chemistry, rather than homespun rules. When I write software, it is frequently because I am taking a "break" from other totally creative pursuits.
The only visual constructions relating to software engineering (SE) that I consider appropriate, are those that relate a large system in terms of its data, logic, and interfaces. This is not necessarily the Rational Unified Process with UML -- indeed, I tend to think people take that too far (eXtreme Programming seems to take a nice perspective on SE in this regard). People also like to relate Classes to real-world objects, usually real-world objects that relate to "parts" of the project. This is tempting but is, I feel, usually inappropriate! A good compromise is a balance between the format of the data (with appropriate, thin, "agnostic bridges"/Classes) and an easy access point for real logic (the Model, of the MVC pattern). I would also recommend a sort of laid-back attitude when developing software: don't live your life by a paradigm or methodology, especially in an immature field (SE) that has a lifetime of problems to solve. You know what problems need to be solved. You also know that not once did you wish you could draw a picture instead of write code. I mean, what the hell? Someone take Johnny Mnemonic away from Gates.
If the software you write, however, is modular enough that you can arrange the pieces/modules/methods like components in a circuit, then go for it. However, this level of widespread code reuse is frankly fantasy; reuse will remain, I believe, as it has: generic libraries used in a custom fashion, i.e., not suitable to be "visually" "dropped-in." Code generation is nice, but it's only appropriate for certain large-scale applications (like large database-driven applications).
If one is to believe Gates on this issue, one is also compelled to believe that Microsoft's research and development department has created software practices at the forefront of software engineering (and indeed computer science. Remember computer science?). I do not believe this to be the case, and I'd make the indictment that this "release" by Gates is purely world's-fair in nature, and is for the hoi polloi.
LabVIEW (Score:4, Informative)
If you are trying to do detailed logic rather than just bring already written libraries together, a visual language may not be worse than something like Java. It may also not be better. I do think it makes a nice programming model for bringing together existing modules of code though. (as in LabVIEW Express)
Of course, as in any other kind of choice between programming languages, it all depends on the specific problem domain.
Re:Sustainibility (Score:5, Funny)
Precisely - we can't even get WYSIWYG HTML right.. (Score:5, Insightful)
How likely is it we'll get "visual editors" for complex systems (C/C++/et al., in combination with various other languages, frameworks, data formats/databases, etc)?
Re:Precisely - we can't even get WYSIWYG HTML righ (Score:5, Interesting)
Then CSS/layers became totally (mostly) supported. Now WYSIWYG editors work QUITE well... (Even some non-editors generate perfect code. Photoshop's ImageReady generates some very nice code.)
Anyway, point being, when something is designed to be designed visually it can be visually designed much easier. *grin*
Re:Precisely - we can't even get WYSIWYG HTML righ (Score:4, Insightful)
Ever build an SQL query with Access? Pretty simple if you ask me. How about an Excel spreadsheet formula?
Ever use a tool like Together, Rational Rose, etc. to build a UML class diagram and have it generate the skeletal source code (class definitions, method names, variable declarations, etc.)?
Look up Jackson Structured Programming (JSP), it's not popular here in the US, but it's a way to visually design the flow of a method and have your editor spit out code in any one of many languages.
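To make the UML-to-code point concrete, here is roughly the kind of skeleton such a tool spits out, sketched here in Python purely for illustration (the class and method names are invented, not the output of any particular tool):

    class Account:
        """Skeleton generated from an 'Account' class in the diagram."""

        def __init__(self, owner: str, balance: float = 0.0):
            self.owner = owner        # attribute from the diagram
            self.balance = balance    # attribute from the diagram

        def deposit(self, amount: float) -> None:
            # Signature comes from the diagram; the body is left
            # for the programmer to fill in.
            raise NotImplementedError

        def withdraw(self, amount: float) -> None:
            raise NotImplementedError

The tool gets you the declarations and the stubs; the actual logic still has to be written by hand.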
Also, expecting to get such an editor for C/C++ is silly. Not only will the tools evolve, but also the languages.
And on general principle, the doubters usually turn out to be wrong. We made it to the moon, we have a computer in every house, etc.
Re:Precisely - we can't even get WYSIWYG HTML righ (Score:5, Insightful)
Oh, and the marketeers tend to be right? Sorry, but Bill Gates is not known for being a technology visionary.
Re:Precisely - we can't even get WYSIWYG HTML righ (Score:5, Insightful)
That's great for hello-world-level tasks, like calculating the Fibonacci series, or defining a data model. Sure, you could essentially write a 'Notepad' equivalent with twenty clicks, because it's mainly one big text-entry dialog with a file and edit menu, all of which use standard functions and know how to interact with the text dialog.
Now write the grammar-checker. Or, write a program that generates a 3d-model from a list of surface descriptions in XML format. Write a 'bot' that navigates through the 3d-world described while considering tactical and strategic concerns.
At some point all of the trivial clickable stuff is done and you need to do the heavy lifting - things for which no standard dialogs are written. And you always reach this point, if you try to go at all off the beaten path (you know, innovate). For the bot example you could 'click and drag' some inputs to customize an already-written bot AI if it was exposed as an API, but you couldn't make it do anything truly new.
And your fallacy in assuming we (the doubters) will be proven wrong is that there's a difference between doubting we'll ever reach the moon and doubting that we'll reach it with method X. I don't doubt that programming simple things will become easier; I already see this, in fact. I merely doubt that it'll happen in a drag-and-drop interface and that this data modelling will ever be on the cutting edge.
It'll come along and handle all the trivial stuff, like letting users script application usage, or define 'macros' in programs like Photoshop where you drag the output of a filter onto another filter, into a loop of filter and sharpen till a certain point, to a resize function, etc.
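For what it's worth, that kind of macro really is just a small program. Here's a toy sketch in Python of the 'filter and sharpen in a loop till a certain point, then resize' idea - the filter functions are made-up stand-ins, not any real imaging API:

    def blur(img):
        return [round(p * 0.9) for p in img]     # toy stand-in for a filter

    def sharpen(img):
        return [min(255, p + 5) for p in img]    # toy stand-in for sharpening

    def resize(img, factor):
        return img[::factor]                     # toy stand-in for a resize

    def macro(img, threshold=200):
        # Loop of filter-and-sharpen until the image is "dark enough",
        # then hand off to the resize step.
        while max(img) > threshold:
            img = sharpen(blur(img))
        return resize(img, 2)

    print(macro([250, 240, 230, 220]))

It's exactly the trivial, already-written-building-blocks case; the blocks themselves are where the real work lives.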
We'll get to the moon, but your hot-air balloons won't be how - not that we won't have hot-air balloons, but it's painfully obvious to someone in the aerospace field that hot-air balloons are of limited use in travel between planetary bodies (though inflatable balloons did function well as a landing mechanism), much like clickable interfaces might be used as part of many systems, but not as the core.
Re:Visual design (Score:5, Insightful)
The basic unit of code in LabView is called a VI (Virtual Instrument - think function). When creating a VI you have two parts - the Front Panel (interface) and the Block Diagram (implementation). On the front panel you create a bunch of widgets which serve as the input and output to the VI. Each control has its own data type: for example, numeric controls and sliders are int or float; buttons, switches and LEDs are binary; text fields are string; pulldowns are enum; etc. You have array controls and cluster (think struct) controls which can contain other controls. You also have a few high-level controls like a graph for the waveform type, some abstract types for standard error handling, and references for open instrument objects, ActiveX objects, etc. You should also draw an icon for the VI, which will be its representation when being called from other VIs. So basically every function you write automatically has a user interface, which doubles as its signature declaration. This comes in handy when doing black-box testing.
Now in the Block Diagram these controls show up as input and output terminals, which you wire to other things. For example, you can call other VIs by wiring data to the inputs on the left of the VI icon and the outputs on the right-hand side. The types on both ends of the wires must match, and the wires are drawn with different colors to indicate their type (derived from whatever their input is - you don't have to explicitly specify wire type). There are no variables (well, there are globals, but you don't use them much); data just flows from the input terminals to the output terminals, with the runtime system executing whatever happens to be in the way and taking care of memory management.
You have all the standard flow control constructs. A switch statement is a box with a special terminal that you wire for the conditional, and then a pulldown box at the top that lets you enumerate and switch between all the different cases. You can wire just about any type into the conditional terminal. The simplest example would have a boolean input wire and only one case - true - i.e., an if statement. You have foreach loops, which iterate through all the elements of an array you wire in, and while loops (technically do-while loops), which are another box with an internal terminal for the conditional. And so on.
One of the interesting things about this language is that because execution order is determined by data flow, not program text, it is inherently parallel. If you draw two loops on the same diagram, and one isn't dependent on the other for data, then they will operate concurrently.
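As a rough analogy in ordinary text code (and it is only an analogy - it says nothing about how LabView actually schedules things internally), two computations with no data dependency can be kicked off concurrently, and only the node that needs both results has to wait. A Python sketch:

    from concurrent.futures import ThreadPoolExecutor

    def loop_a():
        return sum(i * i for i in range(100_000))   # no dependency on loop_b

    def loop_b():
        return sum(i * 3 for i in range(100_000))   # no dependency on loop_a

    with ThreadPoolExecutor() as pool:
        a = pool.submit(loop_a)   # both "loops" start as soon as submitted
        b = pool.submit(loop_b)
        combined = a.result() + b.result()   # this node waits for both wires

    print(combined)

In LabView you get that for free just by drawing the two loops side by side; in a text language you have to ask for it explicitly.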
Okay, enough explaining the interesting parts of the language; on to the thrashing. Do not believe what NI (the makers of LabView) tell you about increased productivity. It is true that you save some time due to the fact that this is a high-level language, and it comes with a nice set of libraries. However, this is offset by the fact that it takes so much longer to draw code than it does to type it. A picture may be worth a thousand words, but an icon is worth exactly one. It only takes slightly longer to wire up a function or draw a loop than it does to type it. But where the real killer comes in is that you now have the added complexity of having to think about how to lay out all these elements, and to predict how much space you will need for them. If you predict wrong you will be constantly resizing boxes and rerouting wires. As you can imagine, refactoring is a huge pain, so you'd better have a perfect design when you start, and we all know that we never have bugs in the design, right? And we never want to modify our program to do things in the
Re:Visual design (Score:5, Insightful)
Yeah, but that happens to also be the step that introduces the details of the logic! These details can't be magically derived. They must be crafted by a programmer. If that involves drawing lots of highly detailed pictures within pictures at the "design" level, then fine, but it wouldn't make anything less complex or less bug-ridden. For the most part, the complexity of programming is inherent. Abstraction and the use of building-block libraries help tremendously, of course, but these techniques work just as well in the written-programming-language world as they do in the design-by-drawing-pictures world.
Re:Visual design (Score:4, Insightful)
In fact, it's likely to just make things more confusing. There's a reason that mathematicians don't do geometric proofs so much anymore - symbolic manipulation is more clear, more general, and more compact. It's the same reason that hardware designers use things like VHDL now.
Many people seem to think that a "graphical language" makes things easier for lay-people to understand. And that's true at the very highest levels of abstraction. But when you get down to the details a graphical language must have the same expressiveness as an equivalent symbolic language. That means that it will almost inevitably have the same level of complexity as the symbolic language, and be equally impenetrable to lay-people. One only has to look at the newest versions of UML to see this effect in action.
Bottom line: graphics are great at a high level of abstraction, and as documentation to aid understanding of a symbolic expression, but for implementing complex systems symbolic languages are much better.
Re:Visual design (Score:5, Funny)
First person shooter. Kill the bugs, capture the features...
Re:Visual design (Score:5, Funny)
"/Dread"
Re:Visual design (Score:5, Interesting)
First person shooter.
This reminds me of a cool hack that uses Doom as a "process manager" [unm.edu]. Killing a Doom baddie basically "kill -9"s the process.
Re:Visual design (Score:5, Funny)
Re:Visual design (Score:3, Funny)
Thank you netizens, you may return to your regular visual designing jobs...
Re:Visual design (Score:5, Funny)
Re:Visual design (Score:5, Informative)
Re:Visual design (Score:3, Insightful)
It's like saying "all software will be written in high-level, garbage-collected languages like Java, C#, python, perl, et al".
Rebuttals that "yeah, but what is the Java runtime written in?" or "the OS kernel has to be written in C" are true, but miss the point - these activities are niches, so the original statement is over-general but mostly true. Most application software will be written at a higher level.
Yeah, right (Score:5, Insightful)
Re:Yeah, right (Score:5, Funny)
Screw the 1950's promise of free electricity. Where the hell is my dishwashing, breakfast making, stainless steel life sized Robot?!?!
I want Robots!
Re:Yeah, right (Score:4, Funny)
It's hanging out with your happily house-chained June Cleaver 1950's wife in your fully mechanically automated home of the future!
Re:Yeah, right (Score:4, Funny)
(come on, anyone else here watch SNL in the day?)
Too Cheap To Meter (Score:5, Informative)
Not "free": the exact phrase, from Lewis Strauss, chairman of the Atomic Energy Commission, was:
"Our children will enjoy in their homes electrical energy too cheap to meter." [bartleby.com]
-kgj
Re:Yeah, right (not with bloatware) (Score:5, Insightful)
At one level, hardware can already be free. I saw a small PDA with about the same specs as the original Palm Pilot selling for $19.95. Yet such devices are NOT popular, because everyone wants the latest whiz-bang features on their PDA.
It's the same reason laptops get such awful battery life. I'm sure that someone could create a very functional laptop with a 50 MHz processor that does a competent job running a basic office suite and has superb battery life. As a real-life example, my Psion 5Mx gets 30 hours on 2 AAs and does a great job of basic office work on a 37 MHz ARM processor. You don't need battery-sucking GHz to do the job.
Yet nobody wants to buy "under-powered" devices because they have been trained for 2 decades by Wintel that they must have the fastest machine to get decent performance.
Re:Yeah, right (not with bloatware) (Score:5, Insightful)
If everyone was willing to settle for older or slower hardware, demand for it, and thus prices, would be higher. Did you ever stop to wonder why older or lower-end stuff is so cheap? The people buying the new stuff at much higher prices are essentially subsidizing it.
Where laptop power goes ... (Score:4, Informative)
Right now, the display is the big power consumer in portable devices. The processors have been tuned to use minimal power.
The Psion 5Mx has a B/W LCD screen. How long do the batteries last when the backlight is on?
When OLED comes to laptops, that will significantly increase battery time.
Re:Yeah, right (Score:5, Informative)
Product costings from richest man in world? (Score:5, Funny)
Q.
I dunno, he got some of it right... (Score:3, Informative)
He's just predicted Visual BASIC post factum. Whoopee. (-:
Free (Score:5, Insightful)
Re:Free (Score:5, Insightful)
Re:Free (Score:5, Insightful)
$750 of Microsoft software for a $2500 computer didn't seem like all that much to most people back in the 1990s, but the times, they are a-changin'.
Re:Free (Score:5, Interesting)
The price of the average "IBM" PC sold has dropped by roughly a factor of four since I first bought one in 1989. At the same time, processor speed on these average machines has increased by 50,000%. If this trend continues, and I see no reason for it not to, the average computer in 15 years will have a 10 THz processor and cost $125.
Now, while the cost of hardware continues to go down, the cost of software continues to go up. The number of people who are needed to build the massive applications to make use of 10 THz will be huge. Somebody's got to pay the damn programmers, right? So the price of software will continue to go up. Even if OSS succeeds and the operating system and incidental programs are free, the CUSTOM programs will be expensive.
Therefore, it makes sense to give the hardware as an added bonus with the software. The same way you have cell phones given away with calling plans today. This isn't a Microsoft thing...this could easily be an IBM thing or an Adobe thing, etc.
Please Bill.. (Score:5, Insightful)
Nice try, Bill.
He's saying the tangible parts of the system (the hardware) will be virtually free while the freely duplicated software will not be. Fabrication plants cost millions, each chip has a real cost, each resistor has a real cost. Software, once written, can be copied countless times..
You'd think Bill had a vested interest in all this..
Re:Please Bill.. (Score:5, Insightful)
Yet, hardware has gone down in price from where it was in the mid 80's while software has gone up.
Re:Please Bill.. (Score:4, Funny)
-Grump
Re:Please Bill.. (Score:5, Insightful)
But for software, it is much more difficult to measure improvement in a quantitative sense. It can be done, but not easily if the vendor wants to muddy the waters. I believe feature creep & bloat in Windows is to prevent direct comparisons with previous iterations of the product.
Although hardware costs have come down, it's the result of competition in the free market, easily understood and measured as a physical good. Microsoft OSes? They've become more expensive, with less value added in each iteration.
Re:Please Bill.. (Score:4, Interesting)
Yeah, I also thought this.
But before the Linux era, Billy was actually correct: in DOS times, computers cost about $5000, while DOS itself was less than $100 (full version), IIRC. Today computers typically cost less than $1000, but Windows XP costs $200 (full version, crippled) or $300 (full version, uncrippled).
On Windows servers, the share of the total system price going to Microsoft is even higher.
Also, Microsoft is doing much more against piracy these days (WPA, BSA audits, etc.) than 20 years ago, which de facto translates into yet another price increase.
Even though Bill Gates seems to have the delusion that this can go on like nothing happened, he is wrong: on servers, Microsoft already feels the heat from Linux, and its desktop domination is already showing some cracks.
Re:Please Bill.. (Score:4, Interesting)
Re:Please Bill.. (Score:3, Insightful)
Yes, once it is developed, it can be copied 'freely', in quotes because I do not want to put out the same connotation as others may want to read into this.
To get to that point, you have thousands of man-hours put into the development. After you get to that point, thousands more man-hours are put into maintaining it. Where does the money come from? Are we just going to change our mottos to Off The People For The P
Already now ? (Score:4, Funny)
Hrmmmm.... (Score:5, Insightful)
I suppose that this could be construed as the ultimate embrace and extend (then smother) approach though, right? Get a huge number of companies to support your position and build your company and then overnight, take all of their business revenues over in one way or another.
As for Gates' predilection for predictions... I would like to see fewer grandiose predictions (although speech recognition and tablets and visual programming are decidedly not grandiose, and are in fact products shipping and under development by a number of companies) and more fundamental focus on making Microsoft products suck less.
Re:Hrmmmm.... (Score:4, Informative)
Spare me the obligatory replies about how much cheaper you can do all this with white-box hardware and Linux -- I'm not talking about that, I'm trying to add context to BillG's pronunciamento.
Microsoft leading the way (Score:4, Insightful)
Actually ... (Score:3, Interesting)
Microsoft DOES sell the X-Box AND Human Interface Devices. They're certainly not giving THOSE away. Though if Microsoft could get enough royalties off games, I could see them giving the X-Box away.
In the future, my desktop will cost $20 and my Intellimouse will cost $200. Go figure
Security Still Top Priority? (Score:5, Funny)
"Remember, quality is our top priority." [dilbert.com]
I hope not (Score:3, Interesting)
Re:I hope not (Score:5, Insightful)
Computer Science on the other hand, is a mathematical discipline which involves working out how to do things better, faster, and with less energy. It's about algorithm design, and ways in which to make a computer most efficiently process mathematical representations. It'll be useful far beyond the use of general "coding".
Coding itself is becoming more and more prevalent. I have many friends who aren't even scientists who know how to code, and were even required to for their humanities classes (from English, to History, to Foreign Language). This is a good thing, IMHO. Coding is a great general purpose skill.
Don't devote your life to the practice of programming; devote it to understanding why certain things work better, and how to further refine our techniques of computation. Work on understanding the hardware/software interface, and you open up all kinds of new fields, from embedded engineering to robotics.
Take the hint from the majority of good Universities who teach computer science, where you are simply expected to pick up a language in your spare time, because that aspect is secondary to the theory, and the easier of the two.
Re:I hope not (Score:4, Interesting)
I don't think this was ever true.
Computer Science on the other hand, is a mathematical discipline which involves working out how to do things better, faster, and with less energy. It's about algorithm design, and ways in which to make a computer most efficiently process mathematical representations.
Certainly, certainly, but how is this different from programming? Programmers work out how to do things better, faster, and with less energy. Programmers design algorithms. Programmers design ways to make a computer most efficiently process mathematical representations. And not just mathematical representations, either. All kinds of representations, in fact.
I won't dispute your central point. I think it's vital to make a distinction between hard programming and soft programming. But the gap between the theory and practice just isn't as clear cut with computer science as with other disciplines. There is a big difference between designing an engine and building one. The difference is much less pronounced in software, because at some point the design or description becomes a program in its own right.
Software will never be easy (Score:5, Insightful)
Billy should know better.
Enough (Score:5, Funny)
For the last time, Bill...I still don't want a tablet pc!!!!!
Dave
Re:Enough (Score:4, Funny)
[vikings]
tab tab tab tab,
tab tab tab tab
tablet pc! tablet pc!
There are plenty of valid uses for a tablet PC (Score:4, Informative)
Ugh? (Score:3, Interesting)
It's no different than using scripting languages, really; it'll have its own set of trade-offs.
Ah, visual design in VB (Score:4, Insightful)
I can imagine how (Score:5, Insightful)
You'll get free TCPA enabled hardware but it'll only let you run software by a certain company, software you'll have to pay for.
Yeah, yeah, yeah... (Score:5, Insightful)
Re:Yeah, yeah, yeah... (Score:5, Funny)
Visually designed... (Score:5, Funny)
> He further predicts -- ugh! -- that software will not be written but visually designed.
"Let's start with a blue background that fills the whole screen..."
Of course (Score:5, Funny)
visually designed software (Score:3, Interesting)
The reasons were that it's easier to CVS/grep/replace...
Mr 640k and unimportant internet (Score:5, Insightful)
Wow another great prediction from the anti-psychic Bill Gates.
Sorry Bill, but software is far more replicable than hardware. It's the SOFTWARE that is becoming more free as we go along.
As far as visual goes, I don't think that's correct. He's envisioning a workflow-type application for controlling logic. Diagramming most code is far more difficult than simply writing it. 4GL is a pipe dream.
I DO believe that future programmers will be more like carpenters. High levels of modularity will make custom software construction as practical as cutting and nailing/gluing/screwing together the components down at Home Depot. Programs that ARE sold will be far more extensible (plugin enabled) with managed code.
The future of software is changing. As usual, Gates doesn't have a clue. He was right about ONE thing 30 years ago. He swindled the owners of Q-DOS and IBM. He's been riding that ever since.
Re:Mr 640k and unimportant internet (Score:5, Insightful)
SB
Visual Design is prone to problems (Score:3, Funny)
Yeah, I'm all for visual designing
Gillete model, Consoles, Printers etc... (Score:5, Insightful)
What is worrying is that you can only succeed if you make your product unable to be used for anything else. So for games consoles you have to make it near impossible for anyone else to be able to write software (especially free software) for the device. For printers you need to make sure that nobody else can supply ink.
There's no such thing as a free lunch, you pay one way or another. If the hardware is next to free then the software will be subsidising it. The problem is for this to work for Microsoft they need a PC platform that can't run Linux, so I can see that their inroads into the BIOS, DRM etc... (see XBox for the beginnings of an implementation) are quite worrying.
Of course, there will never be a situation where there isn't an x86 platform that can run Linux; it is too popular in Japan, India and China.
Re:Gillete model, Consoles, Printers etc... (Score:3, Insightful)
Exactly. But at the same time your product must remain interoperable with others on a higher, meta level: your game console should be able to connect to the Internet, your printer should be able to connect to various computers, etc.
So the result of these two trends will be a world of highly specialized, nearly disposable devices you can plug in and out as needed.
Tinkering will be reduced to meta-
Free hardware. Riiiiiiiiiight. (Score:4, Insightful)
How many laws will be purchased by the large companies so Cuecat-esque hardware EULAs will actually have teeth and be enforceable?
~Philly
Puhleeeez...... (Score:5, Insightful)
Hardware costs will fall sharply within a decade to the point where widespread computing with speech and handwriting won't be limited by expensive technology, Microsoft Corp. (NasdaqNM:MSFT - news) Chairman Bill Gates (news - web sites) said on Monday.
This looks like a quote from 10 years ago talking about today. In '93, an "entry level" PC cost upwards of $2000. Today, an entry-level machine that is far more capable costs only 10% of that. Not to mention that the $200 price tag represents a now minuscule fraction of most people's income.
I would say that hardware is already "free" when compared to software. This is because you can buy a $200 machine (real tangible manufacturing cost per unit) and put a $200 copy of Windows (with no real production cost) on it. I am sure that hardware prices can go lower, but hardware is already a commodity. Software has yet to become a true commodity.
been there, didn't do that (Score:4, Insightful)
Now we're running on Unix, and saving money. Bill's just blowing smoke, telling us his dreams.
this is *his* vision.. (Score:3, Insightful)
For Microsoft and for a lot of other companies, I think the realization has dawned that concentrating on hardware is a losing proposition... (Hello, Sun? Are you listening? Maybe you know better than these guys.) As a counterpoint, though, I'd like to offer Apple and their iPod/iTunes strategy: offer software on the cheap to push out the hardware.
You might upgrade your machine once every 6 months to a year. However, your software would be service oriented, so you'd be bled dry as updates, small missing features and patches were charged for. A constant stream of revenue, with margins that can't be squeezed out by competing manufacturers and improving manufacturing processes. A steadier way of earning revenue, if you will. This is what I would imagine Microsoft to want.
Here's the problem, though. The free software genie has been let out of the bottle. Just like the lowered price on the XBox made several people (myself included) think about buying one for a low cost machine and installing Linux on it, if there is a free software alternative that will run on this free hardware, you will get people using it. Ultimately, this will just lead to stronger protection against "illegal" modifications to the software.. For example, if you get a PC free, you must run Windows on it, and never format it to install Linux.. something along those lines. He wants it. I personally do not. Cheaper hardware is good, but I want choice in what software I use and I don't think being locked into one company will offer me this.
I agree with his point about visual software, though. VB was tremendously popular for that reason: because it let people quickly design interfaces and software that sort of worked. For folks who don't do programming for a living (and maybe a few who do), the thought of whipping out something that they can actually use on their own computer is a tremendously appealing notion. More than anything else, Visual Basic helped a whole new bunch of people (who might otherwise have not programmed at all) get into the software industry. The problem is: who will write the server-side software? Who will perform the tweaks? Who will administer and optimize and tune things? The need for programmers and for code crunching won't go away overnight, and I doubt it will go away at all. There are advantages to textual representations (as opposed to visual ones) in existing tool support, and there are also advantages in that textual means of representing a problem work across many different paradigms (not just client interfaces).
MS Labview? (Score:3, Interesting)
I've used LabVIEW just for writing programs to link to IEEE hardware, and it certainly is much easier to deal with a large number of modules when they're visually represented, and very fast to kludge together a quick fix.
The only thing is that the debugging and maintenance is a nightmare, because unlike normal C/Fortran, not all of the program is visible at once (it's in a thousand tiny blocks), and so looking at several related bits of code is very time consuming. So much so that we recently rewrote some LabVIEW code in C, just to improve the clarity and maintainability.
Visual designing is a reality now! (Score:4, Interesting)
The designer will create the proper UML diagram to represent the structure and some dynamic aspects of the application in a platform-independent model. Then we apply some code generation templates to generate the code for a targeted architecture.
If we go a little further with the code generation, we could actually implement most of the business logic structure based on sequence diagrams.
For the front end, while it would be hard to generate a really nice interface, we can generate what needs to be put on a screen. Then it's a matter of applying a CSS or using a visual editor to reposition the components on the screen.
I can see that in 10 years, most of the business code will be written that way... Note that one of the premises of this happening is that proper analysis and design must be done. For that we must change the mindset of a lot of people.
As for people fearing for their current developer status... these people will have to grow up and start doing real development instead of using the "use the Force" approach. And for really good developers/architects, there will always be a need for someone to define an architecture and create/maintain the templates required to translate the visual design into real code. And there will also be a need for good developers to write code to implement the complex algorithms required by some applications.
Anyway, writing most business-related code is boring and repetitive, so why not generate it!
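For what it's worth, here's a toy sketch in Python of what that kind of template-driven generation boils down to; the "model" format and the emitted code are invented for illustration, not any real MDA/UML tool's format:

    def generate(model):
        # Hypothetical mini "model-to-code" template: emit a class with a
        # constructor that stores each attribute listed in the model.
        name = model["class"]
        attrs = model["attributes"]
        args = ", ".join(f"{n}: {t}" for n, t in attrs)
        body = "\n".join(f"        self.{n} = {n}" for n, _ in attrs)
        return f"class {name}:\n    def __init__(self, {args}):\n{body}\n"

    model = {"class": "Invoice",
             "attributes": [("number", "str"), ("total", "float")]}

    print(generate(model))

Real tools are vastly more elaborate, but the principle is the same: the repetitive structure comes from the model, and the interesting logic still gets written by hand.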
Nostradamus 2.0 (Score:5, Funny)
* In the future, every home will have a robot that carries a machine gun.
* Cars will not only drive themselves, they'll demand equal rights.
* Computers will be made only of light and sound.
* Computers will learn to upgrade themselves - not because of initial programming, but as a survival mechanism to prevent obsolescence.
* IT outsourcing will be controlled in some sectors by organized crime and gangs. This will start in Las Vegas and move outward.
* Email will be beamed directly into your brain. You will be able to type an answer in your head.
Step 2: Wait for at least one prediction to come true ( even slightly true ) and be declared a prophet.
Re:Nostradamus 2.0 (Score:5, Funny)
Step 3: Prophet!
There's free and then there's "free"-tm (Score:3, Insightful)
The only way for the computer to be "free" is the way cell phones are now "free". When you sign up for 3 years of an M$ OS/Office suite subscription, MSN broadband with obligatory Passport support for on-line shopping, and agree to transfer $500 to that account, you will get a "free" PC. Ignore the fact that you will be paying $100/month to M$ for that "free" PC.
This fits in well with the M$ philosophy of business - they don't really care what "product" they sell as long as it comes with a M$ EULA and license. Check my journal for a more detailed look at the M$ business plan.
=tkk
Visual programming - snort! (Score:5, Insightful)
Sure, there are isolated instances of it being useful, mostly in drawing flow diagrams for signal processing, but that's far from the general case.
Other than that, though, it's been a miserable failure. Software doesn't look like anything in real life, and real-life metaphors are effectively useless for manipulating it. Ever tried to use a multi-level UML diagram, where each box contains boxes that contain boxes? That's what visual programming looks like. A confusing, ultra-hyper-dimensional object, where every detail is critical (even the ones you can't see), and where understanding a system requires hundreds of little abstract entities on the screen.
Software has more "moving parts", by orders of magnitude, than any other human endeavor; the largest software projects dwarf the complexity (in part count) of even the Space Shuttle. (We get away with it because we use effectively 100% reliable parts, whereas the Space Shuttle does not; the problems that causes and the solutions they require mean the Space Shuttle is still, IMHO, a superior engineering work to an office suite. Nevertheless, don't make the mistake of underestimating the complexity of software; even the smallest program can dwarf a small car in complexity.)
With a clearer understanding of what is being asked for, it is easy to see that visual programming has been a disaster for fundamental reasons, not ones that can be abstracted away. Imagine the Mozilla source code. It contains megabytes upon megabytes of code. Each and every line must be represented to understand the whole correctly (although no one person may need to understand the whole.) One way or another each line must be represented on the screen; if you're trying to do it "visually", then you're hosed. You can't abstract "(cutcrn*)DO_LOAD((void *)nm_mungl, andlefle->getLumpiness(MAX_LOAD_LUMP_COUNT, (int)uniQuad), USER_MACROS(LOAD));" visually, because you'll either lose critical information, or have an unusably cluttered screen.
There's just no way around it.
"But what if I design special modules that can be hooked together cleanly?" Then you'll have special modules that can be hooked together cleanly, as long as they do exactly what you need, which they won't. We also have tons of experience with such special modules, and they never work completely in general. You can build a DSP out of such things and that's about it... and even then, that's just compositing the existing DSPs together, I wouldn't want to build the insides in a visual language in the general case. (You could get some milage out of it, but you'd still be shelling out to text code.)
You think I'm wrong, you think you have some clever way to reduce the amount of necessary information on the screen without throwing away something the user needs, show me the code. To date, nobody else has managed that, despite a lot of trying by smart and dedicated people, and given that we clearly don't need faster computers to do "visual programming", I think you ought to consider that a damned big clue before you consider punching the "Reply" button and making vague, hand-wavy gestures to the effect that I'm wrong.
Consider the source: I think there's a reason you're hearing this from Bill Gates, who probably hasn't coded significantly in decades, and not the
Gillette model? (Score:5, Insightful)
Basically, what he's saying is that hardware prices will drop to the point where they can charge for software and give the hardware away for free. I find this quite ironic because it used to be the other way around - sell the hardware at a premium and toss the software in for free.
If I had my way, hardware prices would drop to nothing as Bill proposed, and I'd create free software for it, making it a free-for-all... nah, it'll never happen, but wishful thinking
Why hardware won't become free, or even close (Score:5, Interesting)
almost free hardware makes sense (Score:5, Insightful)
I bought a 333 MHz Pentium II-based PC in 1998. For software development and everything else I did, it was fine for the following five years. I finally upgraded to a 3 GHz P4, just because it was cheaper than upgrading the OS and various parts individually. In my timings, this PC is roughly 15x faster than the old one, plus the video card is at least 10x faster than the one I bought in 2000. This is a lot of power, and it's the least I've ever spent on a PC.
Or consider game consoles. A $150 game system is more powerful, in all ways except memory, than a computer from 5-6 years ago. Video-wise, they're much more powerful. Next generation consoles are going to outrun current desktops...for $200.
The short version is that computers get more powerful, then they get cheap. At some point power ceases to matter, especially if you have a GPU or video compression chip to offload lots of work to. Imagine if a 2 or 3 GHz chip could be made to run at 10 watts of power and cost $5. For a 65nm PowerPC, this is reasonable. What's needed is economy of scale. An alternate approach is that "low end" processors in cell phones and digital cameras get to where they're fast enough to usurp a desktop. Then put a video compression chip in there, or other custom hardware, to do the bulk of the work. At $20 for a complete system, that's a big deal.
Or even consider alternate, custom CPUs. An x86 desktop CPU is expensive because it includes all sorts of junk, like MMX support and 16-bit mode and legacy instructions and SSE2 and all this other marginalized stuff. And still they're too general purpose. C++ doesn't matter any more. Well, it matters because it's "fast," but not because people really like it. C++ doesn't make you happy the way Haskell or Python or Smalltalk do. Take a minimal instruction set designed to support one of those languages, then implement a simulator for it, then an FPGA, then an ASIC. Keep it simple, keep it fast. You could easily have a 20MHz part pacing high-end desktop processors for most tasks. Again, combine this with an ASIC for doing heavy lifting like graphics and compression.
I don't think so (Score:5, Insightful)
I'm sure Gates would like the entire $1000 to go to Microsoft, but that's not going to happen. It's not going to happen because Microsoft isn't going to produce $900 worth of software that is capable of running on whatever $100 buys you in hardware. That's not a problem with hardware design, it's a problem with the kind of software that Microsoft develops: big and resource intensive.
On the other hand, you will probably be able to get a really cheap computer that runs Linux and runs it well. We are already beginning to see this with Mini-ITX and Nano-ITX systems: they run Linux so much better than Windows. For $200, you get a full desktop system capable of pretty much everything that a home user needs.
What really helps Linux is that it doesn't have to push an agenda or "innovate" constantly. If a 1995 word processor written in C runs fine on $1000 1995 hardware, it will run really well on a $100 2005 Mini-ITX system, with a few bug fixes and feature enhancements. Microsoft's new
That's odd - visual design of software (Score:4, Insightful)
Methinks the emperor has simply announced he wants a change of fashion, and all the trendy loyal subjects in the kingdom have to change their style to fit in.
Comment removed (Score:5, Insightful)
BS. real speech recognition is far away (Score:4, Insightful)
Gates and other marketing experts are managing expectations in the wrong direction. They promise something that they cannot realize. What common people understand when Gates talks about "real speech recognition" is a computer that will analyze your input in a noisy environment (where it matters most: out on the street!), contextualize it with what you've said before and with what's on the screen and with all the things that we call 'common sense', and then react accordingly.
A lot of these things are possible in very limited, well-modelled domains. But not in applications for 'real users' that deal with a variety of information. And it won't be there in ten years. There are many hard problems to solve, both in defining what is actually linguistically the case (or how to learn it from a corpus), and in implementing on sequential machines the processes that happen in parallel in our brains.
It doesn't help if Gates and co promise the world and hope that their scientists will deliver.
Re:All this on 640K? (Score:5, Funny)
About 640,000 shares - should be a big enough investment for anyone!
Re:Another Quote (Score:5, Informative)
Re:Another Quote (Score:4, Informative)
Re:I Hate The Bastard But He's Right. (Score:4, Insightful)
Yes! Death to text! If you need proof that communication can be more efficiently done in a visual medium, you need look no further than this very web site where we converse exclusively using...
Oh. Never mind. Move along, nothing to see here.
Re:Cost of hardware =0? (Score:5, Interesting)
It's not that insane, and remember that Gates is a billionaire in part because his company has been abusing its monopoly. To an extent, if he wants something to happen, he can make it happen. What he means by "free" is that users will "subscribe to" software and, in doing so, receive a machine on which to run that software, effectively for nothing. This is what Microsoft wanted to accomplish by bullying retailers not to bundle other operating systems. My guess is that they will attempt to use "Trusted Computing" (or some technology just like it) to make their intention into a reality; if you want to run Microsoft's software you will have to run it on computers which only run Microsoft's software or software written by Microsoft's partners (in other words, companies which have bought the right to have their software run on MS's hardware). So they can make the cost of hardware approach zero, so long as they can be sure the hardware is only usable for purposes for which they can make some money. Of course, all of this depends on governments around the world letting MS get away with it.