Hardware

When Your Hardware Isn't Obsolete Soon Enough

GrandCow writes: "There's a really interesting article over at Sharky Extreme on why the computer industry has been slowing down recently. He talks about looking for the killer application that will make him go out and spend big money on a whole new system... and can't find it. It's a really good read. For lots of the people on /. (me included), getting the latest hardware is a given, but for many people there's just no real reason to." Strangely, the proposed solution seems to be for the software industry to write bloated code so people will have more incentive to replace their currently-OK PCs. (Huh?) All I want is a machine on which Broadcast 2000 works.
  • Pshew, telephone service was never mandated for anyone. There's probably many an old coot without phone service to this day.

    What the New Deal rural modernization programs did was force the telephone systems into "Universal Service", which means that rural customers essentially were subsidized by urban customers. In some areas, the feds also formed subsidized co-operative phone companies to accomplish the same thing.

    Since the end of Universal Service in 1996, if you found yourself on a farm with no phone line, the phone company would want to charge you for all the poles, wire, and labor to get the line to your house. That could be pretty pricey if you were in the middle of Montana or something.
  • I'm thinking primarily of 'X-Plane'. This is an aero sim that I'm very fond of- a real hacker's playground, because unlike all other flightsim games, X-Plane takes a physical design you give it, with weight, balance, airfoil descriptions etc., and literally _models_ the whole pile of parts ten times a second to produce a flight model (a toy sketch of the idea follows below).

    It works well enough that it can model anything from an X plane to a 747 to a Cessna 172 with enough plausibility that people gripe about small details of the flight handling, without even recognising the brilliance of the accomplishment- the program doesn't know it's trying to model a 172, it's just interacting with the flight surfaces as they are given to it. You can put together absolutely anything no matter how weird and it will model that with the same degree of pretty-much-accurate-mostly.

    I play with this sim on a 300 MHz G3, and get almost 20 fps out of it, so by itself it is not that killer app. But- the compelling quality about it is that capacity to model real aircraft behavior from a given design. Well- one word: turbulence. Spins, stalls, nonlinear airflow, interaction of control surfaces- it already models propwash, but the fact is, 100% true modelling of airflow is not possible. Turbulence is chaotic... and yet, the closer you get, the more compellingly interesting the problem becomes.
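
    The tick loop amounts to something like this toy Java sketch (every name here is made up, and the real thing resolves lift and drag per element along every surface -- don't mistake this for X-Plane's actual code):

        // Toy "model the parts every tick" loop, NOT X-Plane's real internals.
        class Surface {
            double area, liftSlope;  // geometry read from the user's design file
        }

        class FlightModel {
            static final double DT  = 0.1;    // ten updates per second
            static final double RHO = 1.225;  // sea-level air density, kg/m^3
            double vx = 50, vz = 0, mass = 1100;  // crude 2-D state for a small plane

            void step(Surface[] surfaces, double angleOfAttack) {
                double q = 0.5 * RHO * (vx * vx + vz * vz);  // dynamic pressure
                double fx = 0, fz = -9.81 * mass;            // start with gravity
                for (Surface s : surfaces) {
                    fz += q * s.area * s.liftSlope * angleOfAttack;  // thin-airfoil lift
                    fx -= q * s.area * 0.02;                         // crude parasitic drag
                }
                vx += (fx / mass) * DT;  // integrate the summed forces over the tick
                vz += (fz / mass) * DT;
            }
        }

    The point is that nothing in a loop like that knows what airplane it's flying; the behavior falls out of the geometry you feed it.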

    An unhappy side-note: I've had to drop out of the upgrade cycle for X-Plane, even though I've paid for 5.* series upgrades, because the most recent ones, with drool-inducing GFX improvements and flight model refinements, are coded to require MacOS 8.5 or 9. I'm sticking to 8.1 so far, because I can control it and have hardware that's known to work with it (and some of that hardware is potentially troublesome under later versions- my glass-lens ADB Color QuickCam, for one, has already been rendered incompatible with recent versions of QuickTime). So even though my hardware will mostly support the very latest newest X-Plane- my other software will not. And so it goes- I'll have to catch up with X-Plane when I'm good and ready to run a substantially more recent computer, one that's a better match with the newer software etc.

    So it's not only the digital unwashed who don't upgrade- some people who are very informed on the details of the systems also will resist, _because_ they are capable of establishing 'snapshots' of their computer systems in which any faults are known and avoidable, and the behavior is predictable... and will sometimes pass up 'newer and better' because it is also 'newer and untested'...

  • You will need five channels and that means bigger files, but more importantly more redundancy between channels to be compressed, which means more CPU cycles

    But the point is that you _don't_ need that for music. "CD quality" really is the upper bound on what people need from audio for music. They just don't care about having more than 2 channels. (Remember quadraphonic audio?) The vast majority of people don't even care about better-than-stereo audio for movies. (See the arithmetic below for what those extra channels cost.)

    I bought a Pro Audio Spectrum 16 sound card almost 10 years ago. It was one of the first that was capable of CD-quality output (44.1 kHz, 16-bit stereo). There is still no good reason for me to want to upgrade it today -- it is every bit as good as a Sound Blaster Live! for playing MP3 files, etc.
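
    For scale, here's the raw PCM arithmetic (throwaway Java; "5.1" means five full channels plus the LFE channel, six streams in all):

        // Raw (uncompressed) PCM data rates: CD stereo vs. 5.1 surround.
        public class PcmRate {
            public static void main(String[] args) {
                int rate = 44100, bits = 16;
                long stereo   = (long) rate * bits * 2;  // ~1.4 Mbit/s, the CD figure
                long surround = (long) rate * bits * 6;  // ~4.2 Mbit/s before compression
                System.out.println("CD stereo:    " + stereo + " bit/s");
                System.out.println("5.1 surround: " + surround + " bit/s");
            }
        }

    Three times the raw data before the codec even starts hunting for redundancy between channels -- and still, nobody asked for it.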

  • by heroine ( 1220 ) on Saturday April 14, 2001 @12:58PM (#291521) Homepage
    Compaq donated about half a million dollars in hardware to promote Broadcast 2000 at this year's National Association of Broadcasters convention, so if that's the application that's supposed to drive consumer sales, you're probably looking too high.

    Really, there is more effort being made to downsize applications to handheld devices than to upsize applications to supercomputers. Regardless of how smart the mainstream industry is being about this, one thing is for sure: consumers still prefer bigger SUVs over mopeds.
  • by mattdm ( 1931 ) on Saturday April 14, 2001 @09:57AM (#291527) Homepage
    IE 1.0 was released with Windows 95, in August of 1995. From all reasonable accounts, IE wasn't very good until version 5, which was released March 1999. That's three and a half years.

    The Mozilla project started with Gecko in Oct 1998. Even if you start with the less-charitable date of April 1998 (when the Communicator source was opened -- and turned out to not actually be very useful), it's still only three years 'til right now. (And remember, IE didn't start from scratch -- they began with the Spyglass Mosaic codebase.)

    If you look at the current Mozilla roadmap [mozilla.org], even the "if we're unlucky" plan calls for 1.0 to be out by Q3 of this year -- plenty of time to beat IE 5.0 by your suggested metric.
  • OMS is a bit of a processor hog-- but that's mostly because it's difficult to get specifications on the Motion Comp and iDCT features from video card vendors.
    If you play at the highest resolutions and with the highest level of detail, some games are pretty demanding-- although performance is usually dependent on the make/model of video card.
    One site (Tom's Hardware, perhaps) benchmarks processors with an MPEG2-to-MPEG4 conversion test.
  • by sheldon ( 2322 )
    Heh. I have a two year old PIII-550 which I run Windows 2000 on and really have no complaints.

    In fact my Win2k server at home is a PPro 200, and it operates perfectly as a file/print server, has my scanner and CD-RW on it and I use it as a development web server.

    So I just gotta ask. What Microsoft bloat? I'm running the latest greatest software on computers that are 2 and 4 years old respectively.

    Heh... 700 MHz computer to run Linux. *laugh*
  • by sheldon ( 2322 ) on Saturday April 14, 2001 @11:03AM (#291530)
    DOS 5.0 has given new life to all of my old obsolete hardware.

    I tried running Linux on my IBM PS/2 Model 70, and it was slow as a dog.

    Even after upgrading it with a Cyrix 486 processor, a 387 math coprocessor, boosting memory to 12 megs and hard drive space to 200 megs.

    It was still slow running Linux!

    But now with DOS 5.0 I can run Lotus 123 and Wordperfect 5 slick as a whistle I tell you!

    But on the positive side, I now have a top of the line computer for 1991!
  • I recently got a new machine. It might be better to say "new" machine, as it's not a Power Mac G4. But it still is under warranty, and had not been taken out of the shrinkwrap. My "new" machine was a Blue & White Power Mac G3 (Blue G3), and it has been perfect for me. It's got USB and ADB (Apple Desktop Bus, for keyboards, mice, etc.) ports, plus FireWire (no SCSI, alas), three PCI slots, a relatively fast processor (which I'll make faster once it's out of warranty), decent video card, etc.

    The great thing about it for me is that it can use almost all of my existing peripherals (USB 3-btn mouse, ADB Kinesis keyboard) without requiring special PCI cards or funky adapters. It seems blazing fast to me (says the man who still loves the 200 MHz PowerPC 604e), and it does everything I need to do perfectly. SSH, web, e-mail, an occasional game. It's great! Server functions never slow it down. Honestly, I don't need a G4.

    I am not the only person using "old" hardware. Look at the winning responses to our Mac OS X Celebration Essay Contest [linuxppc.org], and you will see people who could take LinuxPPC, install it on a Power Mac 7200 (75 MHz PowerPC 601), and turn it from something that took up closet space into an effective AppleShare (netatalk), Samba, web, or firewall server.

    That said, a lot of people are installing it on their dual processor G4s and PowerBook G4s. A lot of people. The responses that mentioned these machines said "I want to unleash my machine, so I'm installing LinuxPPC." So, Linux actually has driven new hardware sales. It also helps old hardware that's in the closet at the moment, as it can run on machines that Mac OS X never will run on.

    You can definitely say that it is a benefit to everyone, regardless of how new or how old their hardware is.

    Haaz: Co-founder, LinuxPPC Inc., making Linux for PowerPC since 1996.
  • I know I'm still using my 5 year old 486 as a firewall, mail server, ftp server, web server, samba server, dns server, nfs server, etc. All thanks to Linux.

  • Assuming that you really are ignorant here (not stupid, just not yet informed on the particular topic under discussion), or that others reading this are, there's an old saying/proverb/parable/old wives' tale/whatever about boiling frogs. If you stick one in a pan of boiling water he'll do his best to jump out, same as would you or I. If you stick one in a pan of room-temperature water and slowly raise the heat under the pan, he'll sit there not realizing anything is wrong and eventually cook to death. I don't know if it's really true or not, but the concept of gradually imposing something undesirable so that you don't realize it's being done until it's too late is very real and happens all the time.
  • It's not done yet... not even version 1. Compare it to IE 2.0 and then talk trash.

  • If you're referring to Netscape 6.0x, you're 100% correct. :-) Netscape really sucks d***** b**** due to its slowness on anything less than a Pentium II 233 MHz machine, and it has trouble rendering many web pages, too. :-(

    I believe that Mozilla 0.8 is a bit better, though.
  • I think Black and White may work well even on a Celeron 466 MHz machine just as long as you have enough RAM (128 MB minimum!) and a decently fast AGP graphics card (e.g., Matrox G400 with 32 MB of video RAM).

    It won't exactly zip along like it does on a 650 MHz or faster CPU, but at least the program actually works at a reasonable speed.
  • Actually, it really depends on the vintage of the motherboard.

    If you're using a motherboard that supports the earlier Pentium IIs but uses the Intel 440LX chipset (which means it has AGP, ATA-33 IDE ports and 168-pin SDRAM DIMM support), the best solution is to get as much RAM and as big a hard drive as you can afford. With the price of PC-133 SDRAM DIMMs going for US$40 per 128 MB and 40 GB hard drives selling for just over US$100, there's totally NO excuse to swap out your motherboard when RAM and disk alone could get you as much as a 50-75% increase in performance. :-)
  • One of the reasons why I have invested time and money in Linux is because it has given new life to my old and obsolete hardware. In fact this is one of the selling points that Linux people often quote. Perhaps Linux is contributing to a lack of interest in new hardware?

    Even in the Windows environment it's not really necessary to get the latest motherboard. A very easy way to promptly boost Windows performance is a combination of a system RAM upgrade and a 7200 RPM ATA-66/100 hard drive. With the price of SDRAM DIMMs going for US$40 per 128 MB and 40 GB ATA-66/100 hard drives going for under US$115, there's no excuse to upgrade the motherboard, especially when you can get as much as a 50% to 75% increase in performance without it.
  • I run Mozilla on a P133 every day and it runs as well as anything else. People who think it's bloated obviously don't use it. At least Mozilla doesn't crash as often as Netscape. Thank gawd! :)
  • I use consoles to play games myself, but I can tell you I've been tempted to buy a bigger PC w/ my evil nemesis OS running just to play some groovy games. It is sometimes very tempting. If the game itself was free, cross-platform (Linux, Windows, etc), and awesome, I'd probably drop the cash for a new computer. I don't even consider myself a major gamer. I play a game maybe once a week.

    Something like Everquest that was free to use would be a good way to lock people into an upgrade cycle to keep playing their games.. and you could actually USE the extra power rather than go with BLOAT. Social games are highly addictive.
  • My gawd.. as a web developer I hate IE 5. It's even worse than IE 4 in many ways. Even the newest versions w/ all fixes applied are still buggy as hell. I like to test in every browser I can find, and IE is always the one that gives me the most pain. It also crashes far more than anything except Netscape. Those two (the most popular) are the worst. If you only do minor web development you might not notice, but I do full-fledged programs and Intranet sites, and IE5 is my worst nightmare every time. It even acts quite a bit different between the Windows and Mac versions.
  • Mozilla runs for me for days at a time under heavy abuse, but then I hand-pick the builds I use. As it's still in development, some days' builds are much, much more stable than other days'. Netscape is just screwy. I honestly hate Netscape almost as much as IE. It tends to work better than IE when it is working, but it crashes often and I have to wrap it to keep its memory leaks from messing with the rest of my system.
  • by MikeFM ( 12491 ) on Saturday April 14, 2001 @07:59AM (#291550) Homepage Journal
    Is this surprising? I've been wondering why Dell or other slipping PC companies haven't done this. To keep the PC market hot they have to drive people to buy PCs faster.. most games I've seen work on less than 500 MHz machines, so why buy a 1.5 GHz machine?

    If I were Dell or a similar company, I'd fund a game that could be awesome.. given away for free.. and designed to be played on 1 GHz+ machines. That'd be a lot of power a game could take advantage of, and would be the perfect reason why games could work as open source. The key is to make the game fantastic so people will really want it badly. :)
  • I still have a Pentium 166 (overclocked from 150 - pushing the envelope here :) that still works perfectly well as a desktop. It runs W2K, Office 2000, and most everything else I throw at it, but alas, no modern games. It is 6 years old now, I believe. Granted, the original hard drive is too small to hold anything but the operating system and the original 16MB of memory was laughable, but I stuck a new 16 gig drive in a few years back, and I upgraded the RAM to 92MB when EDO was still cheap.

    -josh
  • First of all, good luck on getting a dual P4 up and running. If you had a hard drive with a higher spindle speed and lower seek time, your programs would start up faster. Your application developers also need to work on toning down the file system access their fucking programs need just to open. Look at ICQ: everything the program needs/uses is contained in its library files. There's dozens of the fucking things to open and thus maintain in the file system. Office2k is the same way, as are so many other Windows apps.
  • What the fuck are you suggesting? Why would I want to write toolkit code (GTK, Qt, etc.) in a language that has to be run in a virtual machine? The point of writing shit small and fast is so it doesn't overuse your system resources. Why should something relatively simple like animating a widget interaction take 80% of your processing power? I can't believe someone marked this drivel up to 3 as interesting.
  • You've got to be joking. Java and/or Objective-C is going to run slower in the long run than the equivalent C++ program (don't say C/C++ when comparing it to an OO language like Java, it's bad semantics). The language running inside the VM has more overhead for external operations. Ever use emacs? FUCKING SLUG! It's written in Lisp, oddly enough. What exactly does Java do that you aren't going to be able to do in C++? Don't blame C++ for the memory footprint of Konqueror and KDE.
  • ...as I've had absolutely no problem with the "need" to keep up with the latest hardware. The author of the story went through and looked at a selection of games, a _very_ small selection of games, and found two with system requirements close to his system. Then apparently he didn't buy them because he mentioned that they would most likely run fine on his system. His error is in not trying.

    The author himself mentioned that while he had a 486/66 system when he tried out Doom (a game that requires a 386, FAR inferior to that 486), he made the decision to go with a Pentium 90. Looking at system requirements won't do anything for you. Playing newer games that only require a 200-300 MHz processor, such as Alice, Serious Sam, No One Lives Forever, or MechWarrior 4, only requires a certain type/speed of processor, but to make them run well (read: high framerate), you're going to need something quite a bit faster.
    When you're using a Pentium II 350, a GeForce2 will be a nice boost (depending on the existing video processor you just replaced), but it won't give nearly the boost that it would if you had a faster processor...so just slapping in a video card really isn't the answer.

    There are plenty of apps out there that will stress this guy's system, but looking at system requirements ain't gonna show him jack. We all know that the new DOOM game is going to rock everyone's world and be that killer app, but until then, don't say there isn't anything comparable...because there's lots out there.



    -Julius X

  • I can definitely believe Cisco would be working on these bandwidth-sucking apps, but I'm curious if you know of some companies that have been spun out of Cisco. Ironically, Cisco is known for buying tons of other companies, not spinning them out! :-)
  • Yes, Linux can bring new life into old hardware... I used a little P90 laptop with a text console for the longest time....

    If you want to run netscape (or mozilla), play video/mp3, use all the gnome shit and whatnot, you need a decent machine, similar to what you run windows on. Sure you can cut corners here and there.. but you still need new hardware.

    The real point of this is that hardware has currently surpassed software; people don't need Ghz machines at home; the apps they have run just fine on something half that speed. We're at a point where the cycles are just plain wasted.

  • by LL ( 20038 ) on Saturday April 14, 2001 @10:14AM (#291572)
    OK, let's look at the typical user's attention span. My guess is that you're looking at the meat-space interval between 0.1 sec and 5 minutes (roughly the time for a human to respond to an event, and the time it takes to go for a coffee break). Outside this response range you're looking at either direct embedded computer connections or distributed systems. So the performance requirement is that in the period of 0.1-300 sec, you've got a certain CPU (say 1 GHz) for software to do its thing (see the rough cycle budget sketched below). Now the human bandwidth is typing, reading, manual-vector (mouse/joystick), psycho-visual processing, listening, voice, kinesthetics/haptics... if you look at the possible algorithms and the time-lines, you'll note that it takes about two decades for CPU power to fully address each human I/O mode. So the 60s-70s gave us teletype + TERMCAP, the 80s-90s were pretty much the WIMP era, and the next decades are probably going to be voice/sound combos (keep in mind the minimum 0.1 sec response requirement to signal feedback). I don't think we're going to see real VR in the mainstream (i.e. the stage where your granny can use it) until maybe 2020+, when the cost of development equals the time for obsolescence (~5-6 years @ 2 CPU generations). Keep in mind the basic business model of the computer manufacturing business: they need to recover plant costs before forcing an upgrade due to "lack" of parts. For the consumer to accept the disposable theory, it has to be within a certain price range ($1K-$5K / 5-6 years???). Now within this basic allocation, they need to divvy up expenditure across hardware/software.

    The point is that Moore's Law goes on quite happily, but our human limitations (until someone hacks in a direct brain-connect) restrict the cost-performance range required of computer devices. The supply of software is limited by (IMHO) flawed IP laws, so it makes sense for a company to be vertically integrated and self-contain its software internally rather than specialising in specific functions. Hence the inability to scale software complexity, since the average high-tech firm just has too many hungry mouths to feed (hey, the MBAs need a salary to match their egos) for the market to sustain. Frankly, given that the current usage of the information economy is entertainment, news sensationalism, peer communication, telepresence, and (trailing far, far behind) education, it's hard to see killer CPU-intensive applications which absolutely require denser forms of media.

    The upside is that we're spared from 3D virtual spam for another 15 years.
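
    (The rough budget above, as a throwaway Java calculation:)

        // Cycles a 1 GHz CPU gets inside the human response window.
        public class CycleBudget {
            public static void main(String[] args) {
                long hz = 1000000000L;  // the 1 GHz figure above
                System.out.println("0.1 sec: " + (long) (hz * 0.1) + " cycles");  // 100 million
                System.out.println("300 sec: " + (hz * 300) + " cycles");         // 300 billion
            }
        }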

    LL
  • Thank God for Java IDEs! Between NetBeans, JBuilder, and the (still not as good as Win32) state of Linux JVMs, I'm ready to keep the hardware industry booming for a LOOONG time.
    Ooh, and add to that the speed with which gcc compiles C++ code. Ever try to compile KDE from scratch??? Once they finish with precompiled header support, though, it'll be bye-bye workstation industry... ;)
    --JRZ
  • I don't see what the fuss is ... sure I don't get 100 FPS on my PII-400 and TNT2, but come on, this isn't a twitch game. Pretty amazing graphics for the framerate actually. The last Ultima was much worse.

  • Nope. What your cycles/RAM will be occupied with in the future is running 12 different versions of MSVCRT.DLL (or glibc) and flashing 3D-rendered banner ads at you.

    New software is like foam; it fills all available space. Even my tax program seems like it needs a 1 GHz machine to run properly. The problem is that most developers are divorced from thinking about performance problems because it's absolutely last on their list of TODOs ('cept for game programmers, obviously), and if their program swaps for a few seconds, who cares? Their users will just spend a little more time in the AOL chat room.


  • Bloated code is good for marketing! No one wants to spend $300 on a measly 1 MB .exe. No, I want to see a 23 MB executable, 42 megs of DLLs, and preferably help files with uncompressed 32-bit images embedded. Oh, and 3D animated renderings of my desktop assistant, Sal the Tufted Titmouse.

    Yes, the "Hello World" of the future will barely fit on a DVD.
  • The article makes a great point if we accept that a killer app has to be software. There really isn't any software out there right now that my 2-year-old Celeron 550 can't handle.

    But: I went over to a buddy's house to help him hack together an FTP server on his 1 gig Thunderbird system, and all I can say is wow. I want. Okay, he's not doing anything I can't do, but everything happens virtually instantly. Need to install a program that would take my machine around 30 seconds to unpack? His is done in three. Three! Dialog boxes that I see for about ten seconds appeared on his screen so fast they couldn't be read all the way through before they were gone.

    There's no software I want to use that requires this kind of speed, but I want it. I mean, they're not opening up any new highways in my area, and a Geo Metro will get me to all the same places as a Mustang, but I'd still rather have the Mustang.

    -N

  • I VC with my friends and family all the time... It takes me 3 rings at most to answer the call (it stops ringing after that anyhow and tells them I'm not around)... where did you get this 20 rings thing from?
  • I dunno... the big reason I tossed another 64M of RAM in my box was to appease X 4's memory-hogging nature for a little bit longer. That, and editing four multi-layer print-size images in the Gimp while listening to XMMS and running FoldingAtHome...

    -grendel drago
  • Wasn't there an article here a few days ago about how mind-bogglingly big the hardware requirements were for Black and White?

    The article seemed quite nervous at the idea that most people would see awful performance on their current hardware... but given the astounding hype surrounding B&W, it could be that killer app... especially given the developers' histories of making groundbreaking games. If the legions of fans need a P4-1500 to properly experience it, a large contingent of them will spring for the hardware.

    -grendel drago
  • by brianvan ( 42539 ) on Saturday April 14, 2001 @10:00AM (#291598)
    1. Internet Explorer

    Open one IE window on a PII-233 with 64 MB of RAM, and you're okay. Now try that with 20 windows. One of the great advantages of having >200 MB of RAM is that you can open every application on your computer 5 times and not thrash your hard drive into oblivion. Same thing with Word, Excel, Winamp, or any other program where you might have more than 3 windows open for no good reason. Speaking of hard drives...

    2. CD burning + broadband

    MP3 files aren't enough to demand special investments in hard drive technology. But once you get a CD burner, you now need a faster hard drive so you don't burn as many coasters. (I mean, you don't want one turtle C: drive for everything - moving the mouse will screw up the burn.) Then you run out of things to throw on CD... but then you get a cable modem, and you're downloading VCDs and giant MP3 collections. And that's when you need BIGGER as well as faster. At that point, if you're serious, you're looking at IDE RAID or big honkin' SCSI drives. (I went somewhere in the middle and got a medium-sized, blazing fast SCSI and a large, fast IDE drive... but I still have space problems, hehehe)

    3. Video capture

    This is where things start to get out of hand. Everyone has looked at an ATI All-in-Wonder at some point and thought, "*Sigh* if only it were GREAT at 3D and didn't have crap drivers." Well, I went that route anyway (I didn't have a decent video card at the time), and I haven't played too many games in a year and a half as a result... but it was worth it. Now I can watch TV on the computer and watch the computer on the TV, record TV programs onto the hard drive, get some decent performance in 3D games, etc. The only problem is, it's not good enough... it does a lot of things well, but overall it's not so much impressive as convenient. The newer Radeon AIW does make me salivate, though... but it's not a GeForce2 in 3D, it's not a TiVo for recording programs, and it's not a professional video capture/compression setup for making movies and stuff. It's just decent, that's all. But once you lose sanity and go for the gold, you can REALLY rack up some big price tags. Once you have the taste in your mouth, it's hard not to be hungry. To have a GeForce3, a TiVo, a professional TV tuner/video capture card, a Pentium III for the processing requirements ('cause I like 720x540 realtime MPEG2 encoding), and a nice set of hard drives to hold all the movies (yet another reason to pick up an IDE RAID or SCSI hard drive habit)... well, that's a LOT of money. A lot more than the $100 I paid for the AIW on eBay. Granted, you may have no need for most of this... but the TiVo and the GeForce3 are expensive enough.

    Maybe the next boundaries to push aren't in software functionality, but in software/hardware convenience. Running bloated code is one thing; running many bloated programs at once and getting them to cooperate is another story. And I hate to say it, but right now we WASTE so many computer resources on absolutely nothing. I don't run SETI or RC5 'cause I have no interest in it, so my processor sits idle and unentertaining. My DSL line also remains underused, even when Napster is going full throttle - there's plenty of low-bandwidth applications that can work alongside a file download, yet I have no compelling reason to run any of them. Other than games and video compression, there is nothing that makes my computer work hard at all... but there's not much that I'm doing instead, either.

    Which is why it's about time that we started making lots of little flashy doo-dads and convenient background applications to use all that wasted processor time. For example, I've never seen a single good alarm-clock application for a PC (a toy sketch of one follows at the end of this comment). Also, why can't I have a simple yet powerful personal organizer program that looks like a Palm Pilot and that I can bring up by clicking on an icon in the taskbar? (In other words, a program that acts like a Palm Pilot but on the screen... maybe even an emulator, perhaps.) What about a personal Internet radio station tracker? Or a TV program listings retriever and alert system? (Instead of the TiVo recording it for you, you click on an icon in the taskbar and it tells you when your favorite programs are on that day - and alerts you, ICQ-style, 5 minutes before any of them start.) How about some SERIOUSLY snazzy-looking Winamp plugins? Or flashy GUI stuff like active icons and mouse-over gradient animation highlighting? (Or how about that Aqua stuff in the new OS X?)

    A lot of this stuff would really run well on Windows AND Linux... except you won't want to close everything just to run a game that needs your entire computer... so you just buy a faster computer to run the game AND all that stuff at the same time! Old idea, new implementation - how many of us bought a new computer because games didn't run fast unless you used a boot disk or something...
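
    To show how little that alarm clock would take, here's a toy Java sketch (the one-minute delay is just a placeholder):

        // Minimal background alarm clock using only the standard library.
        import java.util.Timer;
        import java.util.TimerTask;

        public class Alarm {
            public static void main(String[] args) {
                final Timer timer = new Timer();  // non-daemon: keeps the JVM alive
                timer.schedule(new TimerTask() {
                    public void run() {
                        java.awt.Toolkit.getDefaultToolkit().beep();
                        System.out.println("Wake up!");
                        timer.cancel();  // let the JVM exit afterwards
                    }
                }, 60 * 1000);  // placeholder: ring in one minute
            }
        }

    The CPU cost of a hundred of these is nothing; the hard part is making them coexist politely.
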
  • With a P200MMX with 96MB of RAM, and a 6MB Hercules Stingray 128/3D PCI, running KDE on SuSE 6.3 ("upgraded" to a 2.2.14 kernel).

    Plenty fast for me - it does what I want, fast enough.

    However, with prices dropping so much, I decided to upgrade - I have waiting in the wings a Celeron 366 with 256MB SDRAM and an AGP TNT card (haven't received it yet from an eBay guy, but will soon). Right now the test rig is sitting next to me on top of my scanner.

    I don't know when I will get around to installing it and booting up - maybe when I get that video card. I am not sure if it is really going to matter to me. I am thinking about getting an even better CPU (I bought the Celeron only for testing - the MB was given to me by my employer, and I didn't know if it worked right or not), probably another Celeron (maybe a 600), or a PII.

    At any rate, I haven't found anything that I use that taxes the system as it is (with the P200). If I played more games, I probably would, but I haven't bought a piece of software in ages.

    Just my two cents, probably worth less...

    Worldcom [worldcom.com] - Generation Duh!
  • by xtal ( 49134 ) on Saturday April 14, 2001 @08:58AM (#291602)

    Yeah, moz and video conferencing are all well and good, but two things drive the demand for consumer (computing) electronics - games and pr0n. What will make people get the GF3s and the Athlon DDR 1.5 GHz systems will be hardcore, 3D, interactive, good-AI, 1GB-of-RAM-suckin', 1280x1024, 120fps hardcore pr0n. If I had the time and resources (I did a lot of 3D development), I'd be working on this right now, believe you me. The capabilities of a top-end system in 3 months, graphics-wise, are going to be previously unimagined in the consumer world.

    I'm not talking about Virtual-Valerie cheezy sleaze. I'm talking about a virtual chick you can interact with and, uh, experiment in lots of innovative (and probably criminal, heh) ways. People are animals, and they love their pr0n. This I've been waiting for for years, and I think the technology is there to make it happen :).

    And hey, you got bucks, I got a sick mind and OpenGL sk1llz :)

  • This guy is making his judgements by looking at the back of the box and seeing what the app's minimum requirements are.

    Now, if you haven't played a computer game in the last 5 years, let me tell you that requirements are a joke. Sure it will *execute*, the code will "run". Hell, even the recommended system will be sluggish at times for a lot of current games.

    The motive for game publishers to push the specs as low as possible is simple: the higher the MHz number on the box, the fewer people will buy it.

    Now, there may be a point to this article; I can't think of anything at the moment that requires a GHz chip, apart from my data analysis. Game developers are scared to target a GHz platform because they know sales will be pitiful (and note that a game targeted at 1 GHz will probably have a minimum speed of 500-600 on the box, so even then this fellow's "in depth" research would turn up nothing).

  • Your PC may be fine, but you are probably going to spend the $2000 every three years nevertheless, only this time on other sorts of hardware:

    - Wireless LAN basestations
    - You'll swap your PC for a faster laptop
    - Multitude of wireless devices (PDAs, WebPads, internet-enabled wristwatches, 3G/4G mobile phones, etc., etc.)
    - Faster modems to connect to the internet (paid per month)

    In essence, the killer app is not some software product; it's the wireless internet.

    Microsoft has long recognized this; that's why you see .NET, HailStorm, and Windows CE, and why Microsoft is leaving Intel (e.g., no support for HomeRF, nor for USB in Windows XP). Microsoft now would rather go to bed with Sony, Philips, Panasonic, Nokia, etc.

    yo
    loz
  • ... for me anyway. Notice that the price of anything high quality is still in the thousands of dollars, for cameras (not little webcams of course), laser printers, good displays... When that stuff becomes cheap, then I'll buy.

    Another idea is to really get going with the home appliance concept, i.e. communicating with your stove to make sure it's off when you're not there, checking your security system, etc. I don't know what's taking so long with these ideas; what's so complicated about tying an on/off switch into a computer?
  • by jesser ( 77961 ) on Saturday April 14, 2001 @10:48AM (#291627) Homepage Journal
    Intel has a department devoted to finding ways to use more CPU time.

    Cisco does something similar. They have a team whose sole purpose is to create applications that use lots of bandwidth in order to increase demand for bandwidth. Any successful application created by this team is spun off as a separate company. I doubt that they set up their software to waste a lot of bandwidth, although they might not spend as much time on optimization as other companies.
  • A professor of mine was working at Bell Labs as they were researching and developing video phones. The class was having a discussion on technologies that were more advanced, more useful, or just "better" than existing technologies but didn't take off.

    He said that in the studies they conducted (usability studies), it took people an average of 20 rings to answer the video phone, as compared to the 2 rings to pick up a normal phone.

    Moller
  • by moller ( 82888 ) on Saturday April 14, 2001 @02:01PM (#291632) Homepage
    What would video conferencing replace? The telephone. Does the telephone need to be replaced? No.

    Think about it, what percentage of the masses wants to have to look presentable when speaking on the telephone? A video phone completely destroys the anonymity of your appearance. The person you're talking to can see what you look like, what you're wearing, your facial expression, it adds a whole new dimension to communication that people don't want.

    A study was done on how long it takes people to pick up the phone vs how long it takes them to pick up the video phone. The average number of rings before someone answered the phone was 2 rings. The average number of rings before someone answered the video phone was 20 rings. Think about it, 20 rings. Where do you think all that extra time comes from? The person running around, smoothing their hair, straightening their clothes, checking their make up in the mirror...

    Video phones add unnecessary overhead to the communication process. There's simply no need for them.

    Moller
  • by jidar ( 83795 ) on Saturday April 14, 2001 @08:04AM (#291635)
    I think one thing that is contributing to this "problem" (heh) is content development. There isn't any point in making an engine that supports even -more- detail than you have now when creating the detail for the last 2 generations of engines was almost too much work for your 200-man content creation team. Every advancement in engine tech has historically created more work for the artists, and at this point the sheer amount of time it takes to make a decent single-player game is getting ridiculous. I don't think the art department is ready for it yet.
  • The question of "bloated code" reminds me of the posting I wrote on one of the KDE developers' lists a couple months ago. Perhaps code isn't all "bloated" nowadays; the problem many developers have is that they develop at a much greater distance from the hardware than, say, five years ago.

    Speaking of KDE: It's a great environment, and in many ways it's faster than Windows. (In others it still lacks, but most of the criticism it receives is pure bullshit.) Anyway, if I remember correctly, I was using a much more powerful GUI in 1994, on a P60 with 16 MB RAM (instead of 192 now), which was just as smooth to use and fast as hell. (I'm talking about OS/2's Workplace Shell.)

    The question is: Where has all that performance gone? Why can't you comfortably use Windows ME, or 2000, or Linux with KDE 2.1 on a P60 with 16 MB RAM? What are CPU cycles doing nowadays that they didn't need to do five years ago, although most apps had almost the same features? That is, while you are not watching your daily DivX porn ;)

    Some probable answers come to my mind.

    Unicode. Double every string in length, double the memory requirements of application resources. This makes for great internationalization, but it requires memory. (A small illustration follows at the end of this comment.)

    OOP. I heard on a PHP mailing list that when you develop PHP3 apps without objects, just flat procedures, you can gain up to 30% in raw performance. (This has greatly improved in PHP4, IIRC.) I don't know how representative this is, but I suspect that in languages like C++ and Smalltalk (and Perl :-) some CPU cycles are needed to take care of all those relationships, overloaded items and whatnot.

    Standards. The good thing about standards is that there are so many of them. Nowadays a browser (the prime example) needs not only to render a little plain text with different fonts and one or two images; it needs to know XHTML, XML, JavaScript, ECMAScript, Java, CSS, and to cope with thousands of objects and plug-ins that mess up the system, and so on. Sure, these are features - but are they innovations? I don't think so, and I don't think most other apps received as much "feature bloat" as browsers did in the last couple years.

    What do you think? Why does a word processor need 128MB nowadays when it doesn't _really_ have more features than what was available in 1994?

    (Having said that, I have a K6-2 350 for my primary machine and I don't plan on buying a new one this year. It does what I want, it does it fast enough, and if I need more CPU power I can always ssh to our university cluster. ;)
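
    Here's a small Java illustration of the Unicode point (Java's char is a 16-bit UTF-16 code unit, so even plain ASCII text costs double):

        // ASCII text held in a Unicode representation costs two bytes per char.
        public class StringCost {
            public static void main(String[] args) {
                char[] text = "hello world".toCharArray();  // pure ASCII content
                System.out.println("characters:      " + text.length);      // 11
                System.out.println("bytes as ASCII:  " + text.length);      // 11, one byte each
                System.out.println("bytes as UTF-16: " + text.length * 2);  // 22, two bytes each
            }
        }

    Harmless for one string; noticeable when every menu, label, and document in the system doubles.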

  • Flight Simulations

    A good flight sim takes all the CPU, memory, and video performance you can throw at it. The higher the resolution, the more the detail, the greater the color depth, and the better the anti-aliasing, the more immersive and realistic the simulation will be. And it all has to run at or above around 20-30 frames per second, otherwise it's disorienting and the realism is ruined.

    Flight sims will likely ALWAYS push the hardware envelope. The killer hardware for flight sims will be high-res, 3D-capable VR goggles that can smoothly pan the view, but that is years away at the current pace of things.

    In the meantime, try running the latest Microsoft Flight Simulator with all the graphic details turned up and at high resolution. Then get on the web and order your next upgrade.
  • by blakestah ( 91866 ) <blakestah@gmail.com> on Saturday April 14, 2001 @08:10AM (#291641) Homepage
    Gates Law: the speed of software will halve every 18 months.
  • I run my project studio (audio production) on a dual PII-300 with 512 megs of RAM (Win2K)... Although secretly I dream of a dual Athlon 1.5 GHz machine, this one has worked great since I bought it in early 1998 :) dan
  • Good point, man! Look at LucasArts: their PC games always shoot VERY low on the spectrum. In 1999, when Grim Fandango came out, the requirement was a Pentium 133! In 2001, Curse of Monkey Island required a Pentium 200 and any 3D accelerator! [although it sucked :)]

  • We can use extra speed to bring value to people, rather than make software "bloated"?

    One thing we might do is write software which has checking to make sure certain kinds of common errors can't happen, and the programs don't crash. Specifically, we could do bounds checking on arrays (getting rid of some high percentage of security holes, and potentially catching memory-corruption heisenbugs earlier) and automatic memory management (getting rid of memory leaks). A tiny demonstration follows below.

    I'm not saying that we necessarily need to pay the efficiency price to do this kind of thing, but if the speed is there, why not?
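
    For what it's worth, these two nets are exactly what the JVM already provides, so they're easy to demonstrate (toy Java):

        // Bounds checking and garbage collection, the two nets suggested above.
        public class SafetyDemo {
            public static void main(String[] args) {
                int[] buf = new int[8];
                try {
                    buf[8] = 42;  // one past the end: caught as an exception,
                } catch (ArrayIndexOutOfBoundsException e) {
                    System.out.println("caught: " + e);  // not silent memory corruption
                }
                for (int i = 0; i < 1000000; i++) {
                    byte[] scratch = new byte[1024];  // never freed by hand;
                    scratch[0] = 1;                   // the collector reclaims it all
                }
                System.out.println("a gigabyte allocated, nothing leaked");
            }
        }
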
  • Until recently, I could never afford anything close to a recent machine. And when I replaced my computer's guts recently, I only sprang for an Athlon 750 to save a few hundred bucks.

    I never got to play Doom, Quake, etc., because my computer was always far from state of the art. When the Pentiums started shipping, we finally got a 486 (had a 286 when the 386 launched, a 386 when the 486 launched, etc). As a result, I couldn't play the recent games and always craved new hardware.

    Now, I could play most of my games on the K6-3 450 that I finally replaced; the main reason for the new system is Office 2000, and even that ran reasonably.

    At my office, we have the iPAQ computers from Compaq with Celeron 466 processors, and everyone is happy with them.

    There isn't anything pulling us to faster machines. Unlike the author, I don't mean bloated code; there are no exciting new applications. IE 5.5 will run on a 486 w/ Win95; it doesn't tax anything near current.

    Part of this is Intel's fault. By making the PII/PIII simply rehashed Pentium Pros, we're still sitting on what, six-year-old technology (ignoring the P4 - a currently useless system). The difference in performance between a P3-1GHz and a PPro-200 is less than the fivefold the clock speeds would indicate; it's maybe 2.5-3 times faster. But with the fast video cards, we can push some envelopes.

    However, I guess if the current tech is good enough, why risk lowered sales. Besides, what the author forgets is some of the economics behind this.

    When the games for the 386 were running great on 486s, developers were paying a BIG price premium for 486s to develop for the consumer 386. Now we all use the same system and there isn't a BIG premium on CPUs...
  • by fractaltiger ( 110681 ) on Saturday April 14, 2001 @04:07PM (#291658) Journal
    Have you ever looked at how much RAM you needed to run System 7? It was about 2MB. I remember a laptop made around '92 or '94 where a RAM disk could be made big enough to hold your system folder and word processor, so you could work on a document for 7 hours of diskswap-less glory. That feature of the Mac OS will be missed, though.

    Then came OS 8, and the RAM requirements rose... 10 MB, 15 MB, OS 9: 24 MB... Just for the system! Try running something like Unreal Tournament on OS X, and you'll need nearly 256 MB -- 128 for the OS and a little more for the game.

    God, I don't know where the mainstream OS industry is heading. The article mentioned that Win2000 should run on nearly 200 MHz, but my college campus has it on Dells with 500 MHz Pentiums that are as unresponsive as a Nintendo 64 emulator running without a 3D card.

    Somehow I refuse to believe the recommended specs. Windows won't willingly say how much RAM programs take -- START \ RUN \ mem always says all my RAM belongs to the system. Ok, I know sysmonitor may tell you about RAM, but I don't trust its figures either. Try to start Windows 98, which should run fine on 16 MB of RAM, with virtual memory off. I've had to troubleshoot computers that won't load any DLLs because the kernel takes the whole RAM, and as soon as the desktop starts you get errors for every DLL and VXD possible, because 32MB != enough once people turn off VM.

    In fact, here's something to think about:
    Since when have we been able to run a system WITHOUT disk swapping? I told a friend taking an OS class the other day that OSes are to blame for our wait problems, because they have made hard drives a requirement for an ideally optional feature. Old literature for DOS used to say: "First, insert your system floppy into the drive bay. Now, push the ON button." There were no hard drives and therefore no disk swapping. And now you have swapping, 100%-necessary DLLs instead of the DOS system EXEs, and god knows how many unnecessary things get loaded at every startup.

    A friend of mine said: "Well, what if a computer scientist like you built a system based on AI [so that] programs were 10k [of plain-text human speech]? The system would be huge, right?"

    (It would need billions of library commands and much "knowledge", and it would need to compile the 10k on demand.)

    I think AI will be the next killer app. If it were only true that we are closer to figuring it out, though... At least Clippy will be an unsupported feature in the next version of Windows.
  • by The_Messenger ( 110966 ) on Saturday April 14, 2001 @12:27PM (#291660) Homepage Journal
    I've read a lot of comments in this article, and I don't agree with most of them. Many of you envision some sort of conspiracy between software and hardware vendors to sell Pentium IIIs by writing bloated code. I don't think this is the case at all.

    MS Word is often used as an example of bloatware. Yes, it is a fairly large program, but I don't hold its size against it, because it allows the non-computer-savvy to create nice-looking documents very quickly, with very little work. But MS Word is not what is pushing sales of 1GHz Pentiums. The truth is that nothing is pushing the sale of 1GHz CPUs. Intel and AMD make them, and the big OEMs sell them without question. Ever ask yourself why it's difficult to find new OEM 500MHz machines on the market today? It's because the big OEMs know that consumers expect to spend $1500-2000 on a new machine, and aren't going to dissuade them if possible.

    I'm also going to note that this hardware manufacturer/vendor conspiracy seems limited only to CPUs. Look at what Dell and IBM are trying to sell consumers, and you'll notice how incredibly unbalanced these systems are. A 1GHz CPU with a fucking IDE disk? The disk was the bottleneck 700MHz ago, and it still is... just get yourself a 500MHz CPU for $80 and spend the money you saved on SCSI-3 hardware. But, as mentioned before, you can't buy a measly 500MHz CPU from the big OEMs anymore, so balanced PCs are now only available to relative "geeks".

    My dad is VP of Engineering in a large company whose name (a household name, I might add) I won't mention, and he does all of his work, including use of MS Office 2000, on a 133MHz ThinkPad. Doesn't sound like MS Office is selling new systems to me.

    The only software industry that sells new systems is the gaming industry. Even when the next generation of games doesn't require a new video card, many of us will go buy one just to make our old games even better. My primary workstation, which I upgrade about twice per year, currently has an 800MHz Thunderbird, 512MB of 133MHz Crucial SDRAM, and an ELSA GeForce 2 Ultra. In addition to gaming, I use this box for my development work, but you can bet your ass that I didn't buy a $400 video card for writing C++ (yes, Carmack might, but I don't develop games). I bought the card because, as a gamer, playing Tribes 2 (just picked it up yesterday, actually) smoothly at 1280x1024 in 32-bit color just r0X0rs.

    Incidentally, my firewall is a 170MHz SPARCstation 5, but I'm not going to be playing TFC on that anytime soon.

    I believe that the WWW is the real "killer app", and only revolutionary Internet client and server software will really push hardware sales noticeably. (If IE5, Apache httpd, or Napster required a 1GHz CPU, hardware sales would be exponentially greater.)

    --

  • I don't understand this article at all. It starts off as an explanation of how computers are already pretty powerful for today's applications. Then he talks about how, though Doom said it would run on a 386, he was forced to upgrade. Then, applying stellar logic, he proceeds to use the minimum system requirements listed on software boxes as a basis for his first point, simply bypassing his second assertion that there exists software that lists much lower requirements than it needs. He has two completely different, paradoxical arguments in his article.

    He also demonstrates an acute lack of knowledge about his subject. So the Unreal box says "200 MHz". Has he ever tried it? Sure, it runs. However, it's like watching a slideshow. The same is true for a lot of the software he listed. Just because something runs doesn't mean it runs well. Of the things that ran well, like Diablo II and Baldur's Gate II, those are based on the previous engines, which ran fine with his current setup, and hence are not a good basis for comparison.

    Essentially, what he is trying to say is that what is now generally considered an older computer (at or under 350 MHz, pretty much) can still run current software with the same hardware. Fine. Thanks for validating that. But, taking from his own Doom-upgrade theory, the current software will not run all that well on older machines and will demand an upgrade (perhaps not to the latest stuff, but still an upgrade). Therefore, by his own logic, he has no reason to upgrade, yet he does.

    Perhaps downloading all that porn caused the logic section of his brain to malfunction.

    .agrippa.
  • by emir ( 111909 )
    This guy obviously hasn't tried playing DivX movies on a P2 350 MHz....
  • by Animats ( 122034 ) on Saturday April 14, 2001 @08:33AM (#291674) Homepage
    Intel has a department devoted to finding ways to use more CPU time. I know some people there. (One writes on his resume: "Created massive immersive 3D environments with high image quality based on real geospatial data to catalyze the demand for future, higher-performance hardware platforms.") I do physically-based animation, so they like me down there.

    Microsoft is working on the business end of the problem. They have to find ways to force businesses to upgrade to Windows 2000 and the new revenue model, and businesses are resisting strongly. Refusing to put USB into NT 4 is a key part of the strategy.

    The .NET thing has potential as a time sink. Implementing RPC via XML will be hideously inefficient. And interpreters are involved, which typically means a 10x performance loss.

    Not that Java is much better. Swing seems to need upwards of 1GHz just to display menus as fast as a 20MHz Mac of a decade ago.

    So, clearly the industry is addressing the problem.

  • He talks about looking for the killer application that will make him go out and spend the big money on a whole new system.

    Actually I have three of them:

    • Solaris 8 -- either the incredibly picky x86 version, or just buying a damn (ultra)sparc to run the sparc version
    • Oracle -- this is the real killer. According to my friend Lynn, who had the inclination to run it and the money to keep buying stuff until it was happy, you need 512 megs of RAM and up as a practical minimum. Not to mention a fast disk in the 8+ gig range that you plan on devoting solely to Oracle. And this is just for a smallish installation he's using to teach himself. God only knows how much it would want for a big one (well, a Sun E10K is a good bet; I seem to recall that was what eBay used to run their Oracle on).
    • Enterprise Java -- anything in the Java app server / servlet / J2EE category just soaks up the RAM as fast as I can throw it at the machine...

    And all of these are not flashy, consumer, game-type 'ware, the usual suspects for driving hardware upgrades. My point being that even us CLI-only, minimalist sysadmin types are going to run into this phenomenon now and again. (Although in this realm I think the scaling axis is usually more RAM / more processors as opposed to faster processors (and of course video cards aren't a factor at all); as an example, see the configuration of the pretty-damn-busy-but-still-very-responsive ccwf [utexas.edu], where my skool account is...)


    --
    News for geeks in Austin: www.geekaustin.org [geekaustin.org]
  • We already have photorealistic graphics via the Geforce 3.

    No it isn't; you are just being sensationalistic. I think 3D is just now starting to get to the point where it's really good. The PlayStation's graphics sucked ass. The N64 was bearable 3D, the Dreamcast and PS2 are practical, and I think that the Xbox and GameCube are finally into the realm of good 3D.

    Music is about as good as it can get with MP3, Vorbis, WMA, whatever.

    No it isn't. Although I am not one of those people who says "mp3 sounds like crap, my ears are 31337!" (192 kbps MP3 sounds great, I think), there is still a lot to be desired. What will happen when DVD audio comes out? You will need five channels, and that means bigger files, but more importantly more redundancy between channels to be compressed, which means more CPU cycles. Not to mention that audio codecs are still evolving.

    Maybe I am getting trolled, but this is the most sensationalistic thing I have read in a long time. It is people like you who sit on their asses while other people forge ahead because they are not content with what they have.
  • I have been thinking about this off and on for some time now (mr. anderson). Before I say anything, a Thunderbird Athlon is much faster than my 300 MHz PII for even simple stuff like obsessively reloading slashdot. That said, there might be a few plateaus along the way where not many people feel the need to upgrade.

    If you think about the trends, though, you can see that one computer for one person is not really the way things are going to go for the average home. Home networking is going to be very mainstream in 2 years, plus or minus a year. Wireless and other things will make it easy. Flat panel monitors are poised to become practical in a year or two, too. Families want everyone to have their own computer, but that's expensive. The computer they also just bought is faster than anything they need. How to solve this problem? Servers and terminals, of course. This is where .NET is heading, and is where Linux and Microsoft will eventually be fighting, I think. Instead of buying everyone a computer, dad goes out and buys a server to be placed in the basement, and everyone gets their own terminal. Anyone could see that this could lead to faster single computers and more expensive ones too (good for the hardware industry: higher margins). Obviously this won't happen for a while, but home networking will hit soon enough. (I am proud to say my home was networked when I was in 7th grade, and I am now 19.)
  • by donglekey ( 124433 ) on Saturday April 14, 2001 @08:02AM (#291690) Homepage
    I have said this a few times before, but I think that one killer app for the masses is video conferencing. I have video conferenced with my friends in LA and it is a lot of fun. You might say "video conferencing just requires more bandwidth." Of course that's true to an extent, but the codecs used in voice and video are made so that a computer can compress them quickly. MPEG-4 is very slow to compress and is not near real time on even a top computer. MP3 is starting to become easily compressed in real time, although I don't know about the second generation of good lossy codecs like Vorbis, WMA (gasp!) and whatever Fraunhofer is planning to cram up America's ass when they get their shit together and release their new codec. MPEG-4 looks nice, and with something low-movement like video conferencing video, plus optimizations like silence cut-offs, video conferencing should be a given for people with high-end systems and high bandwidth, either at home or at work. Maybe MPEG-4 isn't the way to go immediately, but you get the point. That and maybe Doom3 when it's released.
  • I know there is a compiler online for the nes and the snes

    There are common assemblers for the NES and SNES. The NES's 2A03 is a 6502 core (same architecture as the Apple II and C=64) with an on-die sound generator. The SNES's 65816 is nearly the same as the Apple IIGS's. Neither is C-friendly. The 32-bit 68000 in the Sega Genesis, on the other hand, has a GCC port.

    but their cart based so you couldn't just trade them

    It's relatively easy to make an EEPROM cartridge for the NES; start here [www.sfu.ca]. Edit, compile, emulate, edit, compile, emulate, ... burn onto EEPROM, test for bugs tripped up by emulator inaccuracies. Just make sure you never use NESticle [everything2.com] for testing.

    It would be nice if they did opensource their development tools.

    Standard "why don't they just free the software" response: For one thing, they might have licensed technology and not licensed the right to sub-license it to the community. (This may be much of why NVIDIA hasn't freed the drivers for its video cards.)

    For another thing, game companies sell software. They don't want competition from software designed to run on their older consoles. This is why Nintendo is going after not only ROMs but also emulators, even when such emulators are used to develop free software [pineight.com] for old consoles.

    Also, there are trademarks and copyrights on the games' content itself. If you have a devkit, you can rip graphics from Mario, Zelda, and Pokemon and use them in your own games.

    the great thing about consoles is that the programmers can't just throw in a little extra and say "Oh, they'll upgrade".

    But that's exactly what Nintendo did for the Super NES. The programming model for the Super NES CPU and picture generator wasn't that much different from the NES's. Even though the sound was radically different (the NES had 20 registers in CPU address space; the Super NES had a mini-DSP on a separate processor with an extremely obscure instruction set), most game publishers just used Nintendo's sound driver from Super Mario World (it was provided with the dev kits). In fact, backwards compatibility with NES games was planned but later dropped.

    NESdev, the center of the NES scene [parodius.com]
  • Mpeg4 is very slow to compress and is not near real time in even a top computer.

    I'm using DivX ;-) with my USB video capture box, and my P3-900 compresses captured 320x240x15fps video in real time just fine.

  • Some audiophiles prefer DTS encoded surround to Dolby Digital because it isn't as compressed.

    DTS is not as compressed bitstream-wise as Dolby Digital, but it is more compressed sound-wise. When you send your audio off to the DTS people, they compress (remove dynamics from) the heck out of it, removing the punch. They attempt to add it back by turning up the bass really loud (Joe Sixpack thinks loud bass == good sound).

  • One thing we might do is write software that checks to make sure certain kinds of common errors can't happen and the programs don't crash. Specifically, we could do bounds checking on arrays (getting rid of some high percentage of security holes, and potentially catching memory-corruption heisenbugs earlier) and automatic memory management (getting rid of memory leaks).

    Answer: Java
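
    To make that concrete, here's a minimal, hypothetical sketch of what those two guarantees look like in Java (the class name and sizes are mine, purely for illustration): an out-of-bounds write raises an exception instead of silently corrupting memory, and there is no free() to forget.

        // Hypothetical demo, not from the parent post.
        public class SafetyDemo {
            public static void main(String[] args) {
                int[] buf = new int[4];
                try {
                    buf[9] = 42;  // every array access is bounds-checked
                } catch (ArrayIndexOutOfBoundsException e) {
                    System.out.println("caught the overrun instead of corrupting memory");
                }
                byte[] big = new byte[1 << 20];  // no matching free() exists;
                big = null;                      // the garbage collector reclaims it
            }
        }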

  • OOP gets the rap for bloated code all the time.
    But take a look at what goes into OOP:
    1. Overloading. Typically this is handled by the compiler through name decoration. No performance impact, no binary bloat.
    2. Encapsulation. This is just plain ol' typedef'ed structures with some additional rules for visibility and access. Most of this is handled at compile time, not runtime.
    3. Polymorphism. This is usually handled by a virtual-table mechanism: fetch the object's vtable pointer, fetch the function pointer at a fixed offset, then make an indirect call (plus the stack correction on the return).

    Polymorphism is the key element that can suck your cycles, but just as with any other tool the performance impact can be mitigated by profiling and using language features carefully. Unfortunately many developers do neither, and some dev environments/frameworks (MFC uggghhhh) actively work against efficient use of polymorphism.

    Polymorphism and the use of interfaces can so radically improve the structure and quality of code that many shops make the tradeoff knowingly. I've done it both ways myself and there's pain on both sides of the fence.
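
    To put a face on the interface point, here's a hedged little Java sketch of my own (all the names are invented): the caller pays one dynamic dispatch per call, the cost described above, but it never has to know which implementation it got, which is exactly the structural win.

        // Hypothetical illustration, not anyone's production code.
        interface Renderer {
            void draw(String s);
        }

        class PlainRenderer implements Renderer {
            public void draw(String s) { System.out.println(s); }
        }

        class ShoutingRenderer implements Renderer {
            public void draw(String s) { System.out.println(s.toUpperCase() + "!"); }
        }

        public class DispatchDemo {
            public static void main(String[] args) {
                Renderer[] rs = { new PlainRenderer(), new ShoutingRenderer() };
                for (int i = 0; i < rs.length; i++) {
                    rs[i].draw("hello");  // one virtual dispatch per call
                }
            }
        }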

  • Back in the days shortly before Apple moved to the PPC processor, there was a company making a lightweight word processor. I purchased a copy, though I no longer use my old Mac for much and I don't recall their name. In any case, it was a pretty darn good program, and they ran a series of ads in Mac magazines touting its low resource requirements and fast speed compared to all the other word processors. Indeed it was fast (my Mac is a 16MHz '030 chip).

    But in the end, it didn't gain much ground and ultimately it disappeared from the market within a year or two. Word 5.0 held the Mac market. Clearly, what the market considered important wasn't low resource usage and good performance on older hardware.

  • That's right, 15 minutes to compile a circuit diagram into an FPGA chip, using an 800MHz Pentium III (512 megs of RAM, 10k RPM SCSI drive, other high-end hardware). The design uses an XCS10XL chip, which is among the smallest devices they make today. It actually takes only 2 minutes to compile the circuit if the timing constraints aren't used in the placement and routing, but how useful is that?

    The design in question is a custom DRAM controller, DMA controller, IDE interface, and MP3 serial bitstreaming output (DMA based), in my little homebrew mp3 player project [pjrc.com].

    Ok, not exactly a killer app, running FPGA placement and routing, but that 1.5 GHz Pentium 4 can't come soon enough! I can't imagine how anybody ever manages to design with those really large FPGA chips!!

  • What's wrong with having the coolest games on the planet run on computers that only cost $500-700? I honestly have *no* problem with that at all! It means that you don't have to be making $50K+ just to afford such a toy. Even more interesting is that the lack of demand for the higher end machines means that you can get a bitchin' top-of-the-line computer for only about $1700 (that's Canadian too...), as opposed to about $5000 a few years ago. Hell, lately I've been having a hard time finding a generic-brand computer that costs more than $2000.

    Basically, what this means is that now we're giving processing power to the people! The lowliest gas jockey making minimum wage can play starcraft or use the web like the rest of us! Schools can actually buy *new* computers! A computer on every desk in every office and every home! This is good for all of us!
    ---
  • by kfg ( 145172 ) on Saturday April 14, 2001 @08:45AM (#291713)
    Your Chevy only goes 100.

    Why is that?

    A device only has to be powerful and fast enough to do the job you expect of it. Period. The 8080 is STILL a very fine chip for certain applications. Hell, I've got a drawer full of 555 chips that suffice for many computational purposes much BETTER than even an 8080 would.

    As for the next "killer app" I've seen this one coming for a long time, and have even posted here a number of times. Just what else do you expect a computer to DO?

    It's a TV, stereo, recording console, data center, video game platform, web server, radio, external hardware controller, clock, printing press, fabber, and even *a computational device.*

    The time will come when all the "killer apps" have been thought of and implemented. That time appears to be pretty close to NOW.

    Sure, better, faster, cheaper computers will make some of these apps bigger, better, faster and more, but the computer as she is today DOES things, and does them well.

    SOFTWARE, on the other hand, has turned into badly written, buggy, overpriced crap.

    Perhaps the next "killer app" is customer satisfaction?

    KFG
  • This is why I think document formats and the like should be open - by law

    I'll agree that having open document formats would level the playing field a LOT. Doing this by force of law is just begging for trouble, though. One of the problems we have now is the law mucking up copyright to the point of giving a nearly unlimited timespan to what should be a temporary monopoly. We do NOT want to bring the law into this.

    Also, let's just imagine this was a bill before the US Congress. Who do you think is going to have more lobbying power? A bunch of open source zealots with a few $5/share companies or a group of economy movers like Microsoft? Sorry if I'm coming down hard here, but even the suggestion that a governmental body dictate how you or a company should license your software is a dangerous wish.

    Lastly, we don't need the government here. MS is already making moves towards XML-based docs. Even if they weren't, alternatives are finally popping up and maturing. Better products, customer-friendly licenses, and an industry push towards interoperability are what's needed. The world is moving towards that now, with or without the help of either Microsoft or the muddling hands of government. Be patient, and thank whatever deity you pray to that our hands aren't tied by regulation.
  • Heh. I have a two year old PIII-550 which I run Windows 2000 on and really have no complaints.

    Try that on an AMD K6-450. Oh, and don't give it more than 128meg of RAM either. While you're still booting I'll have already started KDE 2.1 and be at the desktop. FreeBSD baby, yeah!
  • The disk was the bottleneck 700MHz ago, and it is now... just get yourself a 500MHz CPU for $80 and spend the money you saved on SCSI-3 hardware.

    I have to totally disagree with this (although I agree with everything else you said). Even the tasks performed by a typical "HARDCORE" user don't require SCSI... nor would they even benefit from it, considering how fast IDE drives are now. SCSI only helps when you have many users thrashing the disk at once, or if you absolutely need more throughput than the 45MB/sec or so of real-world performance that modern IDE drives give you.

    Essentially, unless you're running a server (many users) or doing work with digital video (which needs extreme, uninterrupted bandwidth), SCSI is a very expensive and not-too-useful luxury. You'd be better off spending all the extra money on RAM. :)


    http://www.bootyproject.org [bootyproject.org]
  • "not would they even benefit from them considering how fast IDE drives are now."

    Well... I should say, they wouldn't "benefit noticeably". I mean, yeah... sure, your favorite bloated office suite will load a teeny bit faster with SCSI. And your swap file will be faster, too. But if you spend the money on RAM instead, you'll hardly use the swap file anyway. :)


    http://www.bootyproject.org [bootyproject.org]
  • He will, however, notice the difference when the Word is first loading, and this is when most users will experience frustration

    I don't even know about that... even the "evil and bloated" MS Word loads in less than a second on my 800MHz Athlon. I mean... winword.exe is an 8MB file. With around 40MB/sec of throughput on an IDE drive, that's about a fifth of a second of raw reading, and I don't think SCSI would show much improvement.

    OK, OK, before someone nitpicks, I'm sure that 8MB winword.exe loads plenty of other shared libraries, too... so total disk i/o is probably more than 8MB... but the whole thing still loads in less than a second. :)

    http://www.bootyproject.org [bootyproject.org]
  • by connorbd ( 151811 ) on Saturday April 14, 2001 @09:12AM (#291720) Homepage
    The big problem is that code always seems to be written for the latest and greatest hardware. MacOS X, for example. I don't mind Aqua. But I could live without translucent dragging and some of the Dock's behavior -- why not an "Aqua Lite" that looks just as pretty but doesn't eat up as much processor power?

    What annoys me more than anything else is that there is absolutely no need for an operating system distribution or a basic office application to soak up massive amounts of system resources. I should still be able to get a copy of MSOffice that will at least run on a first-generation PowerMac (no reason on earth they can't dig up an old copy of CodeWarrior and keep it running on a 68K, for that matter). A simple *text editor* should not need that much space (sorry, Emacs junkies, but I'm a pico man myself).

    Now we have GHz+ processors on the market... well, I have a quarter-gigahertz Power Mac 6500. Boot ROM issues aside, is a 250MHz 603e all that wimpy a processor? Damn straight it isn't. 32MB of RAM is a nontrivial amount of memory as well, yet MacOS 9.1's performance can be charitably described as flaky on my hardware. There is no excuse for this, not when I can run a medium-sized production webserver on a Pentium 100 or less using a stripped-down Linux or BSD system.

    Okay, I personally do not need a spel czecher. A lot of people do; that's arguably a necessary feature. Mail merge, pretty useful as well. HTML filter, helpful (though I handcraft my HTML so I only rarely need it). But why do I need a fruit salad interface? Why do I need a word processor with anything more complicated than a ruler and justification controls across the top of the window? What purpose does a spreadsheet with more than four dimensions serve?

    I like GUIs. That's me; I guess I'm in a minority around here saying that, and that's fine. But I don't need the flash of rippling scroll bars; believe it or not, I find Athena widgets to be rather elegant sometimes (although the scroll bars leave a lot to be desired). Skinning is not a terribly useful thing, though it's nice to have the option; I was a serious Kaleidoscope junkie for a couple of years. But what excuse is there for Mozilla? Oh, we have bigger computers now...

    I HAVE NEWS FOR ALL OF YOU WRITING THE SOFTWARE.

    Some of us can't afford new hardware. I am unacceptably behind in both Mac and Linux expertise because I can't afford hardware newer than a couple of years old (and therefore can't afford a G3 or an Athlon). People are still using Pentiums. People are still using PPC601s. People are still using 486s, fer cryin out loud. Pretty soon the software march will have to slow down because people don't want to be bothered with keeping up with the Moores.

    Okay, that's my rant. I feel better now.
  • How many twelve-year-olds purchase the original Pokemon game these days? Almost none - the ones who would buy it already have it. People don't like to buy things they already have, but they like to buy things. If something inspires envy in others, people like to buy it. You can make your fellow geeks envious if you can afford to play Solitaire on a quad-1G machine.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • The simple fact that BonzoESC is foolish and clicks on the wrong Reply link makes it all flow together.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • by bonzoesc ( 155812 ) on Saturday April 14, 2001 @07:56AM (#291724) Homepage
    We can make do with what we already have. I really don't need new hardware -- my 1-year-old machine runs 3DS, Premiere, gcc, and TFC just fine. The hardware manufacturers *could* write bloated code, but if one manufacturer did what was right instead of bloat, they could put everybody else out of business. Instead of communally trying to shaft consumers, maybe businesses should strive for excellence. It's more profitable in the end, and the real world wouldn't go Atlas Shrugged as a result of it.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • I video conference all the time with someone who has a slow computer. I am a freshman in college and my girlfriend is a senior in high school who lives 2.5 hours away. I got us both webcams for Christmas ($30 Logitech QuickCams), which give OK image quality. But her computer is only like a Pentium 400. And our NetMeeting video conferences look great. Good enough that with the video stuff we can make 'phone sex' look plain silly. :)

    So you are right it is the next killer app, but you are wrong about requiring a huge amount of bandwidth.
  • The phrase "killer app" gets tossed around too lightly. Killer apps? Try spreadsheets. Try desktop publishing. Try e-mail. Those are revolutions.

    The PC market will just have to get used to growth margins that are the same as every other business.

    BOO HOO, my heart pumps purple piss for them.

    --Perianwyr Stormcrow
  • Maybe it's because Windows users know that the only person that wants to see videos of you is you.

    Bzzt, thanks for playing.

    Actually, I'm not getting it to do any videos of me ... the first project is a DVD version of the Seattle Aerobattle paragliding video. [epicsessions.com] Presumably I'll do a good enough job of that so further work just falls into my lap :)

    And moderators on crack can call me a troll all they want, but for doing that the apps I mentioned really ARE killer. Bring it on, bitches, my karma can take it :)

    And anyways, why should you upgrade?

    I still have a G3. The G4 Velocity Engine kicks collective industry ass for video-optimized work like DVD encoding, and the DVD-R is only available internally. I'm just waiting for the twin-733 model that's coming out at WWDC in a month ... it will be *quite* the video encoding workstation, I confidently expect, as all the relevant codecs and applications are MP-aware and AltiVec-optimized. Which means they should wipe the floor with dedicated workstations costing an order of magnitude more, never mind other mere PCs.
  • Speaking as someone who ran both NTS 4.0 and W2KS on a P-133/176MB/SCSI-2 machine, I can attest that Windows 2000 really is no slower for interactive use than NT4 + ActiveDesktop add-on, and both setups are certainly usable for the normal MSOffice/web/mail type tasks.

    Intel got Microsoft to put a ridiculous minimum CPU spec of 350MHz on the side of the W2K box. They then went out in public and moaned about all the internal Pentium Pro workstations they had to upgrade. But the MIS public didn't buy their shit, because secretly they wished they had 1000s of PPros running W2K instead of the low-end Pentium crap that was there, and no mass upgrade occurred.

    Now Microsoft was getting into some deep trouble here and decided to steal some ideas from the Open Source community. The most effective project at stealing CPU cycles to date has been Mozilla.

    Microsoft thought about it and said, "Ah ha! Screw the browser window -- what if we implement the entire Windows shell in DHTML, XML, and JavaScript? We've already got this 'IE shell integration'; time to crank up the flashing doo-dads and start using it. That ought to steal enough CPU cycles and make Intel happy." (Unfortunately, the IE rendering engine team had done their job a little too well, so a new major version was in order.) And lo, Windows XP was born.
  • As I mentioned, I'm aware of the difference between Flash and DHTML. However, the place I see Flash used most effectively in baseline web design is as navigation widgets (such as flyout menus, mouse-overs, etc.), which really should be done in DHTML if it were practical (meaning you wouldn't have to code it 2-3 times).

    Obviously, if you are using Flash as an animation or movie player, there's no current alternative. In the future, there will be W3C standards that do vector and time-based rendering. But not yet.
  • Absolutely. When I'm at work, I have a PII-400 idling away on the end of my DSL line. I would love to run an FTP daemon to make getting files more convenient, but frankly I haven't got the gumption to set my pager to BugTraq and chase patches all day.

    Give me a Java/Smalltalk/whatever FTP server, and I'll happily run it. I don't care if it's 200 times slower than pure C -- for 1 concurrent user on that box, it won't matter. Sure, there might be design flaws to worry about, but at least I won't have to deal with some hack's good idea of saving programmer time by exploding administrator time exponentially.
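
    For flavor, here's roughly what the memory-safe version of that daemon could look like in Java -- a hedged sketch of my own, not a real FTP implementation (the port, directory, and one-line protocol are all invented). The point is that a buffer overrun here becomes an exception, not a root shell.

        import java.io.*;
        import java.net.*;

        // Hypothetical single-user file server: one connection at a time,
        // read-only, one directory. Nothing like the real FTP protocol.
        public class TinyFileServer {
            public static void main(String[] args) throws IOException {
                File root = new File("/var/ftp/pub");
                ServerSocket server = new ServerSocket(8021);
                while (true) {
                    Socket s = server.accept();  // 1 concurrent user is fine
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(s.getInputStream()));
                    OutputStream out = s.getOutputStream();
                    String name = in.readLine();  // client sends a filename
                    if (name != null) {
                        File f = new File(root, name);
                        // canonical-path check keeps "../" requests inside root
                        if (f.getCanonicalPath().startsWith(root.getCanonicalPath())
                                && f.isFile()) {
                            FileInputStream fis = new FileInputStream(f);
                            byte[] buf = new byte[4096];
                            int n;
                            while ((n = fis.read(buf)) > 0) out.write(buf, 0, n);
                            fis.close();
                        }
                    }
                    s.close();
                }
            }
        }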
  • by MrBogus ( 173033 ) on Saturday April 14, 2001 @09:09AM (#291746)
    Face it man, if you have to say "Don't do that", YHL. The entire point of Mozilla is to be a platform for the standards-based dynamic web of the future. There's absolutely no point in treating it like Netscape 3, because you can still download Netscape 3 and use it if you want to.

    Tons of people here hate Flash. Well, the only reason Flash exists is because (ahem) Netscape refused to work with the W3C on standards-based DHTML. So people chose a proprietary solution because at least it works in every supported browser. (And I am 100% aware that Flash and DHTML are not the exact same thing.) If and when the browser features converge, Flash and most uses of Java as a doo-dad generator will go away.

    And it's easy to point at lame sites with a Flash splashscreen that you can't get past without having the plug-in installed. That doesn't mean that Flash can't be used very effectively for blinkenlights or navigation on web-pages. Face it, the average American luser is on the WWW, and he wants the web to be as flashy as possible. Enjoy your HTML2 Linux HOWTO sites.
  • Object-oriented programming lends itself to bloated code. You take an object that has most of the functionality you need, declare a child, and add the last of the functionality you need. This methodology cuts development time and helps inexperienced programmers write application-level programs. The result: horribly bloated code.

    A simple program that performed an important task created a 190K executable when I wrote it in Borland C++ Builder. Written in Borland Pascal 7, it was a 30K executable. Written in assembly language (yes, I still can), it was an 800-byte executable. That's right: 190K versus .8K. Microsoft's compilers are worse. The functional overhead of loading and running these routines requires CPU cycles, lots of them. I remember when WordStar ran just fine on a 2MHz 8080 with 20K of memory, and that included room for the program, the CP/M operating system, and data.
  • Exactly. All my friends talk about how great their new >1GHz computers are and I just shrug. My year-old 700 works perfectly fine in Linux. I can play Quake3 at very close to the framerate I enjoy in Windows, and every other time-wasting desktop thing runs as well as, if not better than, in Windows. Why would I put up with M$'s bloat if I can do better in Linux? My typical desktop session consists of a lot of Netscape windows, XMMS, Gabber, Gnapster, and various other applications in X4, plus I run Apache, MySQL, and SSH in the background, and my CPU monitor is telling me it's 97% idle.

    The truth is that if you're not a hardcore gamer, you don't need the latest-greatest-nifty processor, simply because it's overkill. Most users are content to have a web browser open and some MP3s playing, and you can do that with a computer that's much "worse" than the newer computers.

    -antipop
  • "However, I don't want to get to the stage where I'll have to buy the Intel version of a chipset in order to play my favourite game." I guess you missed out on the 2-3 year span in which AMD went from sharing the loser-CPU playing field with Cyrix to the point where it matched (and slightly overtook) Intel at every turn. Those of us that owned AMD CPU's back then took nothing but years of crap from friends who were somehow brainwashed into believing that AMD was a shoddy piece of junk and not the quality COST-EFFECTIVE CPU that is really is...

    Man, I used to hate those debates...

  • Bloated code will only go so far. There is only so much eye candy to do, and after that, not many people are into multimedia. Let's face it: all of the people who use Napster, for example, are *not* a large segment of the market. In the US the core market of people 18 to 45 may be about 100 million. Granted, not all of them have a computer, and not all of them are into Napster, or editing videos on the desktop, etc. A large portion of them simply use the computer for email, word processing, solitaire, and maybe some kids' educational games. The computers are basically there for most folks; they do what folks want, and there is no reason to upgrade. They have no urge to be cool, and they have no need for the extra CPU cycles. The tools they have work well enough.

    As time goes on, that is what is going to happen: things will be "Good Enough" to do the job, so why convert over? Why spend the bucks? As noted in the article:

    My trip to CompUSA makes me think that the people who screw the boxes together should be especially worried about the business customer. I for one spent most of the winter with a garage full of top-of-the-line computers from a failed dotcom I helped found last year. In fact, I think the dotcom bubble has given the entire PC industry a false sense of security over the past couple of years with a lot of fantasy money purchasing some not-so-fantasy hardware. The jig is definitely up and unless Microsoft comes out with a version of Word that can read your mind I don't see many companies going through the trauma of a hardware upgrade anytime soon.

    The XP machines may want to cash in on this and be the only computer people will ever need, because things will be good enough, along with the .NET thingy. But ultimately, that becomes another nail in the PC coffin. Which is probably why MS ultimately wants out of the PC-oriented market.

    Check out the Vinny the Vampire [eplugz.com] comic strip

  • by GigsVT ( 208848 ) on Saturday April 14, 2001 @10:13AM (#291766) Journal
    You think that's bad, just think when they release it in color!
    -
  • Interesting, because this is exactly what Sun Microsystems does now. If you want a computer, they will sell you a computer, and you're going to pay top dollar for it. But! Once you have it, you're going to need some software for it. Need an OS? Hey, we've got one of them, here ya go, no charge. Oh? Now you need some sort of M$ Office clone? Here's StarOffice, free of charge. Oh wait, this is an e-commerce server? Well, we don't have those, but we'll pass you off to this company who will give you a big discount.

    Sun sells hardware and gives away the software necessary to make the hardware run. Kinda cool. Like buying a road to drive your new car on.

  • Linux users don't need to upgrade because they never write bloated.....ooo wait, Mozilla.

    Check that thought.


    And Linux users don't need Mozilla because they already have a decent brow... ooh, check that thought.

  • by tswinzig ( 210999 ) on Saturday April 14, 2001 @09:42AM (#291769) Journal
    The .NET thing has potential as a time sink. Implementing RPC via XML will be hideously inefficient. And interpreters are involved, which typically means a 10x performance loss.

    Not that Java is much better. Swing seems to need upwards of 1GHz just to display menus as fast as a 20MHz Mac of a decade ago.


    I realize this is supposed to be partly a joke, but the computer industry is not creating this kind of software in order to increase hardware sales -- they are creating this kind of software because faster hardware has made it possible.

    But why are they inventing so-called "inefficient" code? Because it's really EFFICIENT -- for developers. It's also easier to maintain.

    Java/XML-RPC/etc. are all software inventions that make it easier to develop sophisticated programs.

    Heck, why don't we program everything to the metal anymore? Everyone, turn in your C/C++ compilers and stick to assembly programming.

    No, I don't think so. C/C++ makes it much easier to develop more complex programs. Java makes it easier to develop cross-platform programs. XML-RPC looks to help make it easier to develop cross-platform programs that are centralized on a server and easy to upgrade/maintain.

    You are trading program inefficiency for programmer efficiency. The faster hardware gets, the more we are able to do with it.
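
    To see the size of that trade in bytes, here's a hypothetical sketch (the method name "sample.add" and the surrounding class are mine, purely for illustration) that builds the standard XML-RPC envelope for adding two integers and prints how big it is:

        // Hypothetical illustration of XML-RPC's on-the-wire overhead.
        public class XmlRpcOverhead {
            public static void main(String[] args) {
                String call =
                    "<?xml version=\"1.0\"?>\n" +
                    "<methodCall>\n" +
                    "  <methodName>sample.add</methodName>\n" +
                    "  <params>\n" +
                    "    <param><value><int>2</int></value></param>\n" +
                    "    <param><value><int>3</int></value></param>\n" +
                    "  </params>\n" +
                    "</methodCall>\n";
                // Two ints that would fit in 8 bytes of a binary protocol:
                System.out.println(call.length() + " bytes on the wire");
            }
        }

    Roughly 200 bytes to ship 8 bytes of payload -- but any language with an XML parser and a socket can speak it, which is the programmer-efficiency side of the trade.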

    Do you want "the killer app" that's going to fuel hardware sales now and beyond? What about speech recognition technology that doesn't slow your system down to a crawl? The more advanced it gets, the more CPU power it's going to need.

    Twenty years from now, if I'm still clicking on a fucking icon I will shoot myself.
  • Why did they have to change the stupid file format each time, though? Oh, wait. I know. To promote non-interoperability. To achieve and maintain monopoly.

    This is why I think document formats and the like should be open - by law, maybe. MS could then add whatever crap they wanted to Word. If it was really that great, people would buy it. Otherwise, they could use the old versions (or competing products!) with no problems. Competition would be encouraged with open document formats.

  • by snoop_chili_dog ( 314897 ) on Saturday April 14, 2001 @07:58AM (#291839)

    Linux users don't need to upgrade because they never write bloated.....ooo wait, Mozilla.

    Check that thought.

  • by janpod66 ( 323734 ) on Saturday April 14, 2001 @12:30PM (#291843)
    Projects like GNOME/GTK, KDE/Qt, and Windows/MFC go out of their way to write things in "efficient" languages like C and C++. Many of the people on those projects look down their noses at any suggestion that one might use a high-level, slower language for writing GUI stuff. And the Linux kernel seems to be a refuge for people who think that anything other than ANSI C is wastefully evil.

    Yet, at the same time, many GUI applications under Gnome, KDE, or Windows are huge, complex messes. Trying to modify their behavior is an exercise in patience and persistence, not just because of the mountains of code one has to wade through, but also just because of the lengthy edit/compile/run cycle. And the irony is that, while those systems start out really fast when they are small, taking full advantage of the "fast" languages they are built on, they actually get very slow when they grow, because their authors end up reinventing higher-level language constructs without being able to do a good job at the implementation ("GObject" in Gtk is a recent example).

    Even trying to install a sound card for the Linux kernel can take hours of tracking down the right version and getting the bits arranged just right for the superfast-but-dumb kernel to have its driver nuggets in all the right places.

    Let's use the spare cycles and memory to make our systems smarter and easier to deal with. That does not necessarily mean something as complex as "artificial intelligence". It may mean putting a scripting language into the kernel that lets people add simple kernel extensions simply. It may mean using a language like Objective-C to extend an existing C system. It may mean doing GUIs in Python or Smalltalk or even just Java.

    There are lots of things wrong with software: it's hard to install, it's hard to manage, and it fails a lot. Yet, both Linux and Windows developers still have an unhealthy obsession with performance (and, often, they don't even achieve it). Simplify your projects and deliver a better product: put those 1GHz+ machines to work by writing in languages that don't force you to optimize every bit. And if you can't get over worrying about performance when you look at that pretty but sluggish scripting language code, close your eyes and think about the good of the US economy.
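
    As a taste of what "doing GUIs in a high-level language" buys you, here's a hypothetical Java/Swing sketch (the class name and label text are mine): a complete, working windowed program in a dozen lines, with the spare cycles spent on developer convenience rather than hand-tuned widget code.

        import javax.swing.*;

        // Hypothetical minimal GUI: one window, one label, no resource
        // files, no event-loop boilerplate to write by hand.
        public class HelloGui {
            public static void main(String[] args) {
                JFrame frame = new JFrame("Hello");
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.getContentPane().add(new JLabel("spare cycles, simpler code"));
                frame.pack();
                frame.setVisible(true);
            }
        }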

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...