Lt Wuff writes "CNN has a story about how the newest/fastest/latest and greatest processors aren't selling like Intel and AMD hoped. Maybe people are wising up to the fact that you don't need the fastest processors on the market in order to open AOL..."
You heard me right. The people who are responsible for our current economic slump are the game developers. You might say, "Game developers? I would think game players, because they're not buying this newfangled technology, but not game developers." Well, it's the game developers' fault, you chump. Always developing crap that runs on all last-gen vid cards and never embracing powerful new technologies. Their slow integration of more modern technologies has made people so disillusioned with newer technologies that they frankly don't give a rat's ass anymore. I agree with them. Most games out now can be played on most old hardware. Because everybody is so damn afraid that their software won't sell because it won't run on existing hardware, they just limit its capabilities to hardware that has already penetrated the market. So why upgrade? If I have a computer that can play any game out there (and have had it for the last 5 years), why should I upgrade?
Developers should stand up to their patriotic duty and develop games which thrive only on new hardware, only on the fastest, biggest, brightest boxes under the sun. Only on machines which took mommy and daddy twenty years to save up enough money for and will be outdated in the inverse of that time. Only on the biggest capitalistic ventures of all computer fabrication history. Only when the developers step up will our economy recover. But there is still the problem of the throw-away society vs. the environmentally friendly, keep-it-running crowd, a struggle raging in our computing world right now.
This is where Microsoft needs to step up to the plate. Require an entire internal hard drive ONLY for the OS. And require all programs and documents to be stored in an external storage mechanism which, when plugged into any existing Windows workstation, will automatically load in the registry, shortcuts, desktop, and whatnot (applicable to the user's security context, of course). This way, when somebody decides, "Hey, I can't play new games any more, I'd better upgrade," they won't have the laziness factor breaking in with, "Yeah, but then you'd have to get that geek from next door to help install all your programs and stuff like that, and it's just not worth the hassle." Because people in America are lazy and are all about how much effort they have to exert to get a task done. Using the approach I have stated, all they gotta do is unplug their drive, toss out the computer, and say hello to a brand spanking new computer and goodbye to $2600 cash. With which they can finally play all those newfangled games, unlike everybody stuck with a last-gen computer.
Uh, game developers aren't in the market to sell hardware. They are selling software. They want as broad a potential market as possible. If a person has to buy a new $2000 computer just to play the developer's $50 game, then the developer isn't going to get a lot of sales. Games like that might make Intel happy, but if they really want games that suck up cycles like mad, then they should write them.
My guess is that with the Xbox out, things are only going to get worse. Most PC game companies won't want to write something that couldn't at least theoretically be ported to the Xbox. Microsoft has basically removed most gamers' primary reason for upgrading their PC.
As for the rest of your rant, what Microsoft really needs is a /home directory like UNIX. All of my files, settings, and everything else sit in that directory so that moving to a new machine is easy.
P.S. I know you were trolling.
> Also, why aren't there three sub-processors: one that does the actual operations,
> one that controls all data input making sure no cycles go without processing data,
> and one with data output making all results immediately available for display or what
> not next clock cycle (all three should be programmable with their own instruction
> sets and use the same registers).
Sounds like fun, but the machine would be a living nightmare to program for. Witness the PlayStation 2 for a real-world example of this design mentality.
Actually, witness mainframes for a real-world example of this design mentality... Using VLIW and optimizing compilers, it is easy to abstract dedicated I/O processors as nothing more than opcodes of the CPU you are targeting. And since your compiler looks after the optimizations, you basically get a free lunch.
Neither do I. My year-and-a-half-old processor (Athlon 1.1 GHz) still runs new games (read: the Unreal Tournament 2003 demo) at 1600x1200 at acceptable fps -- granted, I have a GeForce 4 Ti 4400.
>>The economy is a huge driver in this, and if they don't see that, they are silly.
Good point. You wonder if they are falling into the same trap the recording industry has fallen into by overlooking the obvious.
Don't get me wrong, I want the fastest processor possible (if I can get it for free), but right now I just can't find anything I do that DEMANDS that I have 2+ GHz power.
Amen brotha, my dual P3-550 still runs FreeBSD fine, and the 1.2GHz TBird I bought a few months ago runs Windows for my wife. No real need for the latest and greatest, maybe when someone comes out with a cool game (there hasn't been one since Half Life) I'll upgrade.
Why in the world would anyone want to spend the money on a top of the line processor when they can buy an entire computer based on a value processor for $299 at Walmart.com? Heck, instead of spending $1500 or more on a new computer, I can buy three computers over the next year and be pretty sure that the computer I buy six months from now will be faster than the expensive computer I am buying now. So what if these computers are crap. At these prices I can afford to purchase another.
Besides, I don't want to spend my money on a processor. I don't run any processor-intensive apps. I want more memory, a bigger monitor, and a faster hard drive. Spending money on a fast processor is just a waste.
The funniest part about this is that the killer application that would drive people to buy new processors is multimedia sharing. Encoding and decoding multimedia sucks down cycles like crazy. Instead of making it easy for people to share multimedia files Intel and AMD are busy making it as hard as possible. If sales are bad now, imagine when Intel and AMD's new products come out that treat their customers like criminals.
The funniest part about this is that the killer application that would drive people to buy new processors is multimedia sharing. Encoding and decoding multimedia sucks down cycles like crazy. Instead of making it easy for people to share multimedia files Intel and AMD are busy making it as hard as possible.
Wrong. Multimedia encoding takes lots of processor power, but multimedia sharing takes comparatively little. Just think about it. How many of the people on the net are actually ripping and encoding the files they're sharing, and how many are just downloading them and passing them along? I'd guess that there are dozens or hundreds of people who only copy for every one who actually encodes something.
Yes, I understand that fileserving (especially over the Internet) doesn't require a lot of processor power. I suppose I can't speak for those folks that are infringing on copyrights, but I have been ripping my CDs to Ogg, and I am starting to think a faster processor would be a good thing. And my buddy that is encoding all of his digital video is even more interested in a powerful processor.
And that's what Intel should be pushing. They should be running commercials where people are sending video CDs of their kids to the grandparents. That requires a nifty processor, and it is precisely the kind of thing that gets normal people to upgrade their computer. Unfortunately Intel has been paying too much attention to Hollywood, who believes that they are the only folks that can make movies.
Besides, why should Intel care if people are downloading media? They aren't in the media biz. Multimedia files are big, and most people also purchase new computers when their hard drive gets full. Sure, you or I might simply pop in a new hard drive, but that's not at all normal. Computer sales mean processor sales. In other words, it is in Intel's (and AMD's) best interest to encourage people to share multimedia files.
The fact of the matter is that unless you are dealing with multimedia there is little reason to upgrade your Pentium II 500, and yet for whatever reason the hardware companies are going out of their way to make multimedia difficult to do on PCs.
I think it's a matter of diminishing returns. If a $75 CPU runs at 1.5 GHz and is fast enough for 75-90% of the computing tasks you do, and a 2.5 GHz CPU costs over $500... then why would you even consider the 2.5 GHz CPU? It's too expensive, and it would only impact a small number of computing tasks (encoding/decoding, video capture, DOOM3).
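To put rough numbers on that, here is a minimal Python sketch using the prices quoted above (and $500 as the top-end price); the assumption that useful performance scales linearly with clock speed is my own simplification, and it is generous to the faster chip:

# Dollars per GHz for the two chips quoted above, plus the marginal cost of
# the extra GHz you get by stepping up to the top-end part.
cpus = {
    "value 1.5 GHz": {"price": 75.0, "clock_ghz": 1.5},
    "top-end 2.5 GHz": {"price": 500.0, "clock_ghz": 2.5},
}

for name, cpu in cpus.items():
    print(f"{name}: ${cpu['price'] / cpu['clock_ghz']:.0f} per GHz")

extra_cost = cpus["top-end 2.5 GHz"]["price"] - cpus["value 1.5 GHz"]["price"]
extra_ghz = cpus["top-end 2.5 GHz"]["clock_ghz"] - cpus["value 1.5 GHz"]["clock_ghz"]
print(f"marginal cost of the upgrade: ${extra_cost / extra_ghz:.0f} per additional GHz")

Even under that generous linear-scaling assumption, the marginal gigahertz from stepping up costs roughly eight times as much as the value chip's average gigahertz.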
Now that even value CPUs are ridiculously fast, there isn't much reason to buy the top of the line. I used to buy dual processor boards and populate them with two of Intel's second or third fastest workstation CPUs. Those days are over, since I can't really imagine myself wasting so much money just to get an additional few megahertz. Now I look to previous-generation workstation CPUs, since they're being dumped on the market to clear stock. Plenty fast enough for me. My last purchase was two 1.2 GHz Athlon MPs, back when the 1.6 GHz (1800+ MP or thereabouts) MPs were being sold.
I've found the best way to build a system is to get the mid/high-level chip rather than the top end; the savings are large enough to speed up the system in other areas (like lower-latency RAM).
If you can build someone a decent computer while keeping costs down (I don't mean getting crappy components), they're far more likely to upgrade sooner and in the same manner, so the cost is more spread out. I.e., someone who spends $2000-3000 on a brand spanking new computer (latest everything) but loses the ultra-performance crown in 3-6 months is going to be less inclined to get a new system 1-2 years down the line (unless they have cash to burn), whereas if the costs are under $800-1000, they never lose the ultra-performance crown 'cos they never had it in the first place.
I suppose the nice thing about new chip releases (esp. major revisions) is that they knock the lower-specced chips down in price nicely.
Ditto that. My PIII 550 is still my main machine and does what I do reasonably well (play games, coding, experiment with new OSes, etc...). The last upgrade I bought for it was a GeForce 4MX, and that increased my gaming "productivity" tenfold.
Then again, my (very) aging P75 NEC laptop with 40MB RAM still works quite well as a portable development platform with FreeBSD. Not the fastest thing in the world, but for taking my coding outside, it does the job I need it to.
The economy is a huge driver in this, and if they don't see that, they are silly.
Yeah, it's really so strange how being laid off from a job that paid mid five figures up to near six figures or more, and scraping by on unemployment, really cuts down on consumers' willingness to plonk down cash for newer technology. But companies saved staffing costs...
Especially when their "old" (2-3 years) home desktop or notebook PC works just fine for email and surfing job-search websites.
Henry Ford was a prime SOB, but one thing he did right was pay his workers $5/day (a high wage at the time), realizing that he'd never sell enough Model Ts unless his workers could afford them. Today's overpaid and overprivileged corporate executive class seems to have lost sight of this. Refusing to pay more than rock-bottom wages destroys demand for their own high-tech products.
Did I say anything about "creating" wealth? Ford's goal was to create an economy where he commanded as many resources as possible towards the creation of automobiles. Ford wasn't minting money; he was transferring his wealth (i.e. current profits) to the workers.
A few rich CEOs can buy maybe a dozen or so cars before the marginal utility starts to wear a little thin, but if you can persuade those CEOs to transfer enough of their wealth to their employees so that the individual employees can afford a car, then you've got utility by the boatloads.
Indeed, history shows us [state.mi.us] that other automobile employers had to follow suit and Ford's profit sharing program resulted in much cheaper cars across the board.
Granted, he also raised wages in order to prevent turnover and that heightened efficiency led to the cheaper cars. Happy workers are productive workers and, yes, they do actually pull more natural resources from the ground and produce more sprockets under the right conditions. (Note: Even if inflation is looming, the workers don't know it yet and they're still happily working harder.)
That's rather the point. If all you ever do is keep lowering employee wages until you hit the sweet spot, you'll be in a whole lot of trouble when you realize that you were on the wrong side of the labor supply and demand curve the whole time. High turnover was a warning sign to Ford that he (and the rest of his industry) were on the wrong side of the curve and they needed to raise wages. Ford undoubtedly realized that one of the happy side-effects of repairing the situation was that there would be a heightened market for automobiles among the working class. Everybody wins. Ford builds his automobile empire and the working people get their cars. (Well, everyone wins except for us, the 21st century recipients of the negative environmental and social effects of car culture.)
jpmorgan, you're only correct when an industry is operating on the far side of the labor supply and demand curve. I also suspect you're not much of a Keynesian.
I suppose this all has some relevance to the recent situation. There was, after all, a very high turnover rate amongst tech workers during the '90s who were chasing pre-IPO dreams. That sounds like (one of many) dead canaries in a mineshaft to me. (Remember: Although a few won the IPO lottery, the majority of tech workers didn't and suffered grueling hours, draconian IP contracts and vaporware products as a result. Meanwhile, corporate propaganda was telling us that the supply of workers was low when, in fact, we now find most tech workers are out of a job.)
While this may not bode well for the likes of AMD and Intel, it is really good for the consumer. I'll be needing a laptop soon, and if I can buy one next summer, it should be fast enough to run things for at least two years without feeling overly slow, and it will also be affordable.
Of course, the only problem with this is that there may be less money pumped into R&D.
It seems that Intel and AMD have recently made pretty large jumps in their processor speeds. And while Windows XP is processor greedy, the benchmark for good performance in XP was surpassed a while back.
So I think we are just seeing the results of a software lag, where the current batch of software doesn't need, or even work better with, the highest-end processors.
On the other hand, video cards are taking more and more load off of the CPU. And they cost about the same. I know I've upgraded my video more often than my CPU. I've got four video cards sitting on my desk right now, victims of perceived obsolescence.
Maybe the future trend is for other peripherals to start adding computational functionality, and further reduce the CPU load. Perhaps CPUs of the future will be used for nothing but scheduling and coordination.
I don't see any changes with computer games. They still usually require a lot of power. My Pentium III 600 MHz system with a GeForce 2 Pro chugs with the newest games.
You totally missed the points. The article's point is that typical user applications haven't gotten more complex as processors have gotten faster. The poster's point was that even for the class of software that has gotten more complex (games), much of the complexity is being offloaded to the video card. Newer cards are doing things like transformation and lighting in hardware, rather than having the CPU bust it out for you. So it might be that just upgrading to a GF4 or a Radeon 9700 is all that will really help fps on games that are designed with this offloading in mind.
If Intel really wants to push sales of their chips, the best thing they can do right now would be to encourage developers of applications to default to using encryption, a CPU-intensive process that hasn't yet been offloaded to dedicated hardware. Games are intensive, but newer video cards dedicated in design to rendering outpace even a CPU with SIMD. Increasing CPU speed might be able to get you a faster load time, but you'd have to switch to something like [theproduct.de] generating textures on the fly.
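For what it's worth, bulk encryption really is the kind of all-software, all-CPU workload being described. Here is a minimal sketch of measuring it, assuming Python and the third-party "cryptography" package (my choice of stand-in tooling, not something the poster or Intel specified):

import os
import time

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)                   # AES-256 key
nonce = os.urandom(16)                 # CTR-mode nonce
data = os.urandom(16 * 1024 * 1024)    # 16 MB stand-in for user content

start = time.perf_counter()
encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = encryptor.update(data) + encryptor.finalize()
elapsed = time.perf_counter() - start

# Absent dedicated crypto hardware, every byte passes through the cipher in
# software, so throughput here tracks CPU speed almost directly.
print(f"AES-256-CTR: {len(data) / (1024 * 1024) / elapsed:.0f} MB/s")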
And while Windows XP is processor greedy, the benchmark for good performance in XP was surpassed a while back.
I was actually writing a Slashdot submission some time ago that dealt with "minimum requirements" and how they are determined in the software industry. For instance, for Windows XP Microsoft states as the requirements "PC with 300 megahertz (MHz) or higher processor clock speed recommended; 233-MHz minimum required". I offer up the opinion that they pulled these numbers out of their ass, and that this is the general routine of the software industry. While items such as memory or hard drive space can actually be metered and truly quoted in minimum configurations (recommended becomes more of a suggestion, as it is completely subjective: if you're willing to tolerate endless paging, Windows NT 4.0 will run on, and was originally specified for, 12MB; if Microsoft re-released Windows NT 4.0 pre-SP1 today, they'd claim that it required a minimum of 128MB and a 300MHz+ processor), I believe that software manufacturers simply find the middle to low end in the current marketplace and stick that on their box, with the hope that more detailed "requirements" make it appear that the QA department did a better job, when all it's really doing is needlessly muddling and implying metrics that don't actually exist. Minimum CPU requirements for non-realtime applications are a farce.
Why do I bring up XP? Firstly, I've found XP to actually be significantly less demanding than Windows 2000 (for instance, startup times have dropped dramatically as they optimized the kernel and ancillary code). Windows 2000 specifies a "minimum" processor of a 133 MHz Pentium [microsoft.com], yet Windows XP specifies that you need a 233 MHz or higher processor [microsoft.com]. Why the jump of 100 MHz? Does it latently consume more resources? Checking my CPU meter, I can see that it generally sits at 0%. Compare this to Windows NT 4.0, with which XP still shares a tremendous lineage (one can still run virtually all current software on an NT 4.0 machine) and which only requires a 33 MHz 486 [computerhope.com]. Claims that XP is a CPU hog are ridiculous: while it can be demanding from a video perspective if you have the "effects" on, and you should have lots of memory, it would likely run perfectly fine on a 60 MHz Pentium, presuming you had the required memory.
Why do I go on this little rant? Because I truly was interested some time back in the engineering foundation for determining and quoting minimum, recommended, and optimal configurations, and how they are derived.
Maybe the future trend is for other peripherals to start adding computational functionality, and further reduce the CPU load. Perhaps CPUs of the future will be used for nothing but scheduling and coordination.
It seems like such things go in cycles. Originally, mainframes would do serial comms by twiddling outputs with CPU instructions directly. Then someone sez, "Hey, it's now possible to build a little buffer circuit that does it for me." Then later, as CPUs got faster, the wealth of extra clock cycles was put to use twiddling serial bits again, because it was faster than the homebrew serial buffer. Repeat until you reach today, where the UART chips are fast enough and cheap enough that we'll not see direct serial manipulation by the CPU again. Right now it seems we've reached that point with video. The CPU can't even come close to what the vid card chipsets are doing, so that task is currently in an "offloaded" cycle. But who knows what the future might hold? They may come up with something new that has so many clock cycles to burn that it can run circles around a GF4. Not likely, but also not impossible. Basically, it appears that functions go off-CPU permanently when the peripheral hardware that performs said functions becomes cheap and plentiful.
I blame lazy/inefficient programming for today's ever-increasing processor demands. A good example would be Jedi Knight 2. On my P3-800 with 256MB of RAM and a GeForce 2 Ti, it ran like ass anytime there were more than 3 guys running around or if I was in a big room. Really pissed me off. And that was at 800x600; I had to turn it down from 1024x768 because it was unplayable.
2 nights ago I downloaded UT2K3. I thought it was going to be worse than JK2. So I turn off all the effects, runs fine. Start turning settings up. 800x600 with medium effects runs fine. So I go to 1024x768 with full effects. Runs beautifully. Dropped 10 bots in, no drop in performance. I would've put more in but they were owning me.
Kind of off-topic, I know, but it really opened my eyes to what programmers can do if they honestly care about their public and put good programming techniques to work.
On my P3-800 with 256MB of RAM and a GeForce 2 Ti, it ran like ass anytime there were more than 3 guys running around or if I was in a big room. Really pissed me off. And that was at 800x600; I had to turn it down from 1024x768 because it was unplayable.
News flash: if changing resolution improves performance, then your problem is that you're fillrate bound on the graphics card. Nothing to do with your CPU, nothing to do with "lazy/inefficient programming".
If you were getting the same crappy performance regardless of resolution, then you'd have a point.
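Here is a rough way to see the fill-rate argument in numbers. This is only an illustration; the card's fill rate and the overdraw factor below are made-up round figures, not measurements of any particular GeForce:

# Estimate the frame-rate ceiling imposed by fill rate alone at a few
# resolutions. CARD_FILL_RATE and OVERDRAW are illustrative guesses.
CARD_FILL_RATE = 600e6   # pixels the card can shade per second (hypothetical)
OVERDRAW = 3.0           # average times each screen pixel gets drawn per frame

for width, height in [(1024, 768), (800, 600), (640, 480)]:
    pixels_per_frame = width * height * OVERDRAW
    max_fps = CARD_FILL_RATE / pixels_per_frame
    print(f"{width}x{height}: fill-rate ceiling of roughly {max_fps:.0f} fps")

Drop the resolution and the ceiling rises; a faster CPU doesn't move it at all, which is exactly the symptom described above.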
Well, my Athlon 650 (512MB RAM, GeForce 4 MX 440) runs Warcraft III at 1280x1024 as smooth as silk, but UT2003 is choppy at any resolution, so it's CPU upgrade time in the next few months.
Or I may just hold off until Doom3 comes out and do a mobo/cpu/ram/gpu upgrade. I can wait...... I think.
At work we refer to this as Lazy Programmer Syndrome. In short, left to his own devices, a programmer will work on performance until it is just tolerable on his kick ass high end development machine. If he is developing a multiuser system he will only optimize until performance is tolerable with a single user (himself). Have pity on the server when 50 people start using the service simultaneously.
There is no known cure other than enlightened managers, and these are hard to come by.
If lack of time due to deadlines is the only thing keeping software from being optimized to its full potential, then Duke Nukem Forever will run like the friggin' wind on a 386.
:-) We had something similar last year on a pretty big web project. We had two servers - one running httpd and mail, the other a dedicated database server. Due to the huge and unnecessarily complicated database (the customer wanted the ability to add new columns/tables at any time via a web interface and still have it work OK - not very efficient, since it ended up with a huge number of redundant tables), it was taking about 2.5-3 seconds on average for a query.
This didn't seem too bad until it went live and tried to cope with over 1000 visitors all searching and messing about within the first hour. The phone was ringing with complaints just after lunch;-)
It was then that I happened to notice the dual P3 database server only actually had one processor fitted. The supplier swore that we'd only ordered one CPU, which seems odd when we knew we needed two and it's slightly inefficient to buy a dual-CPU motherboard with one proc. Plus, we'd paid for two.
Didn't make much difference anyway; with another proc in there the load was still way above what it could handle;-) You live and learn...
There is no known cure other than enlightened managers, and these are hard to come by.
Sounds like what you really mean is Unrealistic Deadline Syndrome. Try telling your boss that the software is feature complete, but it needs to be 50% faster. See if he gives you the extra month.
Either they are uneducated because of the morons teaching them, or they are just too lazy, saying, "Hell, there's plenty of processor cycles for this."
No, you see, when you have a JOB, you have to do what your BOSS says. PROFESSIONAL programmers usually have a BOSS who gives them a DEADLINE. Kid, when you come down from your collegiate ivory tower and get a job, you'll see what I'm saying. A month for a few milliseconds? You're fucking crazy. That's like telling a company, "We can speed this program up for you by a few milliseconds, but it'll cost you about $10K."
So, because this stupid PROFESSIONAL is worried about his BOSS and DEADLINE, he does a crappy job programming. Consumers find this out ("Man, don't buy any games from xxx company! They run slow as hell!"), and stop buying games. Company goes out of business, and BOSS and PROFESSIONAL programmer are on the street.
And don't start with the "they'll make it as a mod for UT" because that's lamer than hell.
Plus, they got rid of the sniper rifle (for the most part; the delay on the lightning gun really cripples it), which is great in my opinion because it will eliminate all the damn Facing Worlds players that NEVER left the roof.
I recently decided to upgrade one of my PCs and settled on a dual CPU. The motherboard and CPUs are all available for commodity prices and thus give far more value for the dollar than a 2+ GHz single CPU.
AOL, Intel, and AMD entered into a secret agreement, code-named "Show Me the Money". In an unrelated story, AOL will be adding new features to AOL 9; its minimum requirements will be dual 3 GHz processors and 512 megs of RAM.
When you can wait a few months and get a cheaper processor, that will do the trick. Most of my friends, when building their own computers, will wait for the second or third generation of a chip (i.e., an AMD Athlon XP 1800+). They can get a fast computer for cheap.
Just take a look at Pricewatch. The Athlon XP 2200+ is at $144, while the Athlon XP 2000+ is under $100. Why would you spend that much more on a new processor when you aren't getting a lot more speed out of it?
With a few months' turnover, it is worth the wait to save $50 or more by buying a slightly older processor rather than the latest one.
Intel and AMD should just realise that it is id Software that drives the early-adopter market segment of chips. id should be getting a cut of all CPUs sold. Pay the MS tax, and the id tax.
View Quake (and soon Doom) releases relative to chip sales, and I'm convinced there will be a correlation. There are wider macroeconomic factors, but the key driver is frames per second for the latest id Software release.
Those of us on /. who know better can put together a nice system with yesterday's parts, but I think the average user still equates processor speed with overall performance. Even when Joe claims to consider RAM, he seldom considers the speed of that RAM, and never the FSB or the hard drive speed.
Intel marketed its processors on the basis of clock speed. While the 2.8 GHz did have a 533 MHz FSB, for the most part the common-Joe-driven PC market has grown up thinking that CPU speed makes the biggest difference.
I think perhaps people have just gotten tired of buying new computers--it's just not the next 'big' thing like it has been for the last 10 years.
I've been running a 350 MHz PII for the past 3 or 4 years, and it's been fine - just a matter of tweaking it here and there to get as much performance as possible. I upgraded just yesterday to a 2.26 GHz, but not because my machine is painfully slow, or because I want to start playing games (other than solitaire, I don't), but just to make my overall experience of using the machine more comfortable - and to have another machine to network with.
For what I want to do, it's been perfectly fine. But occasionally I try out some cool screensaver, or have to do a kernel compile or something, and only then do I notice the difference.
With enough memory, modern operating systems can function quite well on older processors. They have fairly advanced memory management, scheduling, etc. Microsoft Word XP starts in about a second on this older machine - how much faster do you really need?
Well, of course it depends on what you do with it.
For instance, at home I've got a PII 450MHz w/ 384 MB RAM (bought just over four years ago), with the occasional upgrade (e.g. a GF2 MX400/64MB instead of the OEM STB nVidia Riva TNT card it came with). For writing, it's fine. For coding, it's fine. For work... since I do a lot of number crunching, faster would be better, but it's not my work box. For gaming, for /my/ tastes (mostly turn-based strategy, e.g. _Dominions_) it's OK, although CM:BB is going to push it (because it does a lot of math, I'd suspect, to compute those 60-second turns out in the steppes with long LOS and numerous vehicles). Shogun:TW was a bit dodgy on it (and massive musket battles were an absolute no-no), 'tho, and for an action gamer it'd be a really lousy system.
My work (a lot of statistics -- number crunching) could obviously use more CPU speed and, depending on the task, RAM (but I reduce the need by using online approximate quantile algorithms to "sample" the data, so that processing occurs on a mere subset). That's where I can /really/ use faster CPUs, disks and memory -- running tests which take multiple days of computation ain't pleasant.
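For anyone curious what that bounded-memory "sampling" can look like in practice: reservoir sampling plus a sort is one simple way to get approximate quantiles from a stream. This is a generic Python sketch, not the specific online algorithm the poster uses:

import random

def reservoir_sample(stream, k, rng=None):
    # Keep a uniform random sample of k items from a stream of unknown length.
    rng = rng or random.Random(0)
    sample = []
    for i, x in enumerate(stream):
        if i < k:
            sample.append(x)
        else:
            j = rng.randint(0, i)
            if j < k:
                sample[j] = x
    return sample

def approx_quantile(stream, q, k=1000):
    # Estimate the q-th quantile from a bounded-memory sample of the stream.
    sample = sorted(reservoir_sample(stream, k))
    return sample[min(int(q * len(sample)), len(sample) - 1)]

# Example: estimate the median of ten million values without holding them all.
data = (random.gauss(0.0, 1.0) for _ in range(10_000_000))
print(approx_quantile(data, 0.5))

Memory stays flat no matter how large the input is, since only the k-item sample is ever sorted; the CPU still has to touch every record once, which is where the multiple days of computation come from.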
As a reason to upgrade your computer to a 2+ GHz machine. The funny thing is, most people don't know any better and assume that buying a new computer will make the Internet faster. The FTC needs to start coming down hard on computer companies who advertise that a new PC will make the Internet faster.
Agreed, "downloads" is misleading, as it could also refer to downloads from you, zipping across the net to other people. That's very important if you upload a lot of content regularly, as I do.
But it's true! For example, those flash-animation ads suck up a LOT of CPU, and a newer processor would render them faster. Flash ads are the primary reason why I'm thinking about upgrading...from my P-166.
You know, the Internet is more than how fast bits flow into your computer. The speed of the processor directly affects how fast your pages render. In fact, I recently upgraded my inlaws to a much faster computer, and they commented on how much faster "the internet" was. (their normal home page renders ridiculously slowly for some reason)
In other words, the Internet is not much good without applications to use it, and faster applications == faster Internet.
Actually, the most common reason for improved performance of "the internet" that I've seen is increased RAM. Browsers tend to chew memory, and if you don't have enough, the page swapping will slow even a fast CPU to a crawl. There are plenty of people upgrading 400MHz machines to GHz+ machines, who really just need to pop in another DIMM. (Of course, they're not capable of doing that, so buying a computer may be the best choice... Ignorance costs.)
Well, if I had to choose between having a P-II w/ 128 megs and an OC-3 line or a P4 w/ 512 megs and a 56k modem, I would choose the P-II for faster Internet.
Well, I recently upgraded my roommate to XP from 2K (his choice, not mine), and we found that one of his 32MB sticks had gone bad. So for the install and a day he was stuck at 64MB.
Pages DO NOT render faster between 64 and 128 MB, it seems. The goddamn programs load faster and the OS is quite a bit more responsive, but the Internet is not any faster.
If you ever had to use Internet Explorer on the iMacs at my school you would definitely believe that a faster machine would 'make the internet run faster.'
Render times for pages and the speed of your connection could both be construed as factors in how fast you load things on the web, but after a certain point (long since passed; 500 MHz+ machines) you don't need anything more.
If you're running MacOS 9 with IE, don't even kid yourself, that's not browsing. That takes longer than Photoshop rendering (no lie).
Especially tables. The worst part is it locks the machine in MacOS 9. I click on /. "Read More..." links and go take a healthy dump, and it's still not ready when I get back. Then all my AIM windows scroll out of control due to messages building up. Since that sucked up memory for all the incoming packets that needed to be saved while my machine was frozen, AIM bites the big one and then takes IE and the rest of the system with it.
Netscape, on the other hand, has a sluggish UI, but at least it doesn't lock down my system during such routine tasks as page rendering.
Sometimes I wonder if Microsoft does it on purpose...
The funny thing is, most people don't know any better and assume that buying a new computer will make the Internet faster.
For a lot of people it *does* appear to be faster, but it's not just simple CPU speed. If the average CPU upgrader is creaking along on a 1st or even 2nd generation PII (233-450 MHz) system, chances are they also have little RAM (as little as 32MB), a much slower memory bus (66 MHz on 1st gen PIIs), a slow disk drive, a weak video card, and a creaky OS and browser.
When you move up to a contemporary machine you end up with a better Windows (2K and XP are superior to 95 and 98) running a newer browser, more RAM which reduces swapping, and when swapping DOES happen it happens to a faster disk at ATA-66 or even ATA-100 speeds with little fragmentation.
All this does add up to a browsing experience that seems faster to most people. The cognoscenti realize that it's just a better machine and that the Internet pipe is no better, but to the other 99.5% of the computer-using population, the Internet got faster.
Yes, I know about browser speed and such, but people assume that a new computer will speed up their network connection, even though 56k modems have been with us for quite some time. I bet the great majority of people here would choose broadband and an 800 MHz Athlon over a 2.8 GHz P4 with a modem, hands down.
We already determined in a previous Slashdot article that gaming pushes computer tech forward. Since the minimum requirements for most games are still a 500 MHz CPU with a 32MB AGP video card, nobody has a need to upgrade their PCs except for the most bleeding-edge gamers and other power users who do video encoding or AutoCAD-type applications. I remember back in the day when Virtual On came out for PC, the minimum sysreqs were higher than any available PC on the market, unless you had 5 grand. When the minimum amount of power required to use new software goes above the power of most people's PCs, then they'll start buying faster CPUs. Heck, even the people who are already buying faster CPUs don't buy the fastest processor available. The money:speed ratio makes it so much more worth it to buy the second or third fastest AMD, even though the fastest P4 is the best you can get.
AOL is extremely resource-intensive. When connecting, I can't even get the Ctrl+ALt+Del window to open (don't flame me: I'll use AOL until Saturday when I go to college).
So, most people seem to be in agreement that you don't need a faster processor to run today's applications. I would agree with that for the most part. I'm able to subsist on a desktop P3-800 and a laptop P3-600.
However, it's important to realize that the drop in sales will also result in a corresponding drop in research.
I'm kind of not happy about that... since I think it will slow down the pace of technology, at least on the client-side (versus server side, which was just beginning to be penetrated by the desktop architectures).
It may have 2 very cool side-effects, though:
1) Pervasive computing may become more... pervasive? It will be possible for embedded computing to catch up to desktop power because more time will be allowed for miniaturization -and- embedded platforms will last longer (example: AMD is killing its AMD K6-2 line because it's too slow... this will hurt a lot of embedded products because the market isn't strong enough to allow redevelopment onto newer platforms).
2) Network/telecom/etc. infrastructure can finally catch up. I strongly believe one of the things that caused the Internet boom was that a majority of people had access to modern telephone lines and most could scrounge up a computer. Since then, computing technology has outpaced infrastructure development (by that I mean -many- people currently still can't get xDSL, and yet your average new computer could completely swamp a T3). If things slow down and stabilize, we can again let the infrastructure mature and saturate the market, which is often the recipe needed for a new technological boom.
However, I am going to be upset if I can't buy a 32/64bit Hammer in a year at a decent cost, just because I want it:)
> However, it's important to realize that the drop in sales will also result in a corresponding drop in research.
Perhaps, but I think much of the research will shift to making existing technology smaller and cheaper. Perhaps next year, when we are discussing the latest-and-greatest here on Slashdot, it will be part of a $299 machine at Walmart. OK, maybe not quite that soon :)
Seriously though, I don't think we need all that much innovation on the desktop anytime soon. I would like to see them focus on software innovations and technology like making USB more reliable. Remember, mainstream 64-bit is just around the corner too, which will probably mean more spending once the prices reach acceptable consumer levels.
I'm still waiting for an open-source version of Visual Studio to show up on SourceForge.
I browse /. at a threshold of 4, so if this is redundant, I apologize.
I have an infinite appetite for more toys. The only thing preventing me from buying quad Xeons, dual Athlons and a bunch of Sparc hardware is that I'm broke. This last year has been very difficult, and I think even more so in the technology sector. If we all start getting rich off of killing foreigners or something, then maybe the demand for more power will return. In the meantime I'd be more impressed if they could show that people were spending the same (inflation-adjusted) money on lower-end hardware.
The article itself does mention the economic slump, but doesn't actually provide any real facts or data, just anecdote and fluff.
I think AMD and Intel should send their R&D dollars to Western Digital, IBM, and Maxtor. Instead of wasting the money building processors nobody wants, they could all be working on hard drives that spin faster than 7200 RPM and have more than 8 MB of cache.
THEN I'd be happier with the speed of my computer.
Hmmmm...I wonder if we will keep seeing chip speeds double every 18 months, or if the ChipMakers(tm) will only offer consumers faster chips when slower chips see lost sales?
I started making system purchasing decisions for other reasons around the 800 MHz point. My main issues now are size and silence. I could not care less that I could double my SETI@Home work units. The thing that allowed me to do that would still be noisy and clunky. Give me small and quiet any day.
I hear that from friends. I'm in the same camp. I waited for 1 GHz PIIIs to drop below $150 so I could run Linux fanless with a Matrox 2D card. I'm a coder, not a gamer -- I want quiet!
I looked at the Cappuccino (at thinkgeek.com), but it was too expensive (yeah, I'm a cheapskate!). You might be interested in it.
The clock of the processor may be faster, but overall PC performance is not. Your typical low-end machine sees little difference between a 1 GHz and a 2 GHz processor; the memory and hard drive are the bottlenecks.
Looking at Tom's Hardware benchmarks, http://www.tomshardware.com/cpu/02q3/020826/p4_2800-09.html , the fastest Athlon is just over twice as fast as an Athlon 850, and much of that is due to the faster FSB and RAM. I am running an Athlon 1.4 GHz and there is nothing on the market which is worth upgrading to. Why upgrade for a 25 to 40% increase in performance? Even super overclocked, none of the processors offer double the performance. If I want more real-world speed, I could spend the same money on a SCSI card (have one already) and a small SCSI disk for Windows to sit on. It's just like a car: you can only do so much with the engine, then you have to worry about the chassis, heat, and traction.
I'll buy the high-end CPUs when I don't have to worry about chipping or cracking the silicon. Fool me once, shame on me... I know the P4 has a nice slab of nickel on top, but I don't care for the performance/price I get from the high-end CPUs. That leaves AMD, and I'll be damned if I spend top dollar for something I can crush that easily (again). With much fear and trembling, I got my dual MP CPUs mounted in my workstation. Last week I spent ~$100 for a 'disposable' CPU, an AMD XP 2000+ (1.66 GHz?), and ~$80 for a cheap Asus mainboard with the works, but no way will I bite on the top-end processor for my gaming box until I get a no-heatsink-whammy guarantee. When I see something I can lap, I'll pull out my wallet for something that can run 1942 w/o lagging.
I know it's coming... I've seen (pictures of) the engineering samples...
I thought Intel et al. were supposed to be using software as a loss leader to sell their custom hardware. Not much use if all the software runs on old hardware!! (So how come they haven't started pushing Linux XP yet?)
I do too need more CPU power. I want to play around with four-dimensional fractals, and I need at least a 25x-30x speed increase to get anything near real-time renderings. Hell, if I want to work with them at a resolution higher than 320x240, then I need at least a 100x increase in CPU power.
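For the curious, one common example of a four-dimensional fractal is a quaternion Julia set. A toy Python sketch of rendering a single 2D slice of one (the constant, the slice plane, and the iteration cap are arbitrary choices of mine) shows where the cycles go:

# A 2D slice of a quaternion Julia set, purely as an illustration of cost.
def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def escape_time(q, c, max_iter=50):
    for i in range(max_iter):
        q = quat_mul(q, q)
        q = tuple(qi + ci for qi, ci in zip(q, c))
        if sum(qi * qi for qi in q) > 4.0:
            return i
    return max_iter

C = (-0.2, 0.6, 0.2, 0.2)    # arbitrary Julia constant
WIDTH, HEIGHT = 320, 240     # the resolution mentioned above

image = [[escape_time((-1.5 + 3.0 * x / WIDTH, -1.2 + 2.4 * y / HEIGHT, 0.0, 0.0), C)
          for x in range(WIDTH)] for y in range(HEIGHT)]
# 320*240 pixels * up to 50 quaternion multiplies each is already millions of
# floating-point-heavy iterations per frame, before any shading at all.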
This is a good thing. Maybe the big chip vendors will stop focusing on MHz numbers and start making smaller, cooler and more efficient chips.
If I have the choice between a giant-ass chip with multiple fans that takes 100+ watts vs. a slightly slower, fanless, power-sipping version, I'd definitely take the latter. I'm definitely keeping my eye on the Via C3 [via.com.tw]...
I'm running a Z-80 computer system (1 MHz) and I was wondering if I should upgrade to the new Z-80B processor. They go 2.5 MHz, and supposedly really scream. But is the extra speed worth the $4 that the new processor would cost? Thanks.
I for one got a dual Athlon 1.4 GHz back around April, thinking it'd be really cool and fast. Instead, it's horribly underutilized. I think I'm going to hold off buying computer crap for at least a year or two.
My boxes cover the spectrum from an old 400 MHz PII laptop to a 2 GHz development box, plus all the boxes I use at work. I say for typical desktop computing, 1 GHz is the point of diminishing returns. Even my 2 GHz dev box with 1GB RAM is overkill. Two things are controlling this: the speed of the other cards and devices, and software. Unless running games, simulators, or rendering graphics, computers today have more CPU power than the software can use.
I've been talking about this issue lately with a friend of mine, with whom I am trying to do some interesting home automation stuff. For those applications, a machine that runs in the many megahertz is fine based on current uses, but...
The way I see it, the computer in the home should be a lot more like HAL or any of those other scifi computing devices. A lot of processing power today goes into drawing the GUI quickly and tracking user movements on the internet and whatnot, but where's the beef?
A 3 GHz processor should be recognizing speech, figuring out who really lives in the home and who is breaking and entering, which hot spot is the family pet and which is an iron that was left on, etc. It goes without saying that a single 3 GHz CPU should meet most of the computing needs of a typical family.
I suggest that in order to sell high performance to the mainstream, something more useful than a Windows service pack will have to be available to soak up that performance (and this has already been suggested in other responses to this topic). The computer should stop being a thing that the family goes to a particular room to use... personally, I think the so-called "data furnace" or other similar approach is where the mainstream will begin to adopt this real computing power, when the home server starts doing really amazing things. Things more impressive than whatever it is WindowsXP does for people, anyway.
Voice recognition (that works very well), handwriting recognition (that works even better than that), maybe real time language translation, some simple learning algorithms, agents (web downloads should already happen automatically), intelligent security systems, family health monitoring, car-home networking... the list of applications to take advantage of this stuff is long and probably getting longer.
1. Lessen the MHz race.
2. Allocate your engineering resources to making the processor/system run cooler instead, to the point that it no longer needs active cooling measures (fans) on the processor and (hopefully) the whole board/system.
3. Make a new small, low-power, quiet PC form factor standard (or push a lesser-known existing standard, or join others) accommodating this advantage, and invite everyone else in the industry with no/minimal IP restrictions.
4. Make this combo your main production line, and push the other heat-makers to the niche.
(Okay, this is what VIA already tries to do, but the following items are what only Intel probably can do.)
5. MARKET IT HEAVILY. It would be easier than current marketing based on speed, because you no longer need to deceive the customer. And it is the OBVIOUS BENEFIT to the average customer - a small, quiet, power-saving PC with standard parts that one could leave always on without stress/anxiety - and to industries - always on -> new usage -> new software and hardware -> new market!
6. You've just created a whole new market. Keep chugging along, 800-pounder!
As a scientific user of commoditized x86 hardware, this has me a little worried. We've been happily riding the x86 performance-per-dollar wave on the backs of video gamers. If gamers and other large groups of users quit underwriting high-performance CPUs, the scientific community may find itself back in the old "big-bucks workstation land".
That's what everyone is forgetting. These suckers were subsidising the latest and greatest for those of us who actually need (yeah, I need 100 fps in JK2) these processors. How fast would your latest kernel of choice compile on that old K6-II?
Well, bear in mind that most machines are purchased by businesses -- and they follow some weird rules.
Consider the 3-year-old Dell 450 PII on my desk. High end when I got it, low-end now. I don't need to do any heavy processing, but some of the apps I use consume a lot of RAM, and I'm always short of disk. So I requested an upgrade.
It had almost gone through, when my boss told me that I was making things difficult by not requesting a new machine. Computers are amortized over three years (at least by anybody who pays federal taxes), and our IS department takes the attitude that a fully-amortized computer costs more to support than it's worth.
Of course, as soon as I changed my upgrade request to a new computer request, there was a purchasing freeze....
In Europe the amortization time (tax write-off time) is generally a very unrealistic five years. In Germany at least, a lot of professional PCs are on 3 year leases. This way you don't have to worry about write-offs.
However, with a lot of front-end stuff moving to languages like Java, 450MHz sucks big time so at my last assignment, we jumped to 1.6GHz Dells. Oh we use lots of RAM too but the local disks are usually almost empty (i.e., 1-2GB used) as apart from the local O/S (Win 2K pro or XP Pro), the JRE and some other local stuff like X, everything was served via the net.
One of my friends' work just bought him a new laptop. His old one (a P233 running DOS and an old Borland IDE) worked perfectly for what he does (home and field support of embedded software in automobile testing equipment).
They bought him a new one, latest everything, just because the warranty on his old one ran out. They didn't want to support a laptop that was itself no longer supported.
You can upgrade that Dell with a CPU from powerleap.com. I recently upgraded my old PII 400 to a Celeron 1.3 GHz and WOW, what a difference. Cost me around $150; I probably could have gotten it cheaper, but I didn't shop around. The upgrade game just isn't worth the cost (and yes, I do play games). I'd have to spend $400+ upgrading all the major components, including the power supply, for maybe another 30% of performance on top of what the Celeron is giving me now. Simply wasn't cost effective.
I have the same experience running my mission-critical gateway machine, an Intel Pentium Pro 180 that is colocated in my parents' basement running Linux. You might scoff at my use of the term "mission critical", but you don't want to see how mad Mum gets when she can't read her recipes!
That machine does samba, apache, printing, xdm, and NAT. It is always responsive, although it does only serve three people. I could see it powering a large corporation (around 200 users) with a good administrator that keeps open source bloatware away from it.
I refuse to run software like KDE / Gnome (aren't these the same now on redhat?), mozilla, vim, and openoffice. I have ensured that my users are happy with twm, lynx, and emacs - with some hacked up training courses I put together.
My 600 MHz P3 laptop does well enough for all my Photoshopping needs. Of course, my 1.4 GHz Athlon desktop does even better. You shouldn't need anything more than 1 GHz unless you're really ramping up your movie production needs. With most games, you just need a faster/bigger video card (which is why you don't need a faster/bigger CPU). Since most people aren't doing molecular simulations or whole-earth atmospheric studies, they aren't pushing the envelope of their CPU anyway. Even gamers and so forth are tied down more by their video abilities than their CPU.
OK, I'll bite. Try putting gradients on a 24x36" Photoshop document at 150 dpi with your dinky 600 MHz P3. Hmmmm... didn't think so. And that's just the graphic design work I do, which is small change compared to my movie production work (3DS Max, Combustion, etc.), where I'm desperately scrounging for more performance out of a dual-proc Athlon 2000 w/ 2 GB of RAM. Yes, you need it when rendering at 1920x1080 resolution.
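The arithmetic backs that up. A quick estimate, assuming a flat 8-bit RGBA image at 4 bytes per pixel (layers, history states and scratch copies multiply this further):

# Size of a 24x36 inch document at 150 dpi.
width_px = 24 * 150     # 3600 pixels
height_px = 36 * 150    # 5400 pixels
pixels = width_px * height_px
bytes_per_layer = pixels * 4   # assuming 8-bit RGBA

print(f"{pixels / 1e6:.1f} megapixels, roughly {bytes_per_layer / 2**20:.0f} MB per layer")
# About 19.4 megapixels and ~74 MB per layer: a single gradient touches every
# one of those pixels, and each extra layer or history state does it again.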
This also explains why Apple isn't dead yet. For checking my e-mail, browsing the web, and writing papers for English, my G3 600 iMac works just fine.
Well, the AMD 600 machine I built for my dad nearly 3 years ago for $650 works just fine too, running Linux and Windows XP. There is no advantage for Apple as far as aging hardware is concerned. Good luck running OS X on that G3 iMac.
Have you tried it? It works fine. I run Mac OS X 10.1.5 on a G3 233 MHz (granted, it does have 320 MB of RAM), and for what I use it for (Netscape 7/Office X/ssh/mp3 playing), it is quite sufficient.
I do think that the previous poster had a point, though. You don't NEED 2+ GHz for most computer use. If you did, Apple wouldn't sell any machines.
The article is not necessarily good news for Apple. People may be trending toward price over MHz, but Apple loses on both MHz and price. People who buy Macs are more concerned with usability, or they are oriented toward a particular "Mac" application. Throw in a few fashion statements as well. In other words, nothing has really changed.
Actually, Apple isn't dead yet for the opposite reason.
You're correct that they don't force average users to upgrade every year like Microsoft/Intel/AMD try to do. But Apple also has a much larger percentage of its total users who actually need a lot of power. A lot of people who are still using Macs are doing so for video or desktop publishing work, which benefits from things like dual processors and 1000+ MHz G4s. So it's worthwhile and cost-effective for them to upgrade often, paying top-of-the-line prices each time.
Great to see it gets some hard proof! ...Or DO each of us drive 400 hp cars? No? Why not? If we can't live without 2.8 GHz, why should we "punish" ourselves with cars below 100 hp?
Well, I bought a 350 hp car because I didn't see that it was worth waiting a year on the waiting list to be able to buy the supercharged version.
However I do notice that many of my neighbors have cars with equivalent capacity (4 litre or above) but give only half the power and less than half the mpg.
I just upgraded my son's computer with the cheapest components I could get from Fry's that I could be confident would last a couple of years (having previously bought grotty PCs and regretted same). For $350 I got an Intel motherboard, a 1.9 GHz processor, a 1/4 gig of RAM and a pretty nifty video card.
I agree that there is not much reason to upgrade out of need these days. Even gaming is no longer a power-user issue, since the copy of Tomb Raider: Angel of Darkness you buy in the store next week will have been developed on hardware that is already close to obsolete.
The only mass market, power application I see about at the moment is digital video editing. That will pretty much soak up cycles on anything you throw at it. But the market is fairly specialist still.
As the hardware gets cheaper I am much happier to accept machines with everything integrated on the motherboard.
Umm, I have *never* had a problem installing a distro on an old P-Pro running at 166 MHz with 64MB of memory. The install is a lot slower (decompression takes time), but once loaded, it works very nicely as a file server. As distributed with GNOME, the GUI sux, but I don't log in there very often.
Plus, I don't have as much throw-away money as I used to. The economy is a huge driver in this, and if they don't see that, they are silly.
The money I would spend on frivolous things is now being shoveled into the bank so I can save for things I really need(TM).
Treason! (Score:5, Funny)
GAME DEVELOPERS are the true terrorists!!! (Score:3, Funny)
Developers should stand up to their patriotic duty and develop games which thrive only on new hardware, only on the fastest, biggest, brightest boxes under the sun. Only on machines which took mommy and daddy twenty years to save up enough money for and will be outdated in the inverse of that time. Only on the biggest capitalistic ventures of all computer fabrication history. Only when the developers step up, will our economy recover. But there is still the problem with the throw away society vs. the persistent friendly environment struggle raging in our computing worlds right now.
This is where Microsoft needs to step up to the plate. Require an entire internal hard-drive ONLY for the OS. And require all programs and documents to be stored in an external storage mechanism which when plugged into any existing windows workstation will automatically load in the registry, shortcuts, desktop, and what not (applicable to the users security context of course). This way when somebody decides: "Hey I can't play new games any more I better upgrade." They won't have the laziness factor breaking in with, "Yeah but then you'd have to get that geek from next door to help install all your programs and stuff like that and its just not worth the hassle." Because people in America are lazy and are all about how much effort they have to exert to get a task done. Using the approach I have stated all they gotta do is unplug their drive toss out the computer. And say hello to brand spanking shining new computer, and good bye $2600 cash. With which they can finally play all those newfangled games, unlike everybody stuck with the last gen computer.
The End
Re:GAME DEVELOPERS are the true terrorists!!! (Score:3, Insightful)
Uh, Game developers aren't in the market to sell hardware. They are selling software. They want as broad a potential market as possible. If a person has to buy a new $2000 computer just to play the developers $50 game, then the developer isn't going to get a lot of sales. Games like that might make Intel happy, but if they really want games that suck up cycles like mad, then they should write them.
My guess is that with the XBox out, things are only going to get worse. Most PC Game companies won't want to write something that couldn't at least theoretically be ported to the XBox. Microsoft has basically removed most gamers primary reason for upgrading their PC.
As for the rest of your rant, what Microsoft really needs is a /home directory like UNIX. All of my files, settings, and everything else sits in that directory so that moving to a new machine is easy.
P.S. I know you were trolling.
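The parent's point is easy to picture: if everything that makes the machine "yours" lives under one directory, moving is a single recursive copy. A minimal sketch (the paths are made up for illustration; Windows does not actually work this way today):

    # If all of a user's files and settings live under one tree, migration to a
    # new machine is one recursive copy -- nothing outside the tree needs to move.
    import shutil
    from pathlib import Path

    OLD_HOME = Path("/mnt/old_disk/home/alice")   # assumed mount of the old drive
    NEW_HOME = Path("/home/alice")                # destination on the new machine

    shutil.copytree(OLD_HOME, NEW_HOME, dirs_exist_ok=True)
    print(f"Copied {OLD_HOME} -> {NEW_HOME}")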
Re:GAME DEVELOPERS are the true terrorists!!! (Score:2)
> one that controls all data input making sure no cycles go without processing data,
> and one with data output making all results immediately available for display or what
> not next clock cycle (all three should be programmable with their own instruction
> sets and use the same registers).
Sounds like fun, but the machine would be a living nightmare to program for. Witness the playstation 2 for a real-world example of this design mentality.
Re:GAME DEVELOPERS are the true terrorists!!! (Score:3, Insightful)
Re:yeah (Score:2, Insightful)
Neither do I. My year-and-a-half-old processor (Athlon 1.1 GHz) still runs new games (read: the Unreal Tournament 2003 demo) at 1600x1200 at acceptable fps -- granted, I have a GeForce 4 Ti 4400.
>>The economy is a huge driver in this, and if they don't see that, they are silly.
Good point. You wonder if they are falling into the same trap the recording industry has fallen into in overlooking the obvious.
Don't get me wrong, I want the fastest processor possible (if I can get it for free), but right now I just can't find anything I do that DEMANDS 2+ GHz of power.
-Pride
Re:yeah (Score:3, Insightful)
Precisely (Score:5, Insightful)
Why in the world would anyone want to spend the money on a top of the line processor when they can buy an entire computer based on a value processor for $299 at Walmart.com? Heck, instead of spending $1500 or more on a new computer, I can buy three computers over the next year and be pretty sure that the computer I buy six months from now will be faster than the expensive computer I am buying now. So what if these computers are crap. At these prices I can afford to purchase another.
Besides, I don't want to spend my money on a processor. I don't run any processor-intensive apps. I want more memory, a bigger monitor, and a faster hard drive. Spending money on a fast processor is just a waste.
The funniest part about this is that the killer application that would drive people to buy new processors is multimedia sharing. Encoding and decoding multimedia sucks down cycles like crazy. Instead of making it easy for people to share multimedia files Intel and AMD are busy making it as hard as possible. If sales are bad now, imagine when Intel and AMD's new products come out that treat their customers like criminals.
Re:Precisely (Score:2, Insightful)
Wrong. Multimedia encoding takes lots of processor power, but multimedia sharing takes comparatively little. Just think about it: how many of the people on the net are actually ripping and encoding the files they're sharing, and how many are just downloading them and passing them along? I'd guess that there are dozens or hundreds of people who only copy for every one who actually encodes something.
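The distinction is easy to demonstrate: an encoder has to do arithmetic on every sample, so it is CPU bound, while passing a finished file along is mostly disk and network I/O. A rough sketch (the "encode" loop below is a stand-in for a real codec, not an actual one):

    # Rough illustration: copying an already-encoded file is cheap (I/O bound),
    # while encoding touches every sample and is CPU bound.
    import shutil
    import time
    import array

    def fake_encode(samples):
        # Stand-in for a codec's inner loop: arithmetic on every sample.
        acc = 0
        for s in samples:
            acc = (acc + s * s) & 0xFFFFFFFF
        return acc

    samples = array.array("i", range(2_000_000))  # pretend this is raw PCM audio

    t0 = time.perf_counter()
    fake_encode(samples)
    print(f"'encoding' took {time.perf_counter() - t0:.2f}s of CPU time")

    # Copying the finished file, by contrast, mostly waits on the disk:
    # shutil.copyfile("song.ogg", "copy.ogg")   # near-instant by comparison

On a slow machine the loop above crawls, while the commented-out copy would finish almost immediately, which is why the downloaders outnumbering the encoders matters.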
Re:Precisely (Score:4, Interesting)
Yes, I understand that fileserving (especially over the Internet) doesn't require a lot of processor power. I suppose I can't speak for those folks that are infringing on copyrights, but I have been ripping my CDs to Ogg, and I am starting to think a faster processor would be a good thing. And my buddy that is encoding all of his digital video is even more interested in a powerful processor.
And that's what Intel should be pushing. They should be running commercials where people are sending video CDs of their kids to the grandparents. That requires a nifty processor, and it is precisely the kind of thing that gets normal people to upgrade their computer. Unfortunately Intel has been paying too much attention to Hollywood, who believes that they are the only folks that can make movies.
Besides, why should Intel care if people are downloading media? They aren't in the media biz. Multimedia files are big, and most people also purchase new computers when their hard drive gets full. Sure, you or I might simply pop in a new hard drive, but that's not at all normal. Computer sales mean processor sales. In other words it is in Intel (and AMD's) best interests to encourage people to share multimedia files.
The fact of the matter is that unless you are dealing with multimedia there is little reason to upgrade your Pentium III 500, and yet for whatever reason the hardware companies are going out of their way to make multimedia difficult to do on PCs.
Re:yeah (Score:3, Interesting)
Now that even value CPUs are ridiculously fast, there isn't much reason to buy the top of the line. I used to buy dual-processor boards and populate them with two of Intel's second- or third-fastest workstation CPUs. Those days are over, since I can't really imagine myself wasting so much money just to get an additional few megahertz. Now I look to previous-generation workstation CPUs, since they're being dumped on the market to clear stock. Plenty fast enough for me. My last purchase was two 1.2 GHz Athlon MPs, back when the 1.6 GHz (1800+ MP or thereabouts) MPs were being sold.
Re:yeah (Score:4, Insightful)
Re:yeah (Score:4, Funny)
it means you're pretty bad at math.
Building Systems (Score:5, Insightful)
I've found the best way to build a system is to get a mid/high-level chip rather than the top end; the savings are large enough to let you speed up the system in other areas (like lower-latency RAM).
If you can build someone a decent computer while keeping costs down (I don't mean getting crappy components), they're far more likely to upgrade sooner and in the same manner, so the cost is spread out more. I.e., someone who spends $2000-3000 on a brand spanking new computer (latest everything) but loses the ultra-performance crown in 3-6 months is going to be less inclined to get a new system 1-2 years down the line (unless they have cash to burn), whereas if the cost is under $800-1000 they never lose the ultra-performance crown 'cos they never had it in the first place.
I suppose the nice thing about new chip releases (esp. major revisions) is they knock the lower-specced chips down nicely.
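A rough back-of-the-envelope comparison using the figures above (the upgrade cadences are my own assumptions, not the poster's):

    # Assumed cadences: a top-end box every 3 years vs. a mid-range box every
    # 18 months, using the price ranges quoted in the comment above.
    top_end_cost, top_end_years = 2500, 3.0
    mid_range_cost, mid_range_years = 900, 1.5

    print(f"Top-end:   ${top_end_cost / top_end_years:.0f} per year")      # ~$833/yr
    print(f"Mid-range: ${mid_range_cost / mid_range_years:.0f} per year")  # ~$600/yr

The mid-range buyer spends less per year and is never more than 18 months behind the curve.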
Re:yeah (Score:3, Interesting)
Then again, my (very) aging P75 NEC laptop with 40MB RAM still works quite well as a portable development platform with FreeBSD. Not the fastest thing in the world, but for taking my coding outside, it does the job I need it to.
Re:yeah (Score:5, Insightful)
Yeah, it's really so strange how being laid off from a job that paid mid five figures up to near six figures or more, and scraping by on unemployment, really cuts down on consumers' willingness to plonk down for newer technology. But companies saved staffing costs...
Especially when their "old" (2-3 years) home desktop or notebook PC works just fine for email and surfing job-search websites.
Henry Ford was a prime SOB, but one thing he did right was pay his workers $5/day (a high wage at the time), realizing that he'd never sell enough Model T's unless his workers could afford them. Today's overpaid and overprivileged corporate executive class seems to have lost sight of this. Refusing to pay more than rock-bottom wages destroys demand for their own high-tech products.
You are smoking crack... and it shows. (Score:3, Informative)
A few rich CEOs can buy maybe a dozen or so cars before the marginal utility starts to wear a little thin, but if you can persuade those CEOs to transfer enough of their wealth to their employees so that the individual employees can afford a car, then you've got utility by the boatloads.
Indeed, history shows us [state.mi.us] that other automobile employers had to follow suit and Ford's profit sharing program resulted in much cheaper cars across the board.
Granted, he also raised wages in order to prevent turnover and that heightened efficiency led to the cheaper cars. Happy workers are productive workers and, yes, they do actually pull more natural resources from the ground and produce more sprockets under the right conditions. (Note: Even if inflation is looming, the workers don't know it yet and they're still happily working harder.)
That's rather the point. If all you ever do is keep lowering employee wages until you hit the sweet spot, you'll be in a whole lot of trouble when you realize that you were on the wrong side of the labor supply and demand curve the whole time. High turnover was a warning sign to Ford that he (and the rest of his industry) were on the wrong side of the curve and needed to raise wages. Ford undoubtedly realized that one of the happy side effects of repairing the situation was that there would be a heightened market for automobiles among the working class. Everybody wins. Ford builds his automobile empire and the working people get their cars. (Well, everyone wins except for us, the 21st century recipients of the negative environmental and social effects of car culture.)
jpmorgan, you're only correct when an industry is operating on the far side of the labor supply and demand curve. I also suspect you're not much of a Keynesian.
I suppose this all has some relevance to the recent situation. There was, after all, a very high turnover rate amongst tech workers during the 90s who were chasing pre-IPO dreams. That sounds like (one of many) dead canaries in a mineshaft to me. (Remember: although a few won the IPO lottery, the majority of tech workers didn't, and suffered grueling hours, draconian IP contracts, and vaporware products as a result. Meanwhile, corporate propaganda was telling us that the supply of workers was low when, in fact, we now find most tech workers are out of a job.)
Well.. (Score:2)
Of course, the only problem with this is that there may be less money pumped into R&D.
software lag and video cards (Score:5, Insightful)
So I think we are just seeing the results of a software lag, where the current batch of software doesn't need, or even work any better with, the highest-end processors.
On the other hand, video cards are taking more and more load off of the CPU. And they cost about the same. I know I've upgraded my video more often than my CPU. I've got four video cards sitting on my desk right now, victims of perceived obsolescence.
Maybe the future trend is for other peripherals to start adding computational functionality, and further reduce the CPU load. Perhaps CPUs of the future will be used for nothing but scheduling and coordination.
How about games? (Score:2)
Re:How about games? (Score:2)
If Intel really wants to push sales of their chips, the best thing they can do right now would be to encourage developers of applications to default to using encryption, a CPU-intensive process that hasn't yet been offloaded to dedicated hardware. Games are intensive, but newer video cards designed specifically for rendering outpace even a CPU with SIMD. Increasing CPU speed might be able to get you a faster load time, but you'd have to switch to something like [theproduct.de] generating textures on the fly.
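Hashing and encryption really are CPU bound. A quick sketch that measures raw hash throughput on the local machine (SHA-256 from the standard library is used here as a stand-in for "crypto work" in general):

    # Hash a fixed 1 MiB buffer repeatedly and report throughput;
    # the result scales almost directly with CPU speed.
    import hashlib
    import time

    buf = b"\x00" * (1 << 20)   # 1 MiB of data
    rounds = 200

    t0 = time.perf_counter()
    for _ in range(rounds):
        hashlib.sha256(buf).digest()
    elapsed = time.perf_counter() - t0

    print(f"SHA-256 throughput: {rounds / elapsed:.1f} MB/s")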
Re:software lag and video cards (Score:5, Insightful)
I was actually writing a Slashdot submission some time ago that dealt with "minimum requirements" and how they are determined in the software industry. For instance, for Windows XP Microsoft states the requirements as "PC with 300 megahertz (MHz) or higher processor clock speed recommended; 233-MHz minimum required". I offer up the opinion that they pulled these numbers out of their ass, and that this is the general routine of the software industry. Items such as memory or hard drive space can actually be metered and truly quoted in minimum configurations (recommended is more of a suggestion, since it is completely subjective: if you're willing to tolerate endless paging, Windows NT 4.0 will run on, and was originally specified for, 12MB; if Microsoft re-released Windows NT 4.0 pre-SP1 today, they'd claim it required a minimum of 128MB and a 300MHz+ processor). But I believe software manufacturers simply find the middle-to-low end of the current marketplace and stick that on the box, in the hope that more detailed "requirements" make it appear that the QA department did a better job, when all it really does is needlessly muddle things and imply metrics that don't actually exist. Minimum CPU requirements for non-realtime applications are a farce.
Why do I bring up XP? Firstly, I've found XP to actually be significantly less demanding than Windows 2000 (for instance, startup times have dropped dramatically as they optimized the kernel and ancillary code). Windows 2000 specifies a "minimum" processor of a 133MHz Pentium [microsoft.com], yet Windows XP specifies that you need a 233MHz or higher processor [microsoft.com]. Why the jump of 100MHz? Does it latently consume more resources? Checking my CPU meter I can see that it generally sits at 0%. Compare this to Windows NT 4.0, to which XP still shares a tremendous lineage (one can still run virtually all current software on an NT 4.0 machine), which only requires a 486 at 33MHz [computerhope.com]. Claims that XP is a CPU hog are ridiculous: while it can be demanding from a video perspective if you have the "effects" on, and you should have lots of memory, it would likely run perfectly fine on a 150MHz Pentium Pro, presuming you had the required memory.
Why do I go on this little rant? Because I truly was interested some time back in the engineering foundation for determining and quoting minimum, recommended, and optimal configurations, and how they are derived.
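If a vendor wanted numbers with any engineering basis, the obvious first step is to measure what the software actually consumes. A minimal sketch (this uses the third-party psutil package, which is my assumption for the job, not something the comment names):

    # Sample a running process's actual CPU and memory use; peak RSS and average
    # CPU load are a more honest basis for "requirements" than a guess.
    import psutil  # third-party: pip install psutil

    def profile(pid, seconds=30):
        proc = psutil.Process(pid)
        peak_rss = 0
        cpu_samples = []
        for _ in range(seconds):
            cpu_samples.append(proc.cpu_percent(interval=1.0))
            peak_rss = max(peak_rss, proc.memory_info().rss)
        print(f"peak RSS: {peak_rss / 2**20:.0f} MB")
        print(f"avg CPU:  {sum(cpu_samples) / len(cpu_samples):.0f}%")

    # profile(1234)   # PID of the application under test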
Re:software lag and video cards (Score:3, Insightful)
It seems like such things go in cycles. Originally, mainframes would do serial comms by twiddling outputs with CPU instructions directly. Then someone sez "hey, it's now possible to build a little buffer circuit that does it for me". Then later, as CPUs got faster, the wealth of extra clock cycles was put to use twiddling serial bits again, because it was faster than the homebrew serial buffer. Repeat until you reach today, where UART chips are fast enough and cheap enough that we'll not see direct serial manipulation by the CPU again. Right now it seems we've reached that point with video. The CPU can't even come close to what the vid card chipsets are doing, so that task is currently in an "offloaded" cycle. But who knows what the future might hold? They may come up with something new that has so many clock cycles to burn that it can run circles around a GF4. Not likely, but also not impossible. Basically, it appears that functions go off-CPU permanently when the peripheral hardware that performs said functions becomes cheap and plentiful.
Lazy Programming (Score:5, Interesting)
2 nights ago I downloaded UT2K3. I thought it was going to be worse than JK2. So I turn off all the effects, runs fine. Start turning settings up. 800x600 with medium effects runs fine. So I go to 1024x768 with full effects. Runs beautifully. Dropped 10 bots in, no drop in performance. I would've put more in but they were owning me.
Kind of off-topic, I know, but it really opened my eyes to what programmers can do if they honestly care about their public and put good programming techniques to work.
Re:Lazy Programming (Score:4, Informative)
On my P3-800 with 256MB of RAM and a GeForce 2 Ti it ran like ass anytime there were more than 3 guys running around or if I was in a big room. Really pissed me off. And that was at 800x600; I had to turn it down from 1024x768 because it was unplayable.
News flash: if changing resolution improves performance, then your problem is that you're fillrate bound on the graphics card. Nothing to do with your CPU, nothing to do with "lazy/inefficient programming".
If you were getting the same crappy performance regardless of resolution, then you'd have a point.
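The diagnostic described above can be written down directly. A sketch, where measure_fps is a hypothetical hook into whatever benchmark you are running:

    # If frame rate drops roughly in proportion to pixel count, the card's fill
    # rate is the bottleneck; if it barely moves, the CPU (or the engine) is.
    def diagnose(measure_fps):
        lo = measure_fps(800, 600)     # hypothetical: benchmark at 800x600
        hi = measure_fps(1024, 768)    # ...and again at 1024x768
        pixel_ratio = (1024 * 768) / (800 * 600)   # ~1.64x more pixels

        if lo / hi >= pixel_ratio * 0.8:
            return "fill-rate bound (graphics card)"
        if abs(lo - hi) / lo < 0.1:
            return "CPU bound"
        return "mixed / something else"

    # Example with made-up numbers: 90 fps at 800x600, 58 fps at 1024x768.
    print(diagnose(lambda w, h: 90.0 if (w, h) == (800, 600) else 58.0))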
Re:Lazy Programming (Score:2)
Or I may just hold off until Doom3 comes out and do a mobo/cpu/ram/gpu upgrade. I can wait...... I think.
Lazy Programmer Syndrome (Score:2)
There is no known cure other than enlightened managers, and these are hard to come by.
Re:Lazy Programmer Syndrome (Score:3, Funny)
Re:Lazy Programmer Syndrome (Score:2)
This didn't seem too bad until it went live and tried to cope with over 1000 visitors all searching and messing about within the first hour. The phone was ringing with complaints just after lunch.
It was then that I happened to notice the dual-P3 database server only actually had one processor fitted. The supplier swore that we'd only ordered one CPU, which seems odd when we knew we needed two, and it's slightly inefficient to buy a dual-CPU motherboard with one proc. Plus we'd paid for two.
Didn't make much difference anyway; with another proc in there the load was still way above what it could handle.
Re:Lazy Programmer Syndrome (Score:2)
There is no known cure other than enlightened managers, and these are hard to come by.
Sounds like what you really mean is Unrealistic Deadline Syndrome. Try telling your boss that the software is feature complete, but it needs to be 50% faster. See if he gives you the extra month.
Re:Lazy Programming (Score:2, Insightful)
No, you see, when you have a JOB, you have to do what your BOSS says. PROFESSIONAL programmers usually have a BOSS which gives them a DEADLINE. Kid, when you come down from your collegiate ivory tower and get a job, you'll see what I'm saying. A month for a few milliseconds? You're fucking crazy. That's like telling a company, "we can speed this program up for you by a few milliseconds, but it'll cost you about $10K".
Get a job.
Re:Lazy Programming (Score:2)
Re:Lazy Programming (Score:2)
Yeah, that makes a lot of sense.
Re:Lazy Programming (Score:2)
And don't start with the "they'll make it as a mod for UT" because that's lamer than hell.
Plus, they got rid of the sniper rifle (for the most part; the delay on the lightning gun really cripples it), which is great in my opinion because it will eliminate all the damn Facing Worlds players that NEVER left the roof.
Upgrade to dual CPU (Score:2)
in other news (Score:3, Funny)
In an unrelated story, aol will be adding new features into aol 9, its min. req will be dual 3 Gig processor, 512 megs ram.
Re:in other news (Score:3, Funny)
In an unrelated story, aol will be adding new features into aol 9, its min. req will be dual 3 Gig processor, 512 megs ram.
Why go for the newest? (Score:3, Informative)
Just take a look at Pricewatch. The Athlon XP 2200+ is at $144, while the Athlon XP 2000+ is under $100. Why would you spend that much more on a new processor when you aren't getting a lot more speed out of it?
With a few months' turnover, it is worth the wait to save $50 or more by buying a slightly older processor rather than the latest one.
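For what it's worth, the gap looks even starker per unit of clock. Dollars per GHz, using the Pricewatch figures quoted above and the chips' clock speeds from memory (XP 2200+ at about 1.80 GHz, XP 2000+ at about 1.67 GHz; treat those as assumptions):

    # Dollars per GHz for the two chips mentioned above.
    chips = {
        "Athlon XP 2200+": (144, 1.80),   # price from the comment, clock from memory
        "Athlon XP 2000+": (99, 1.67),    # "under $100" in the comment
    }

    for name, (price, ghz) in chips.items():
        print(f"{name}: ${price} / {ghz} GHz = ${price / ghz:.0f} per GHz")

That works out to roughly $80 per GHz for the 2200+ versus about $59 per GHz for the 2000+.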
Re:Why go for the newest? (Score:2)
Sure, I know it makes sense to wait, and you know it, but if everybody does it we're screwed.
id Software owns the Chip Market (Score:3, Interesting)
View Quake (and soon Doom) releases relative to chip sales, and I'm convinced there will be a correlation. There are wider macroeconomic factors, but the key driver is frames per second for the latest id Software release.
Intel bit itself in the a** here (Score:2, Insightful)
Those of us on /. who know better can put together a nice system with yesterday's parts, but I think the average user still equates processor speed with overall performance. Even when Joe claims to consider RAM, he seldom considers the speed of that RAM, and never the FSB or the hdd speed.
Intel marketed its processors on the basis of clock speed. While the 2.8 GHz did have a 533 MHz FSB, for the most part the common joe-driven PC market has grown up thinking that CPU speed makes the biggest difference.
I think perhaps people have just gotten tired of buying new computers--it's just not the next 'big' thing like it has been for the last 10 years.
My old box (Score:2)
For what I want to do, it's been perfectly fine. But occasionally I try out some cool screensaver, or have to do a kernel compile or something, and only then do I notice the difference.
With enough memory, modern operating systems can function quite well on older processors. They have fairly advanced memory management, scheduling, etc. Microsoft Word XP starts in about a second on this older machine -- how much faster do you really need?
Re:My old box (Score:2)
For instance, at home I've got a PII 450MHz w/ 384 MB RAM (bought just over four years ago), with the occasional upgrade (e.g. GF2MX400/64MB instead of the OEM STB nVidia Riva TNT card it came with). For writing, it's fine. For coding, it's fine. For work... since I do a lot of number crunching, faster would be better, but it's not my work box. For gaming, for
My work (a lot of statistics -- number crunching) could obviously use more CPU speed, and, depending on task, RAM (but I reduce the need by using online approximate quantile algorithms to "sample" the data, so that processing occurs on a mere subset). That's where I can
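The "online approximate quantile" trick can be as simple as keeping a fixed-size uniform sample of the stream and reading quantiles off that. A minimal sketch using reservoir sampling (one possible approach; not necessarily what the poster actually uses):

    # Keep a fixed-size uniform sample of an arbitrarily long stream, then
    # estimate quantiles from the sample instead of the full data set.
    import random

    class ReservoirQuantiles:
        def __init__(self, capacity=1000):
            self.capacity = capacity
            self.sample = []
            self.seen = 0

        def add(self, x):
            self.seen += 1
            if len(self.sample) < self.capacity:
                self.sample.append(x)
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.sample[j] = x

        def quantile(self, q):
            s = sorted(self.sample)
            return s[min(int(q * len(s)), len(s) - 1)]

    # Example: approximate the median of a million values without holding them all.
    r = ReservoirQuantiles(capacity=500)
    for _ in range(1_000_000):
        r.add(random.gauss(0, 1))
    print("approx. median:", round(r.quantile(0.5), 3))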
I love when they use the Internet (Score:4, Informative)
Re:I love when they use the Internet (Score:4, Funny)
Hello! Do you know how fast lightning moves? That's just not even close to ATT Broadband/RCN/@home speeds.
Re:I love when they use the Internet (Score:2)
Re:I love when they use the Internet (Score:3, Funny)
=)
Re:I love when they use the Internet (Score:2, Insightful)
You know, the Internet is more than how fast bits flow into your computer. The speed of the processor directly affects how fast your pages render. In fact, I recently upgraded my in-laws to a much faster computer, and they commented on how much faster "the internet" was. (Their normal home page renders ridiculously slowly for some reason.)
In other words, the Internet is not much good without applications to use it, and faster applications == faster Internet.
Re:I love when they use the Internet (Score:4, Insightful)
Re:I love when they use the Internet (Score:2)
Re:I love when they use the Internet (Score:2)
Pages DO NOT render faster between 64 and 128 MB of RAM, it seems. The god damn programs load faster and the OS is quite a bit more responsive, but the Internet is not any faster.
Maybe that was just in this case, though.
Re:I love when they use the Internet (Score:2)
Render times for pages and the speed of your connection could both be construed as factors in how fast you load things on the web, but after a certain point (long since passed; 500MHz+ machines) you don't need anything more.
One Word: (Score:2)
If you're running MacOS 9 with IE, don't even kid yourself, that's not browsing. That takes longer than Photoshop rendering (no lie).
Especially tables. The worst part is it locks the machine in MacOS 9. I click on
Netscape, on the other hand, has a sluggish UI, but at least it doesn't lock down my system during such routine tasks as page rendering.
Sometimes I wonder if Microsoft does it on purpose...
Real vs. perceived speed of the internet (Score:2)
For a lot of people it *does* appear to be faster, but it's not just simple CPU speed. If the average CPU upgrader is creaking along on a 1st- or even 2nd-generation PII (233-450MHz) system, chances are they also have little RAM (as little as 32MB), a much slower memory bus (66MHz on 1st-gen PIIs), a slow disk drive, a weak video card, and a creaky OS and browser.
When you move up to a contemporary machine you end up with a better Windows (2K and XP are superior to 95 and 98) running a newer browser, and more RAM, which reduces swapping; and when swapping DOES happen, it happens to a faster disk at ATA-66 or even ATA-100 speeds with little fragmentation.
All this does add up to a browsing experience that seems faster to most people. The cognoscenti realize that it's just a better machine and that the internet pipe is no better, but to the other 99.5% of the computer-using population, the internet got faster.
nitpicking (Score:2)
Gaming (Score:4, Informative)
I remember back in the day when Virtual On came out for PC and the minimum system requirements were higher than any PC available on the market, unless you had 5 grand. When the minimum amount of power required to run new software goes above the power of most people's PCs, then they'll start buying faster CPUs.
Heck, even the people who are already buying faster CPUs don't buy the fastest processor available. The money:speed ratio makes it so much more worth it to buy the second- or third-fastest AMD, even though the fastest P4 is the best you can get.
Sure? (Score:2)
Devil's Advocation (Score:5, Interesting)
However, it's important to realize that the drop in sales will also result in a corresponding drop in research.
I'm kind of not happy about that
It may have 2 very cool side-effects, though:
1) Pervasive computing may become more
2) Network/Telecom/etc infrastructure can finally catch-up. I strongly believe one of the things that caused the Internet boom was that a majority of people had access to modern telephone lines and most could scrounge up a computer. Since then, computing technology has outpaced infrastructure development (by that I mean -many- people currently still can't get xDSL, and yet your average new computer could completely swamp a T3). If things slow down and stabilize, we can again let the infrastructure mature and saturate the market, which is often the recipe needed for a new technological boom.
However, I am going to be upset if I can't buy a 32/64bit Hammer in a year at a decent cost, just because I want it
Re:Devil's Advocation (Score:3, Insightful)
Perhaps, but I think much of the research will shift to making existing technology smaller and cheaper. Perhaps next year when we are discussing the latest-and-greatest here on Slashdot, it will be part of a $299 machine at Walmart. Ok, maybe not quite that soon
Seriously though, I don't think we need all that much innovation on the desktop anytime soon. I would like to see them focus on software innovations and technology like making USB more reliable. Remember, mainstream 64-bit is just around the corner too, which will probably mean more spending once the prices reach acceptable consumer levels.
I'm still waiting for an open-source version of Visual Studio to show up on SourceForge.
I'd be buying hardware if I had the money (Score:4, Interesting)
I have an infinite appetite for more toys. The only thing preventing me from buying quad Xeons, dual Athlons and a bunch of Sparc hardware is that I'm broke. This last year has been very difficult, and I think even more so in the technology sector. If we all start getting rich off of killing foreigners or something, then maybe the demand for more power will return. In the meantime I'd be more impressed if they could show that people were spending the same (inflation-adjusted) money on lower-end hardware.
The article itself does mention the economic slump, but doesn't actually provide any real facts or data, just anecdote and fluff.
Stop working on faster processors... (Score:2)
THEN I'd be happier with the speed of my computer.
So buy a 15K rpm drive. (Score:2)
They do exist [maxtor.com], you know
Just as with 7200 rpm drives, it's just a question of time before it migrates down the foodchain, and ends up in ATA drives.
Moore's Law meets Marketing Dept. (Score:2)
Not a big surprise... (Score:2)
Re:Not a big surprise... (Score:2)
I hear that from friends. I'm in the same camp. I waited for 1 GHz PIIIs to drop below $150 so I could run Linux fanless with a Matrox 2D card. I'm a coder, not a gamer -- I want quiet!
I looked at the Cappuccino (at thinkgeek.com), but it was too expensive (yeah, I'm a cheapskate!). You might be interested in it.
Speed up but not performance (Score:2, Informative)
Looking at Tom's Hardware benchmarks, http://www.tomshardware.com/cpu/02q3/020826/p4_2800-09.html , the fastest Athlon is just over double as fast as an Athlon 850, and much of that is due to the faster FSB and RAM. I am running a 1.4 GHz Athlon and there is nothing on the market which is worth upgrading to. Why upgrade for a 25 to 40% increase in performance? Even super overclocked, none of the processors offer double the performance. If I want more real-world speed, I could spend the same money on a SCSI card (have one already) and a small SCSI disk for Windows to sit on. It's just like a car: you can only do so much with the engine, then you have to worry about the chassis, heat, and traction.
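The two figures above are consistent with each other; a quick check using only the numbers given (where the 1.4 GHz chip stands relative to an Athlon 850 is inferred, not measured):

    # If the fastest Athlon is ~2.05x an Athlon 850 and upgrading from a 1.4 GHz
    # Athlon buys only 25-40% more, the 1.4 GHz part must already sit around
    # 1.45-1.65x the 850 -- most of the gap is already closed.
    top_vs_850 = 2.05          # "just over double", per the comment
    for gain in (0.25, 0.40):  # "25 to 40% increase" from the 1.4 GHz part
        implied = top_vs_850 / (1 + gain)
        print(f"{int(gain * 100)}% gain implies the 1.4 GHz is {implied:.2f}x an Athlon 850")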
Not till I see nickel on the core…. (Score:5, Interesting)
I know it's coming... I've seen (pictures of) the engineering samples...
Wait a minute!!! (Score:2)
-a
It's 'cause we're all F@CKIN UNEMPLOYED (Score:4, Funny)
Grrr (Score:2, Interesting)
So yah, I'm looking to upgrade. . . .
Cheaper, cooler, more efficient (Score:2)
If I have the choice between a giant-ass chip with multiple fans that takes 100+ Watts vs. a slightly slower, fanless, power-sipping version, I'd definitely take the latter. I'm definitely keeping my eye on the Via C3 [via.com.tw]...
Looking for advice (Score:4, Funny)
Embarassment of riches... (Score:2)
I for one got a dual Athlon 1.4 GHz back around April, thinking it'd be really cool and fast. Instead, it's horribly underutilized. I think I'm going to hold off buying computer crap for at least a year or two.
1 GHZ is point of diminishing return (Score:5, Insightful)
Time for home uber servers (Score:3, Insightful)
I've been talking about this issue lately with a friend of mine, with whom I am trying to do some interesting home automation stuff. For those applications, a machine that runs in the many megahertz is fine based on current uses, but...
The way I see it, the computer in the home should be a lot more like HAL or any of those other scifi computing devices. A lot of processing power today goes into drawing the GUI quickly and tracking user movements on the internet and whatnot, but where's the beef?
A 3 GHz processor should be recognizing speech, figuring out who really lives in the home and who is breaking and entering, which hot spot is the family pet and which is an iron that was left on, etc. It goes without saying that a single 3 GHz CPU should meet most of the computing needs of a typical family.
I suggest that in order to sell high performance to the mainstream, something more useful than a Windows service pack will have to be available to soak up that performance (and this has already been suggested in other responses to this topic). The computer should stop being a thing that the family goes to a particular room to use... personally, I think the so-called "data furnace" or other similar approach is where the mainstream will begin to adopt this real computing power, when the home server starts doing really amazing things. Things more impressive than whatever it is WindowsXP does for people, anyway.
Voice recognition (that works very well), handwriting recognition (that works even better than that), maybe real time language translation, some simple learning algorithms, agents (web downloads should already happen automatically), intelligent security systems, family health monitoring, car-home networking... the list of applications to take advantage of this stuff is long and probably getting longer.
Next step:Quiet, cool running small PC STANDARD (Score:3, Insightful)
1. Lessen the MHz race.
2. Allocate your engineering resources to making the processor/system run cooler instead, to the point that it no longer needs active cooling (a fan) on the processor and (hopefully) the whole board/system.
3. Make a new small, low-power, quiet PC form-factor standard (or push a lesser-known existing standard, or join others) accommodating this advantage, and invite everyone else in the industry with no/minimal IP restrictions.
4. Make this combo your main production; push the other heatmakers to the niche.
(Okay, this is what VIA already tries to do, but the following item is what only Intel probably can do.)
5. MARKET IT HEAVILY. It would be easier than the current marketing based on speed, because you no longer need to deceive the customer. And it is an OBVIOUS BENEFIT to the average customer -- a small, quiet, power-saving PC with standard parts that one could leave always on without stress/anxiety -- and to industries: always on -> new usage -> new software and hardware -> new market!
6. You've just created a whole new market. Keep chugging along, 800lb!
Uh-oh (Score:3, Insightful)
Well, it was fun while it lasted.
-Paul Komarek
I'll bet they could... (Score:2)
Re:Happy but fearing (Score:2)
Microsoft is working on it... it's called Palladium.
Re:Happy but fearing (Score:2, Insightful)
We should be encouraging them!
Re:just get everyone to upgrade to win 2k (Score:2, Funny)
Three years to death (Score:4, Funny)
Consider the 3-year-old Dell 450 PII on my desk. High end when I got it, low-end now. I don't need to do any heavy processing, but some of the apps I use consume a lot of RAM, and I'm always short of disk. So I requested an upgrade.
It had almost gone through, when my boss told me that I was making things difficult by not requesting a new machine. Computers are amortized over three years (at least by anybody who pays federal taxes), and our IS department takes the attitude that a fully-amortized computer costs more to support than it's worth.
Of course, as soon as I changed my upgrade request to a new computer request, there was a purchasing freeze....
Re:Three years to death (Score:2)
However, with a lot of front-end stuff moving to languages like Java, 450MHz sucks big time so at my last assignment, we jumped to 1.6GHz Dells. Oh we use lots of RAM too but the local disks are usually almost empty (i.e., 1-2GB used) as apart from the local O/S (Win 2K pro or XP Pro), the JRE and some other local stuff like X, everything was served via the net.
Warranted by the warranties (Score:2)
They bought him a new one, latest everything, just because the warranty on his old one ran out. They didn't want to keep supporting a laptop that was no longer under warranty for them.
Re:Three years to death (Score:2)
Peter
Rama Lama (Score:2)
But what's the point in taking the RAM with you? By the time you leave, it'll be worthless, and will only work on obsolete machines!
Re:Free monitors (Score:3, Funny)
This is quite understandable (Score:2)
That machine does samba, apache, printing, xdm, and NAT. It is always responsive, although it does only serve three people. I could see it powering a large corporation (around 200 users) with a good administrator that keeps open source bloatware away from it.
I refuse to run software like KDE / Gnome (aren't these the same now on redhat?), mozilla, vim, and openoffice. I have ensured that my users are happy with twm, lynx, and emacs - with some hacked up training courses I put together.
Re: (Score:2)
Re:hrm (Score:2)
Re:hrm (Score:2)
Re:hrm (Score:2)
You shouldn't need anything more than a 400 MHz unless you're really ramping up your movie production needs.
Re:This is why Apple isn't dead (Score:2)
Well, the AMD 600 machine I built for my dad nearly 3 years ago for $650 works just fine too, running Linux and Windows XP. There is no advantage for Apple as far as aging hardware is concerned. Good luck running OS X on that G3 iMac.
Re:This is why Apple isn't dead (Score:2)
Have you tried it? It works fine. I run Mac OS X 10.1.5 on a 233 MHz G3 (granted, it does have 320 MB of RAM), and for what I use it for (Netscape 7/Office X/ssh/mp3 playing), it is quite sufficient.
I do think that the previous poster had a point, though. You don't NEED 2+ GHz for most computer use. If you did, Apple wouldn't sell any machines.
Apple loses on MHz and Price ... (Score:2)
Re:This is why Apple isn't dead (Score:2)
You're correct that they don't force average users to upgrade every year like Microsoft/Intel/AMD try to do. But Apple also has a much larger percentage of its total users who actually need a lot of power. A lot of people who are still using Macs are doing so for video or desktop publishing work, which benefits from things like dual processors and 1000+ MHz G4s. So it's worthwhile and cost-effective for them to upgrade often, paying top-of-the-line prices each time.
Re:i think it's a big scam anyways (Score:2)
Re:Oh what a surprise!... (Score:2)
Well I bought a 350hp car because I didn't see that it was worth waiting a year on the waiting list to be able to buy the supercharged version.
However I do notice that many of my neighbors have cars with equivalent capacity (4 litre or above) but give only half the power and less than half the mpg.
I just upgraded my son's computer with the cheapest components I could get from Frys that I could be confident would last a couple of years (having previously bought grotty PCs and regretted same). For $350 I got an Intel motherboard, 1.9 GHz processor, 1/4 gig Ram and a pretty nifty video card.
I agree that there is not much reason to upgrade out of need these days. Even gaming is no longer a power user's issue, since the copy of Tomb Raider: Angel of Darkness you buy in the store next week will have been developed on hardware that is already close to obsolete.
The only mass-market power application I see around at the moment is digital video editing. That will pretty much soak up cycles on anything you throw at it. But the market is still fairly specialist.
As the hardware gets cheaper, I am much happier to accept machines with everything integrated on the motherboard.
Re:And meanwhile... QWZX (Score:2)