Centrino Duo, Buy or Wait? 251
pillageplunder writes "BusinessWeek columnist Steven Wildstrom answers a reader's question on whether or not to buy a laptop with the new Intel Centrino Duo processor. The reader wanted to know if the new chip would be up to handling the graphics requirements of Microsoft's new Vista OS, and whether it would cost more. His take? Regarding price, probably not; regarding performance, right now there is no real way to know for sure. He does a decent job of outlining bug issues with new chips, and what the various vendors say about this chip."
Will it last long enough to see Vista? (Score:5, Funny)
Re:Will it last long enough to see Vista? (Score:3, Insightful)
I own 3 laptops:
- Dell Inspiron (1998)
- vpr Matrix (2002)
- Apple PowerBook (2004)
The oldest (Inspiron) had to make a daily commute back and forth to my school in Newark, and even back and forth to work for a while. Sure, there's the occasional scratch or scuff mark, but otherwise it's fine. The only problem is that the battery on the Dell Inspiron is toast,
Bloated Vista system requirements? (Score:2)
Re:Bloated Vista system requirements? (Score:2)
Re:Will it last long enough to see Vista? (Score:3, Interesting)
Laptops often use custom chipsets that require particular drivers, and often those drivers never get updated for compatibility with newer O/Ses. Upgrading the O/S in those cases becomes a fool's errand.
I'm still using a Toshiba Tecra from 2002 (4 years now). It has an upgraded hard drive and a full loa
Re:Will it last long enough to see Vista? (Score:2)
I was a little annoyed when the Versa quit out after only 7 years... if I had one quit on me after 12 months, I'd be seriously pissed.
Okay, so I am using anecdotal evidence too... (Score:2)
You sound like you are lucky with your machines, or really careful. You definitely picked out some good machines, because those are two outstanding models I didn't include on my brief list above. I have a 75 MHz Versa that is still ticking away -- even the battery works! And it spent much of its working life in a dairy barn!
I ha
Re:Will it last long enough to see Vista? (Score:2)
-nB
Anecdotal evidence is not Data (Score:5, Insightful)
It was a joke, with a grain of truth. Basically, a laptop's life expectancy is 1-3 years, and more realistically a year of serious professional duty. How long does your battery last? Over 4 hours, still? That's usually the first thing to go... How about the optical drive and floppy? Can it read every burned disc you throw at it? In my experience, and I have a shelf full of old laptops, these things probably don't work too well. Laptops die young. This is why most manufacturers have never given them a long warranty. It's probably great for hobbyist stuff, but would you still have your job if you tried issuing 7-year-old laptops as standard corp. issue?
Your seven-year-old laptop is going to be hard pressed to run XP, and I don't think any sensible admin is going to want a 98 book in the wild with sensitive data. How many minutes would it take me to own your computer if it's hooked up to the internet? If you really want to extend your laptop's life, get a copy of Solaris on the thing. I am running Solaris 8 for Intel on an old stinkpad of the same vintage, and it is as good as XP on a new machine with a gig of RAM.
Now that I have explained the premise of the joke and expressed my sympathies with your concerns, I will continue with the punchline... How long has MS been telling us they are coming out with Longhorn, now Vista? A dang long time.
In reality it might come out this year, but it might be another year or two at the rate things are going. It's been delayed for easily a good three years now. See, that's why it is funny. If you had bought a laptop for Longhorn/Vista when it was supposed to be released, it'd probably be dead right now, especially if you bought a Gateway, eMachines, HP or Sony. In any case, it'd be slow and underpowered.
And yeah, you're better off waiting for the OS to be released and getting a machine made for the OS, because if the graphics card doesn't work, you're not going to be able to swap it out... and there are a lot of components that might be questionable under the new trusted computing/closed A(nalog)-hole/DMCA/**AA design Microsoft is going for. Your best bet would be to wait. If you need a laptop, buy a $500 Acer (they have a great warranty and build good gear) and save your money for the machine you really want.
And the name of my laptop? Why I use an Aristocrat!
Re:Anecdotal evidence is not Data (Score:3, Informative)
We have numerous laptops that are 3-5 years old and still run WinXP/Win2k just fine. Mostly because we made sure to max out their memory configurations (either with 512MB or 1GB of RAM). Heck, my system is a 1GB Tecra that is from early 2002 and I still use it 12-15 hour
Requirements (Score:5, Informative)
System Requirements:
Minimum system requirements will not be known until summer 2006 at the earliest. However, these guidelines provide useful estimates:
512 megabytes (MB) or more of RAM
A dedicated graphics card with DirectX® 9.0 support
A modern, Intel Pentium- or AMD Athlon-based PC.
So, I am guessing that a Centrino will fly.
Re:Requirements (Score:2)
P.S.: Anybody seen a review of a totally silent desktop PC based on the Duo chip?
Re:Requirements (Score:5, Insightful)
If you really, really need a new computer now, buy one now.
If you don't, don't.
No matter what, there will be something new computers can do next year that the one you buy today can't do. C'est la vie. Don't buy computers you don't need, and this will never be a problem.
Re:Requirements (Score:2)
Re:Requirements (Score:2)
Vista would have been developed and tested on hardware that's less capable than what will be released today or tomorrow, so why is this even a question?
Re:Requirements (Score:2)
Am I the only one who is really bothered by this requirement from an OS?
Re:Requirements (Score:4, Funny)
-nB
Re:Requirements (Score:3, Interesting)
Re:Requirements (Score:2)
The issue is not the processor but the graphics capabilities of the i945M chipset (with integrated graphics and shared memory).
Given that Vista supposedly uses CG and advanced graphics a lot, the guy wonders whether the 945M will be able to give you a "full Vista experience" compared to a standalone graphics card.
Re:Requirements (Score:2)
And my question is still... what would possess a fellow to upgrade the Windows O/S on a laptop? There are no must-have features in Vista (nor were there for WinXP). These aren't Macs where it's cool to constantly upgrade to the latest version of OS X.
Re:Requirements (Score:2)
I don't see which Vista feature is going to push me from 2000 / XP to it.
CPU maybe, but video? (Score:2)
Re:Requirements (Score:2)
Does that still count as flying?
Re:Requirements (Score:2)
I also used a newer Gateway recently with a dual-core Intel desktop chip and onboard Intel Extreme Graphics. It was so extreme it crashed trying to run OpenGL screen savers (really slick screen savers).
Re:Requirements (Score:3)
That is only a requirement if you want to run the Aero user interface (it must also support Windows Display Driver Model). I can't believe I haven't seen any "Score:3+" comments mentioning Vista's "Classic" UI mode, which doesn't require a powerful GPU. In fact, it looks a lot like Windows XP with its "Luna" interface deactivated [wikipedia.org]. According to that Wikipedia article (don't use as a final source), Vista's "classic mode" only has the same graphics c
New Duo Prices for Dell (Score:5, Informative)
Re:New Duo Prices for Dell (Score:2)
Re:New Duo Prices for Dell (Score:3, Informative)
I'm actually hoping to get a la
Re:New Duo Prices for Dell (Score:2)
I went through and spec'd a 17" Inspiron: Core Duo, 1 gig of high-speed memory, a 60 gig hard drive and a DVD burner. After the $400 off, it added up to right at $1300. Pretty sweet deal if you ask me.
Centrino Duo: Buy or Wait? (Score:5, Funny)
Long answer: Wait
Bad Move (Score:2, Informative)
Re:Bad Move (Score:2, Informative)
Re:Bad Move (Score:2)
And when it comes, there'll be a mad rush for 64-bit boxes, and everyone'll be wishing they'd made the jump sooner. Face it, it's only a matter of time, and time runs pretty fast in this business. Might as well get your bitness on now, so you can complain about how slow your computer runs the newest stuff, instead of complaining about how it won't run it at all.
Re:Bad Move (Score:3, Interesting)
http://www.linuxhardware.org/article.pl?sid=05/02/ 24/1747228&mode=thread [linuxhardware.org]
On AMD processors, POV-Ray seems to experience a 25% performance improvement by going 64-bit. If you were rendering lots of complex scenes, a 25% performance improvement merely by switching from a 32-bit to a 64-bit OS is incredible.
Especially if you are a POV-Ray buff; the 64-
Re:Bad Move (Score:3, Insightful)
In a s
Re:Bad Move (Score:2, Insightful)
Re:Bad Move (Score:2)
Win64 is not the only 64-bit operating system out there.
Unlike them, I agree. (Score:2, Insightful)
Re:Bad Move (Score:2)
Which world would that be? Personal computers are moving to 64-bit at about the same rate that IPv6 is being adopted right now: glacially.
Re:Bad Move (Score:2)
How about this one? By 2H2006 it appears that virtually all of Intel and AMD's processors will be 64-bit, and likely with virtualization technology.
And the NX bit, the major new security feature, doesn't seem to be getting backported to 32-bit architectures.
Re:Bad Move (Score:5, Insightful)
RAM will be the deciding factor for when we move to 64-bit processors.
Don't believe me? Ask yourself this: why did all of the big iron server customers want a 64-bit chip years and years ago? So that they could saturate their servers with multiple gigs of RAM. CPU architectures might change almost day to day, but RAM architectures usually last a long, long time, and as time passes, prices go down. So that big iron server you purchased with 4GB of extremely expensive RAM at the time can now be saturated with 16GB of dirt-cheap RAM and still be in the top 80% performance bracket.
How does this translate to home users? When home users hit the 4GB limit and need to go beyond it, then and only then will we see a desktop push to 64-bit. And we've still got a lot of ground to cover until then; some top-end computers are running 4GB now, but by and large 512MB is the standard, with 1GB now being the recommended RAM total. RAM-scaling-wise, I predict we won't hit that "need for 64-bit" number until 2009, but by 2008 or earlier, all desktop CPUs will be 100% 64-bit anyway.
How does that tie into today's discussion? Perfectly: by 2008, your laptop will be obsolete, that's a given. So a system purchased now will likely carry you until the 64-bit revolution. All in all, this means that 64-bit is a non-selling point for a laptop consumer at this date.
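As a back-of-the-envelope check on the 4GB figure above, here is a minimal Java sketch of the addressing arithmetic (the class name is made up; the numbers are pure arithmetic, not from the article):

```java
public class AddressSpace {
    public static void main(String[] args) {
        // A 32-bit pointer can name 2^32 distinct byte addresses...
        long bytes32 = 1L << 32;
        long gib = bytes32 / (1024L * 1024 * 1024);
        // ...which is exactly the 4 GiB ceiling home users will eventually hit.
        System.out.println("32-bit address space: " + gib + " GiB");

        // A 64-bit pointer raises the theoretical ceiling to 2^64 bytes,
        // i.e. 2^(64-60) = 16 EiB, far beyond any RAM you can buy.
        long eib = 1L << (64 - 60);
        System.out.println("64-bit address space: " + eib + " EiB");
    }
}
```

In practice chips expose fewer physical address lines than 64, but the 2^32 = 4 GiB wall is the hard limit that drives the prediction above.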
re: exactly! (Score:2)
But yeah, precisely. Despite all the hype over 64 bit, it doesn't necessarily make code run any faster than it can on a good 32-bit CPU. The only "tangible" advantage is the ability to manage more system RAM. As developers have said repeatedly, 64-bit applications require shuffling around larger numbers, and only in specific instances does 64-bit give you a speed advantage with your code.
I also predict that before we start seeing the
Yes, but Vista changes everything... (Score:2)
Re:Yes, but Vista changes everything... (Score:2)
That user is unlikely to change the O/S on that notebook in the next 2-3 years.
Users who change O/Ses on notebooks are more likely to be tech literate and unlikely to pick Vista to run on an old notebook.
Re:Bad Move (Score:2)
Power users get a new laptop every 2 years or so. Their old one gets reassigned to an average user and the average user's laptop gets handed off to a light user.
That "light" user's laptop can still handle basic docum
Re:Bad Move (Score:5, Informative)
In 32-bit code where SSE optimization is implemented, a lot of 64-bit gains disappear. This is particularly interesting for the Mac since their baseline Intel spec will always have at least SSE3, so all apps can target it from now on. Doing 64-bit math doesn't require a 64-bit chip either, as SSE goes up to 128-bit. The real reason you'd want 64-bit is if you're running a server that needs a very high amount of memory.
64-bit gaming has been the most amusing to me, watching as Crytek and AMD teamed up to sell more chips and desperately advertised 64-bit Far Cry as better than its 32-bit version by adding higher-res textures here and there and tweaking the visuals, even though absolutely none of that has to do with being 64-bit and everything to do with your video card. 64-bit Half-Life 2 is actually slower than its 32-bit version according to the benchmarks. Slashdot has an article in its archives about how 64-bit gaming has been overhyped to gamers.
There are times I wonder if 64-bit will die as a fad this year and become an unused set of instructions that only server admins use. It's certainly got all the makings of a tech fad. I think the novelty is wearing off and people are realizing 32-bit is just fine and that there is nothing inherently better about being 64-bit, other than giving AMD and Intel a marketing reason to sell you new chips. I can't think of any reason a desktop computer user today needs a 64-bit chip. Microsoft, of course, is very vocal about wanting to put everyone on 64-bit chips, and the reason is that the majority of Windows sales come from pre-installations on OEM computers, so if they can convince people to buy new computers with new chips in them, they sell more copies of Windows. I think they'll have as much success with that as they did with the Xbox 360 launch. Ahem.
As a side note, Apple handled 64-bit in OS X Tiger by keeping the GUI 32-bit but allowing 64-bit processes to be spawned in the background. This means your app is 32-bit but you communicate with a spawned 64-bit console process (it has to be a console process because the GUI libraries are still 32-bit code). It's so little used that it took a while for anyone to notice when one of the 10.4 updates accidentally disabled 64-bit support...
Re:Bad Move (Score:2)
Do you still have an 80286 around as well while waiting to see if 32-bit processing is more than just a fad?
It can't run 64-bit Windows Vista (Score:3, Informative)
Rather glaring omission by BusinessWeek.
Re:It can't run 64-bit Windows Vista (Score:4, Insightful)
It can't run 64-bit Windows Vista and the Intel GPUs the Centrino Duo notebooks usually use are very poor.
Nothing can run 64-bit Windows because the existing versions suck so badly with driver and software incompatibilities. No one I know with a 64 bit processor is running a 64 bit version of Windows on it anymore. Everyone has given up and switched back. Vista will support 32 bit for longer than most laptops will last and I don't see any reason why someone would switch in the foreseeable future for their laptop.
As for graphics, what the hell are you talking about? There are a handful of Centrino Duo machines for sale right now, and looking at the selection I see both ATI and nVidia graphics cards in them. Acer ships with ATI and Sony with nVidia.
Do you enjoy misleading people by making crap like this up, or are you just very misinformed?
Re:It can't run 64-bit Windows Vista (Score:2)
http://www.nvidia.com/object/winxp64_81.98.html [nvidia.com]
http://www.nvidia.com/object/nforce_nf4_winxp64_am d_6.69.html [nvidia.com]
https://support.ati.com/ics/support/default.asp?de ptID=894&task=knowledge&folderID=367 [ati.com]
http://h10010.www1.hp.com/wwpc/pscmisc/vac/us/en/s m/network_software/universalprintdriver_overview [hp.com]
Re:It can't run 64-bit Windows Vista (Score:2)
you should have no problem finding 64 bit drivers for most modern hardware
I've heard the same complaint from a sysadmin, a photographer, and a security expert. Brand new hardware does not have drivers for Win64, even, in one case, when the vendor claims it does on their website. They also report an image-processing program and two databases that either will not run at all or crash unacceptably often when running 64-bit, but run just fine on 32-bit.
Linux is beside the point, since the question was specifically about Windows.
Re:It can't run 64-bit Windows Vista (Score:2)
When mentioning nVidia, he was talking about Turion systems, not Centrino systems.
Umm, sure, the part where he said, "the Intel GPUs the Centrino Duo notebooks usually use are very poor" should have tipped me off to the fact that he was talking about Turion systems. You can get the same video cards in either system, and claiming Centrino Duos should be avoided because the nVidia cards in the Turion are better is just plain wrong. He tried supporting his statement with an untrue example. Get a clue miste
Re:It can't run 64-bit Windows Vista (Score:3, Interesting)
Who cares? Who has a retail copy of 64-bit Windows Vista lying around? Oh, who's that? Nobody? Well then. And who will have a copy in a year? Who's that? Hardly anyone? That's right. Face it, 64-bit will be slow to adopt until we truly hit the 4GB RAM barrier (right now we're averaging right under the 1GB mark; most PCs ship with 512, most recommend 1GB), and Vista will help that push, but we won't likely see a need for 64-bit Windows/OS X arrive until 2008 or later, w
Re:It can't run 64-bit Windows Vista (Score:2)
64-bit Linux shows about a 20-25% performance improvement on math intensive apps versus the same configuration on 32-bit Linux.
I guess part of it is that 64-bit Linux is pretty painless, compared to 64-bit Windows.
Re:It can't run 64-bit Windows Vista (Score:3, Insightful)
As someone else here also mentioned, all the people I know who were running 64-bit Windows gave up and now run the 32-bit version. Guess what, it's faster for them and runs better. There is little inherently be
Intel integrated graphics RAM usage? (Score:3, Interesting)
Merom (Score:2, Interesting)
Re:Merom (Score:3, Informative)
Maybe just wait a bit? (Score:2)
Sure it will be able to handle Vista, but it w
Posting from (Score:3)
Re:Posting from-Excuse Me (Score:2)
Like that's never happened?
Re:Posting from (Score:3)
And that's different from previous Windows versions how exactly?
6 months (Score:5, Insightful)
Re:6 months (Score:2)
Re:6 months (Score:3, Interesting)
I found the opposite with CRTs a couple years ago. My 19" $150 monitor died after a year, and was now going for $200. No sales or rebates involved. I thought it was maybe just a fluke, but other monitors of various sizes all went up around $50 as well.
More recently, I've been looking for a DVB-S card (satellite). It's incredibly annoying to read a post from 2 years ago a
Sage advice says: (Score:2)
Then, be prepared to put it on the inheritance
Re:Sage advice says: (Score:3, Informative)
Yeah, if only we had something that let us work on two different programs at the same time. Oh, right, we do; it's called a multitasking OS. Even if you don't do anything like ripping CDs, chances are good that you're running multiple widgets, all doing their things at the same time. You're checking emails, running an RSS gatherer, indexing your
Re:Sage advice says: (Score:2)
Think of your processor as a road, with each core being a lane. On an empty road, you're limited by the speed limit (theoretically). Adding a lane does nothing whatsoever for you. Now add one guy turning left in front of you; on a 100-mile trip, it's statistically insignificant, but a two-lane road would have meant that you wouldn't have had to slow down at all, and a single-lane road might have broug
This presumes many things that might not be true.. (Score:2)
It's a beautiful day.
And NOTHING guarantees this at all. Indeed, job queuing is pretty much random unless the OS has native tendencies. You won't get an even job distribution among the processors, except by luck, and perhaps ph
Bad guess (Score:2)
You must work for a chip or hardware vendor and have rose-coloured glasses. I'm a critic.
It may feel faster, but in actuality, the work performed is never 2x or even close. Yes, faster. 2x? Not even close. This isn't to say that multi-core is a bad idea for performance. But realistically, it's not the panacea that some think it is. There's graphics subsystems, internal bus chip
Wait (Score:2, Interesting)
Comment removed (Score:3, Informative)
Bah (Score:5, Insightful)
I had someone say that a Dell rep told them that they really should get that Hyper-hot $350 GeForce ultra-platinum video card, because she'd need it to retouch photos on the computer. That's pretty reprehensible IMHO. A $30 graphics card or mainboard graphics would have done just fine. I say they practically stole $300 from her.
Sorry for going OT.
It's simple. (Score:2)
Second core doesn't help much (Score:5, Insightful)
Overall, the only thing I've really noticed that is significantly faster is Java. Most Java apps use threads, and if nothing else the GC seems to run on the 2nd CPU. For example, the graphics demo takes 100% of both cores if you set the delay to 0ms between frames. That's about the only program I've seen actually use both cores.
As a side note, I predict that with more cores we will see greater use of things like Java. It may run at, say, 80% of C speed, but 80% + 80% is still much more than 100% on one CPU and 0% on another.
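To make the parent's point concrete, here is a minimal Java sketch (my own, not from the thread; class name and workload are illustrative) of the kind of trivially threaded work a second core can absorb:

```java
public class DualCoreDemo {
    // CPU-bound task: sum the integers in [from, to).
    static long sum(long from, long to) {
        long s = 0;
        for (long i = from; i < to; i++) s += i;
        return s;
    }

    public static void main(String[] args) throws InterruptedException {
        final long n = 10_000_000L;
        final long[] partial = new long[2];

        // Split the work in half; on a Core Duo the scheduler can run
        // each thread on its own core at the same time.
        Thread a = new Thread(() -> partial[0] = sum(0, n / 2));
        Thread b = new Thread(() -> partial[1] = sum(n / 2, n));
        a.start(); b.start();
        a.join();  b.join();

        System.out.println("sum = " + (partial[0] + partial[1]));
        System.out.println("cores seen by the JVM: "
                + Runtime.getRuntime().availableProcessors());
    }
}
```

On a single-core machine the two threads just time-slice; the code is identical either way, which is why threaded runtimes like the JVM pick up a second core essentially for free.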
Re:Second core doesn't help much (Score:2)
It's night/day with me. A dual core system feels so much more responsive, effortlessly gliding from one application to the next as one is entirely isolated to one CPU and the other to the other.
Of course, I don't own a DC system anymore; the last dual-processor system I owned was a dual 500 MHz Pentium III system (might've been a Xeon, I can't recall; it was my Dell) and I miss it. If it could run today's applications at any kind of speed I would still have it, as it was such
Re:Second core doesn't help much (Score:2)
I have a pair of Opteron systems at my desk (in addition to the laptop). The dual Opteron 246 unit almost never has responsiveness issues, the Opteron 148 is constantly waiting.
Just wish the prices on the 265/270s would drop to something reasonable (say $200 each)... I "want" to upgrade my dual 246 to a pair of 270s.
Re:Second core doesn't help much (Score:5, Interesting)
However, the CLR does not have the potential to be as fast as Java on a multi-core processor since, due to its native code interface and unsafe code, the GC can run less often in parallel with the other threads (it blocks significantly more often getting access to pinned object memory). Also, CLR-based applications have fewer opportunities for HotSpot-like optimization due to their bytecode format being difficult to interpret efficiently; Java can run an optimizer on another core and get more use from the other processor in that way, and faster code. In addition, betas of the new JVM put temporary objects on the stack automatically (often detected as a result of optimization). This also allows the GC to run in parallel more often (.NET can only do this with value classes, i.e. structures, that the programmer has to explicitly declare... much like 'register' or other archaic attempts at optimization).
Vista Reqs (Score:3, Informative)
Buy based on what you need.... (Score:2)
And once you do buy something, don't check prices.
and will it run linux? (Score:2)
I want to run FC 5 on it, although I haven't figured out whether the Linux kernel supports it yet. Anyone know?
Re:The snail (Score:5, Informative)
Centrino != Celeron
The processor used with the Centrino chipset is a Core Duo, exactly what Apple is using.
Re:The snail (Score:2)
Well, Centrino is in fact a full platform/certification (at least processor + chipset + wireless chip, probably a few other things too), so I'm not sure.
Not that Apple cares; Centrino is useful as a brand/quality indication, and Apple won't use it.
Vista will run - GPU is needed, not CPU (Score:3, Informative)
The demanding requirements of Vista come from its Quartz clone, Aero Glass. This is like Apple's Quartz, only pure XML-based instead of Adobe PDF-based (an XML/Forth hybrid/melange).
In doing so, it is between 500% and 1000% less efficient, requiring the highest-end GPUs, with a minimum of 128 MB of VRAM.
In the end, it accomplishes little more than Quartz - with the exception of easier X-style remote window in
Re:Vista will run - GPU is needed, not CPU (Score:3, Insightful)
My laptop has dedicated graphics that consume significant power when clocked at full speed. To save power, my machine underclocks the graphics module when the performance isn't needed, sort of like PowerNow/Speedstep for the GPU. I understand that Nvidia cards actually power down some of their 3D circuitry when it's not needed, as well.
If Vista requires the GPU to be fully powered up all the time, that could put a significant burden on laptop ba
Re:The snail (Score:3, Insightful)
Apple has a pretty good track record of:
(a) Managing switches to new architectures in an efficient (seamless?) way;
(b) Dealing with recalls, upgrades and problems; i.e.: they have good customer support.
So I would say if you want to be "bleeding edge" in this case, do it with Apple, who will "hold your hand" and smooth out many of the rough ed
Re:The snail (Score:3, Informative)
Re:The snail (Score:4, Insightful)
Re:The snail (Score:3, Informative)
Except that it's not available yet. And even when it will be, Intel ports of Mac software will still be mostly missing in action (unless all you need is basic stuff from Apple) - and no, Rosetta does not always cut it, heck, some programs can't even run under it at all[*]. So depending on your needs, a MacBook might just be a slick brick for a while. The key concept here is think before you buy.
[*] preempti
Re:The snail (Score:2, Interesting)
Re:Works fine with OS X (Score:5, Insightful)
The Vista GUI (if I recall) is going to rely on DirectX 10 (or whatever version). In theory, so long as ATi and nVidia keep up and their cards have good DX10 implementations, the CPU shouldn't matter as much. Of course, it may not just be a matter of how "graphics intense" the two OSes are - it depends on how efficiently they are implemented. OS X is well built. Vista, we'll have to see when it comes out.
Re:Works fine with OS X (Score:5, Informative)
OS X uses some OpenGL stuff and a lot of 2D compositing. It doesn't totally bury the system, however, and it can move a lot of that to software rendering as well; that's why it works just fine on my PowerBook with a GeForce FX 5200 with 32 MB of VRAM.
Vista, on the other hand, uses boatloads of 3D, everywhere. Lots of texturing. The main issue with Vista is not having enough graphics ram. For the full "Avalon" "experience", you'll need 256 MB in a 32-bit environment [extremetech.com], and possibly more in a 64-bit environment. Fill rates will also be important, in order for you to keep your windows flying around the screen in 3D.
God knows why so much is needed; Project Looking Glass provides a similar display with far more modest requirements, and that's a Java window manager. Not to mention that Xorg is getting really, really close to a lot of these things. Xgl is currently running with all kinds of interesting shader/geometry effects [digg.com], and KDE's got window manager refraction/reflection (take a look at CrystalGL, the big cousin of Crystal, which does it in software).
Ultimately, Linux will get there, but the problem is integration; most of these features are available on X, but few of them play nicely with OpenGL, and they often don't play well together. We'll have to see a big, combined push between the KDE 4 effort, GNOME's next-generation Metacity, the freedesktop XGL/Xorg 7+ people, and NVIDIA/ATI. As I understand it, much of this is occurring now, but we probably won't see releases till near the time Vista is released, and we won't see proper integration into distributions till late 2006/early 2007.
The best part is, however, that once it DOES get into Linux, it'll run just fine on 32/64 MB cards, and most likely will degrade much more gracefully than Vista; there'll be a finer-grained set of options that can be turned off individually, rather than 3 or 4 main settings.
I have no fear that we'll see plenty of desktop eye candy in the near future on Linux; this is mainly attributable to the freedesktop people, who have saved X with Xorg, a product that is making progress now after years and years of stagnation.
I'm much more worried about DirectX 10 (WGF 2.0). Will OpenGL keep up? I hope so, otherwise we'll see the few Linux/Mac gaming houses there are out there (in addition to Transgaming) fail completely as they become unable to port over Windows graphics features. NVIDIA, ATI and Apple seem to be keeping the OpenGL group moving, though.
Re:Works fine with OS X (Score:3, Informative)
http://forums.gentoo.org//viewtopic-p-3081186.htm
Re:Works fine with OS X (Score:2)
I apologize for more of the tired M$ bashing, but I don't see how Vista could possibly be well built. Considering that it's based on a Windows code base with 20 or so years of legacy code that was a hodgepodge to begin with, combined with the 5-year-old code base for just the current revision, it seems very, very unlikely that it could be well built. Most likely it was a huge pool of great ideas that didn't work and a last-ditch effort
Re:Works fine with OS X (Score:2)
Re:Works fine with OS X (Score:5, Insightful)
RTFA, the /. headline is stupid and misses half the facts. The article is about the i945M integrated graphics, not the Core Duo itself, and whether the integrated graphics will be able to handle the load of Vista.
The x86 iMacs are bundled with ATI's X1600 and the MacBook Pros have an ATI Mobility X1600; they're not using integrated graphics from the chipset.
Re:my source of free PCs is going to dry up. (Score:3, Funny)
I heard many people just buy a new PC instead of having their old one disinfected. Why? It costs about the same.
You either pay $500 for labor or $500 for a new PC.
Re:Whatever you do... (Score:2)
If you wait for the next best thing, all you'll ever do is wait. There is always something better around the corner.
In honesty though, today's computers don't change that frequently anymore. Once a die-hard computer speed freak and serial overclocker/upgrader, I have used the same computer system for 3 years, and only now am I starting to decide if I want to upgrade, because I want to start storing my DVD collection on my computer for easier
Re:After checking Core Duo specs, the verdict is (Score:2, Informative)
Come over to the shiny side!!
Re:After checking Core Duo specs, the verdict is (Score:3, Insightful)
Integrated graphics are good enough for just about everything but gaming. Most laptop buyers actually use their laptops for work, surfing the net, email, etc. Longer battery life is more important than frames per second to that large market segment.