It's Official - AMD Buys ATI 508
FrankNFurter writes "It's been a rumour for several weeks, but now it's confirmed: AMD buys ATI. What implications is this merger going to have for the hardware market?" In addition to AMD's release, there's plenty of coverage out there.
Tomorrow (Score:3, Funny)
Re:Tomorrow (Score:3, Insightful)
*shudder*
Re:Tomorrow (Score:4, Insightful)
Re:Tomorrow (Score:3, Insightful)
I hate how people write off ATI and Nvidia as Open Source scrooges since their drivers are closed. The reality is that their code isn't all home grown and they couldn't open source it even if they wanted to. The copyright and patent holders on their licensed technologies wouldn't let them.
Comment removed (Score:5, Insightful)
Re:Tomorrow (Score:5, Insightful)
1. They have large enough staffs to decompile and perform clean-room reverse engineering of NVidia's drivers: e.g., one team analyzes the decompiled code and takes notes (without copying code, of course), while another team designs and implements improvements based on that analysis.
2. Their competitors own electron microscopes, making analysis of the chip internals relatively simple.
Now tell me: why are the likes of NVidia and ATI keeping their products undocumented and their drivers closed?
And to counter your argument: what happens in two years when ATI and NVidia decide your card is too old to support, and yet it still performs very well but you NEED the features in the latest kernel and latest x.org? Go ahead, buy a new video card -- oops, nope, sorry, they changed slot specs again, and PCI Express cards are no longer available because PCI-X finally gained market share in the consumer market and PCI-E ended up as short-lived as VLB did in the VLB vs. PCI war.
(do I expect PCI-E to die? No, it was a hypothetical example showing the potential problem with proprietary drivers)
Think about what you just said (Score:5, Insightful)
Because, if they DO PROTECT THEIR IP, The OTHER GUY has to waste TONS OF MONEY on reverse-engineering teams and highly-qualified people to reverse-engineer the processor via electron microscopes.
It's not the EQUIPMENT that is expensive, it is the PEOPLE. And, as you Linux zealots know FULL WELL, reverse-engineering is EXPENSIVE in terms of PEOPLE and TIME.
If you publish the specifications of your latest graphics chip for all to see, suddenly your competitors don't have to divert staff from working on next-generation architectures just to reverse-engineer your system. Instead, they can analyze your documentation in a fraction of the time.
It's a two-way street, so stop deluding yourself that there's only one side to the story. Publishing full specs for your graphics chips is like writing your competition a blank check. Intel is the only one that doesn't have an issue doing this, because their graphics technology is always a step behind anyway.
And to counter your argument: what happens in two years when ATI and NVidia decide your card is too old to support, and yet it still performs very well but you NEED the features in the latest kernel and latest x.org? Go ahead, buy a new video card.
Yes. There are still many well-supported video cards sold in AGP. In fact, you can still get well-supported video cards in PCI, a fifteen-year-old technology. They're not top-performers, but beggars can't be choosers.
The video card market is transitioning to PCIe with surprising speed precisely because they do not want another VLB fiasco. The PCI -> AGP transition was slow because PCI still had a future for other types of cards, but the AGP -> PCIe transition was rushed to avoid market confusion. You can still buy plenty of AGP cards, but the big players have made it clear: there won't be any more improvements for AGP.
Re:Think about what you just said (Score:5, Insightful)
Bullshit, sorry. We don't want their beloved silicon blueprints for their latest GPUs, just information on how to make them work. Want to draw a polygon? Send this command to the card. Do hardware T&L? This other one. You can learn only so much from driver source code or technical specifications on how to program a GPU. Don't believe me? Check the information released by both nVidia and ATI for their older GPUs, and see how much you can infer from them.
Re:Tomorrow (Score:5, Informative)
Re:Tomorrow (Score:3, Interesting)
It's amazing how so many slashdotters who work in the "real" world are purposely argumentative solely for the sake of being dif
Re:Tomorrow (Score:5, Interesting)
If they both buy graphics chipset companies, does this mean NVidia's technology ends up in ATI GPUs and vice versa?
Or will they shield the newly acquired tech from the settlement?
Re:Tomorrow (Score:5, Informative)
As far as I can tell this deal only covers patents made before 2001 (section 2). I could be wrong though, not very good at legalspeak, and didn't read the entire contract. AFAIK they have another cross-licensing agreement as well, but it only covers all x86 extensions and improvements. This is the deal that you're probably talking about as SSE and AMD64 are x86 extensions. So to answer your question: no they would not need to share tech acquired from ATI.
x86-64 is not part of the IP sharing (Score:5, Informative)
Once AMD got Microsoft's cooperation building support for x86-64 into Windows, they harped on about the open standard. This protected AMD from Intel, who were already secretly working on their own implementation of x86-64. Ordinarily, once Intel realized how potentially powerful x86-64 was, they would have been sure to create their own incompatible version (a la SSE and 3DNow!) to try and derail AMD.
But the open standard stopped Intel from doing this. Microsoft pointed to the open standard, and told Intel flat-out that they were not going to support two versions of 64-bit x86.
x86-64 is an open standard. AMD's copyrighted implementation of x86-64 is called AMD64. Intel's copyrighted implementation of x86-64 is EM64T.
Re:x86-64 is not part of the IP sharing (Score:3, Informative)
Windows used to use some really moby hacks with thunks to get 16-bit libraries working with 32-bit code, but they don't use it for NT, and opted for virtualization (WOW/NTVDM) instead. It's not perfect virtualization, but it's enough to count. Presumably they do the
Re:Tomorrow (Score:3, Interesting)
EM64T [wikipedia.org] is Intel's version of x86-64[1]. It is slightly different from AMD's implementation, but most code compiled for one will work on the other.
Can we please stop calling it AMD64? It's a small number of extensions to IA32 - smaller than SSE - and AMD intro
Don't really know.. (Score:5, Interesting)
But on the other hand, this could split the market and give us something like today's incompatible browsers. (Which is VERY annoying sometimes.)
And we have a psychic [slashdot.org]
Re:Don't really know.. (Score:4, Interesting)
Re:Don't really know.. (Score:5, Insightful)
The AMD fans/nerds are more Linux-minded than Intel's (IMHO), and AMD probably knows this. They could really strike a business blow by releasing this in the spirit of open source.
Re:Don't really know.. (Score:3, Insightful)
AMD knows that, whatever market share it has in the desktop arena, Linux is a major player in the HPC and 2P+ spaces and knows that L
Re:Don't really know.. (Score:2)
Re:Don't really know.. (Score:2)
Re:Don't really know.. (Score:2)
It's a bit of an odd buy, considering nVidia's chipsets contributed so much toward the rise of the Athlon and are still (AFAIK) the performance leaders chipset-wise. But I guess if they merge their teams (or at least make them consult each other a lot) we could see some performance improvements.
I wouldn't mind seeing a mini-GPU inside the CPU dedicated to massive vector ops - it would certainly blow the pants off SSE2 for larger datasets (video/sound codecs?). Of course stuff can already use the GPU but I be
Re:Don't really know.. (Score:4, Interesting)
could be good.. (Score:5, Interesting)
also, not official yet, as government regulatory bodies need to approve it.
It WILL Be Good! (Score:5, Insightful)
Since (in my opinion) NVidia has taken the lead in GPUs, I hope that ATI will be boosted back into a competitive state and price wars ensue.
Again, to me this is nothing but great news for the end-consumer.
Re:It WILL Be Good! (Score:2)
2 years ago it looked like Nvidia was dead meat, now they've come back strong. I only get worried when one of these companies can't get their sh*t together for 2 or 3 generations in a row...then you know they've stagnated.
Probably Not Going To Happen... (Score:5, Funny)
Nah, weren't you reading yesterday? ... (Score:4, Interesting)
On the other hand, releasing either open source drivers, or a combination of binary drivers, along with documentation (so those who want to write their own CAN), would certainly be the best of both worlds.
Driver code not the issue (Score:4, Interesting)
The drawback would be a lockout for experimental 3D APIs. But it would be no worse than the binary driver situation we have now.
Maybe (Score:2)
System on a chip or at least integrated GPU and CPU cool.
I just wish it was Nvidia.
Re:Maybe (Score:5, Funny)
A die holding an AMD core and an ATI GPU may be 'neat', 'fab', 'brill' or even 'ace' - but 'cool' - I think not!
Re:Maybe (Score:5, Insightful)
Probably because most Slashdotters are not driver hackers nor OSS purists, they are developers, gamers, and power users -- and Nvidia's hardware (and driver support for the hardware) is phenomenal.
Your gripe is not baseless, though: would it kill Nvidia to open up a bit? Perhaps the renewed competition will encourage them to do so, although it's equally likely that they will take the opposite tack and circle their wagons ever more tightly. As long as they provide excellent binary drivers for Linux, I doubt that they will feel much incentive to go Open Source...
Re:Maybe (Score:5, Interesting)
I could swear that's the way it is, but I can't find any definitive reference to the settlement.
Re:Maybe (Score:3, Interesting)
Re:Maybe (Score:3, Funny)
Re:Maybe (Score:5, Informative)
Because they've supported Linux with binary drivers for a long time, and their drivers work.
ATI is months behind, and half of the time the drivers are too buggy to actually use.
Philosophy of openness aside, that's an important difference.
Re:Maybe (Score:3, Funny)
Go into your xorg.conf and change the "Driver" line to "vga" or "vesa".
You now have an unaccelerated frame-buffer display that will be as stable as the day is long. It will also suck donkey cock, but it is exactly what you just asked for.
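For anyone who hasn't poked at it before, here's a minimal sketch of the xorg.conf section being described (the Identifier string is a placeholder; match whatever your existing config uses):

```
Section "Device"
    Identifier "Card0"
    Driver     "vesa"    # unaccelerated fallback; "vga" is even more bare-bones
EndSection
```

Restart X after the change for it to take effect.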
I'd say I told you so.... (Score:5, Funny)
http://www.theinquirer.net/default.aspx?article=3
-Charlie
AMD & ATI (Score:5, Funny)
DAAMIT!
This is a very good thing. (Score:5, Interesting)
*head asploded*
I'm getting the 'gist' of why this transaction needs to happen. AMD needs GPU functionality on the CPU. I think everyone kinda expected that to happen at some point. The Inq then takes a left turn in the plot and mentions 'mini-cores', which are multi-cores running a massive number of threads. Sort of, but not really, like Intel's hyperthreading times 32. Shitloads of threads.
Bottom line?
ATI will work on AMD's new cores. I don't know if they'll work on something that'll plug into a PCIe slot still like nVidia.
nVidia will still be around making graphics cards for AMD. They just won't necessarily be anything remotely similar to what's in stores today. AMD doesn't like closed technology the way Intel does. So it'll still be an open platform, which is a 'good thing' (tm).
Forget about GPU's and chipsets. The main innovation has to come from these new GCPU's.
ATI was going to lose its Intel chipset business anyway with or without this takeover. So no big loss here.
Intel has about a year's lead on this tech and will probably be first to market with it.
CPU cores change radically every 5 years or so. With GCPU's, think more in terms of GPU's and radical changes every year to 18 months. Crazy shit.
Plenty of space at Fab 36, plus the recently announced plant they are building in New York, to build the new cores. So no more costly production runs in Taiwan.
If AMD didn't do this, they'd be out of business in 5 years. Period.
Re:This is a very good thing. (Score:4, Interesting)
-Charlie (the author of the Inq article)
Re:This is a very good thing. (Score:2)
See Hackers Dictionary: "Wheel of reincarnation" (Score:5, Informative)
See the entry in the Hacker's Dictionary / Jargon File for "Wheel of reincarnation [catb.org]":
-Mark
Graphics in software (Score:3, Informative)
Re:Graphics in software (Score:4, Interesting)
Once you can fab a processor large enough to contain 40 functional cores - how big a GPU do you think you could fab on the same process? The simple fact is that a GPU is completely crippled compared to a CPU. There are huge tradeoffs in the design to get that kind of performance. Stream processing is very limited compared to a von Neumann architecture if you care about latency in the slightest. But for graphics - it's perfect. Throwing completely independent parallel chunks of data through an array of vector processors is a much simpler challenge than attempting to extract parallelism from sequential code. The sequential code has pesky things like control-flow that is missing in the gfx shaders, and I don't mean the rubbish that ATi/Nvidia are selling as control-flow in their current designs. That is sheer marketing given the size of the shader batches and the depth of the pipelines.
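The parallelism point above is easy to see in miniature. A toy sketch (hypothetical "shader" and data, not real GPU code): per-pixel work depends only on its own input, so every element could run on a separate shader unit, while a running total has a loop-carried dependency that no amount of hardware can split up.

```python
# Data-parallel work: each output depends only on its own input pixel,
# so all iterations are independent -- this is what maps onto a GPU.
def shade(pixel):
    # toy "shader": brighten and clamp to 8-bit range
    return min(pixel * 2, 255)

pixels = [10, 200, 128, 64]
shaded = [shade(p) for p in pixels]  # every element could run in parallel

# Sequential work: each iteration needs the previous result, so there
# is no parallelism to extract -- this is the CPU's problem domain.
def running_total(values):
    total = 0
    out = []
    for v in values:
        total += v  # loop-carried dependency on the prior iteration
        out.append(total)
    return out

totals = running_total(pixels)
```

The first loop is the kind of "completely independent parallel chunks" the parent describes; the second is the pesky sequential control- and data-flow that stream processors handle poorly.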
So I don't think the big ol' wheel of reincarnation is going to move rendering back into software anytime soon. But what people forget is that AMD is not really a processor company. They are a fab company that just happens to design some kick-ass processors. Their main business is silicon, and buying ATi is the biggest chunk of vertical integration you can imagine...
Re:This is a very good thing. (Score:2)
I would like to see the first 128MB of RAM in the CPU package, along with the GPU and a minimal southbridge. This should bring motherboard prices down at the cost of a pricier CPU; the overall cost should still be lower. Even better, it should allow for some serious speeds.
At the minimum, the
Re:This is a very good thing. (Score:3, Interesting)
I could see this perhaps in the mobile/embedded market, but not in the server/workstation space. At least not for a LONG time. It's just not a good idea.
Re:This is a very good thing. (Score:2, Interesting)
An interesting hypothesis that came to mind during and after the confirmed speculation, in light of AMD's announced 4x4 platform: plug-in GPU modules on the motherboard. With the way 4x4 works, you would be able to dedicate determinable, upgradeable RAM to the GPU. And since ATI and nVidia have both been working on integrating a PPU core into future GPUs, there are interesting possibilities on the horizon.
Having a bank of RAM slots on the motherboard in dedication to an socketed GPU has its drawbacks, I'
Re:This is a very good thing. (Score:2, Insightful)
Don't believe it myself (Score:5, Interesting)
I could see perhaps that they'd stick a cheap and crappy GPU into a cheap and crappy CPU for the low end of the market, but with Vista coming out with all its eye-candy that may not even be viable for rendering the Vista desktop, let alone games.
Re:Don't believe it myself (Score:4, Informative)
Hello? My 7800GS card has a memory bandwidth of 40GB/second from on-board RAM. It would be utterly crippled by a measly 10GB/second shared with the CPU.
One correction. (Score:4, Informative)
Actually, Intel has been a big supporter of OSS. They helped port Linux to the Itanium and have provided all the documentation for their video chips.
I think you are confusing Intel with Microsoft. Intel has been one of the most open hardware companies.
AMD has also been very good. ATI, like nVidia... well, let's say not so good.
I really don't get this.
AMD could use some good chipsets, but they have made their own for the Opteron, so I don't see what they gain from ATI.
AMD could use a good low-end integrated video solution for low-end desktops and servers. Yes, it's true: servers almost never use nVidia or ATI graphics cards. When I set up a server, I only plug the monitor in when I do the install, and if something really bad happens.
I have to think this comes down to laptops. AMD has not done well in that market and a one stop shop for a laptop solution like Intel offers might be a good solution.
I wouldn't hold my breath on the good open source ATI drivers for Linux. Of course if it happens I might dump my nVidia based motherboard and Video card. I have been buying nVidia just because of their better Linux support for years.
Re:This is a very good thing. (Score:2)
Man... (Score:2)
It wouldn't be so bad if every nVidia based product I have ever tried to use hadn't been DOA.
Well... at least I can still stick with Intel chipsets... there is no way I am using a third party northbridge/southbridge I don't care if I can't use SLI.
Re:Man... (Score:2)
Is there any one (or two) nVidia based card manufacturers that you trust? I, as you can see, have been bad at picking winners so far.
Makes me uneasy (Score:5, Interesting)
I can't see this being good for customers. As we all know, ATI's products tend to be miserably supported, though this hasn't been the case for AMD thus far. How will this affect the nForce line of chipsets? Given ATI's past I'd much rather have an nForce than whatever ATI kicks out.
On the other hand, perhaps AMD will drag ATI out of its rut, but I think it's just as probable that ATI will drag AMD down, and that's good for nobody.
Re:Makes me uneasy (Score:2)
But once ATI gets into the picture, the terrain changes. Arguably, ATI is bought by AMD, not the other way round. But the expertise in supporting the chipset, graphics card, etc. is in ATI. I b
Re:Makes me uneasy (Score:3, Funny)
Oh tell me more, NVIDIA fanboy! Tell me a opposing tale of the wonderful NVIDIA happy land, with a gumdrop house on lollipop lane!
Re:Makes me uneasy (Score:3, Informative)
nVIDIA came out of nowhere about 5-6 years ago, whilst ATI has been firmly entrenched in the marketplace for much longer.
nVIDIA was able to grow so quickly because their products were faster, less buggy, and better supported than anything on the market at the time. ATI was just barely able to keep up, and everyone else bit the dust.
The consumer-end graphics industry has been known for buggy drivers for almost its entire existence. nVIDIA's biggest innovatio
AMD designs (Score:5, Insightful)
OK, so not very close to reality considering what would be involved. AMD bought into ATI because it wants to focus on CPUs, not chipsets.
However, it does make for an interesting point: the three primary components of PC architecture today are the CPU, the GPU, and the chipset that binds the two together. AMD had two parts of the equation, and ATI has two parts as well, though one of those parts overlaps. Now AMD is one company with end-to-end solutions? There's got to be something interesting coming out of that marriage.
Re:AMD designs (Score:4, Insightful)
What I can buy from Intel:
Server chassis + power supply
Motherboard
CPU(s)
NIC
RAID
What I can buy from AMD:
CPU(s)
Small-medium OEMs are going to like Intel because it gives them one point of support for most of their major components. It also gives them a single "partner" with which to negotiate pricing; the larger volume of product means they can get overall better pricing.
Taking on ATI might be AMD's move to start fixing that shortfall in their business model. If they put a solid OEM-friendly motherboard on the market, it will be a huge step in the right direction. With Conroe presently beating the pants off AMD's offerings, this is well-timed.
Re: (Score:2)
Goodbye ATI? (Score:5, Interesting)
I wonder if this means no more ATI cards in Macintosh computers, seeing as how Apple uses Intel now? Or, even more interesting, could it mean Apple switching over to AMD?
Re:Goodbye ATI? (Score:3, Insightful)
Why would AMD do that?
No company would kill off a profitable product line just to spite their opposition. Undoubtedly ATI's deal with Apple is profitable, and just because Apple uses Intel processors doesn't mean that such a transaction is any less profitable than it was before.
Companies don't act in that way, they look out for their bottom line. Unless there's something that would cause that business to become less profitable, ATI is unlikely to give up the block of sales they get from Apple. Is it bet
Re:Goodbye ATI? (Score:3, Insightful)
iBooks always used only ATI graphics.
iMacs have used both ATI and Nvidia graphics.
PowerBooks have used both ATI and Nvidia graphics.
PowerMacs have used both ATI and Nvidia graphics.
The Mac mini and MacBook are currently using Intel integrated graphics (high-volume products)
The MacBook Pro and iMac both currently use ATI graphics (high volume products)
The PowerMac currently uses Nvidia graphics (low volume product)
Apple
Intel has killed gaming...but AMD has restored it! (Score:2, Interesting)
Re:Intel has killed gaming...but AMD has restored (Score:4, Informative)
The interesting thing to watch will be... (Score:5, Interesting)
So, we'll see how this shakes out. If, as others have said, AMD forces ATI to produce better drivers, and good Linux drivers, that may be a good outcome...
The other interesting aspect is (as it often is) Apple. Now AMD gets an instant slice of the Apple pie (sorry) since ATI makes most current Apple graphics chips. Interesting development there... Intel can't be happy.
I suspect the tension level just notched up at NVIDIAs headquarters as well.
Re:The interesting thing to watch will be... (Score:2)
Worst case (Score:2, Interesting)
A second worst outcome is Intel enters a pact with NVidia, so next gen NVidia cards are so integrated with Intel chipsets that they do not run well on AMD. If you buy an AMD platform, you can only buy an ATI video card. If you buy an Intel platform, you are bound to NVidia. This would suck bad as well
Ugggh (Score:5, Insightful)
Chipsets (Score:2)
I wonder how this will play out... (Score:2, Insightful)
Wishful thinking (Score:2, Interesting)
And the cycle begins again. (Score:2, Interesting)
The next 10 years will consist of a new type of external graphics hardware being built, which will of course, be folded into the CPU at the end of it.
Stock (Score:2)
Uh oh (Score:2)
it's simple, really (Score:3, Insightful)
I read through most of the comments on this page, and several people came close to what I think the real reason for this deal is, but no one nailed it. To me, this is a simple example of Business 101. AMD has always been a niche vendor. Recently they have begun to spread out, but it is obvious from all the comments on this page that they are still a "gamers'" chip. Where Intel and Dell made it big was low-end, mass-sale business computers. Intel has its crappy-but-good-enough integrated video chipset, which is part of the vast majority of motherboards. In order for AMD to really be a big player, it needed to a) build its own integrated chipset from scratch or b) buy a company that already makes integrated video chipsets. Option b won, and while it might cost more initially, it should pay off in the long term.
I believe this will not stop nVidia from making nForce boards, and it would be stupid of AMD to stop production of ATI 3d cards. I think this may increase the quality of ATI's support for Linux, but I don't think it will be anything drastic.
Re:it's simple, really (Score:3, Insightful)
Are you smoking crack? AMD has most certainly NEVER been a niche vendor...
CPUs
FLASH
SRAM
PLDs
Embedded Processors
Microcontrollers
Ethernet Controllers and PHYs
What niche exactly are you talking about here?
Just one question (Score:3, Insightful)
Execs overly optimistic (Score:5, Interesting)
AMD is covering the remaining $2.5b of the deal with a commitment letter from Morgan Stanley Senior Funding, with the debt secured by "a pledge of the capital stock of certain material units of the company, accounts receivable and proceeds from any sale by Advanced Micro of its equity interest in Spansion Inc." The CFO is overly optimistic that the company can get rid of that debt "quickly," without layoffs, and with savings of between $75m and $125m over the next two years. DJ Newswires says ATYT will no longer work with Intel, and the execs say that they can make up the sales lost by severing the Intel-ATI ties. Pretty lofty goals, I'd say.
AMD up to their eyes in debt (Score:3, Interesting)
AMD has been losing money for a lot of years (only in the last two years have they started making a profit).
Now they have a price war with Intel and they have to compete with Conroe, so they can't even count on making any profit in the next few quarters.
Looks like they are living on the knife's edge.
ATI no longer competes directly with NVidia (Score:4, Interesting)
AMD, ATI, NVidia and Intel *all* make motherboard chipsets.
ATI, Nvidia and Intel all make video processors.
So do SIS, S3, and VIA.
Yet they all work (relatively) well with each other.
This isn't about marketshare, it's about technology. ATI does something that AMD wants, so AMD is acquiring the company for the tech. The market won't feel a thing, I promise you. Competition will continue, just like it did when Micron acquired Rendition (wipes a tear for his Verite v2200) and when NVidia bought out 3dfx (wipes another for his Banshee).
Since everyone's got their prognosticator's caps on today, I'm going to come out and say that, within 5 years, we'll be seeing GPU processors integrated into the motherboard, accessible to both ATI and NVidia (and Matrox, and S3, and
I think we're seeing a move back to specialization. We've already got separate Audio chips, separate networking chips, even chips to handle I/O for RAID and such. With the new market for Physics co-processors, I'm sure we'll only see more for tasks such as AI, and when the next big UI design is unleashed (either some kind of brain-reading technology, or a true 3d input system -- the WII is just the tip of the iceberg!) another co-processor will be made to handle that. With AMD's focus on integrating external processors with technologies such as HyperTransport, undoubtedly they'll be able to compete for a long time.
And the best part is, we get to choose from strong market competitors. As long as there is innovation, we win.
Hmmmm, Consoles (Score:4, Insightful)
A.) Xbox, Nintendo
Analysis.....Good move.
AMD going into embedded devices (Score:3, Interesting)
The Q&A session is apparently already up at The Pirate Bay (though I didn't manage to download it yet): http://thepiratebay.org/details.php?id=3506714 [thepiratebay.org]
Interesting that they think they'll be able to continue having a good relationship with nVidia. I'd guess it's just PR speak though for "as soon as the merger is complete, you're unimportant to us".
The CEO, Hector Ruiz, went on and on like a drone, repeating the same fluff over and over (like background noise), and it wasn't until those few moments when his minions were allowed to speak that anything intelligible was said.
Better ATI drivers... (Score:3, Interesting)
One reason why AMD may have bought ATI (Score:5, Interesting)
Next year, AMD will be shipping quad-core Athlons and Opterons. But if they wanted to, they could replace one CPU core with a GPU and have video on-die. And if they wanted to, they could replace a second core with sound, USB, SATA, Gigabit, wireless, etc., and have an entire computer on a chip.
VIA has been trying to do this for years. AMD has the fab capacity to pull it off.
AMD could be the first company to make the $150.00 PC possible (by maybe 2009). Smaller than a Mac mini, dual-core, and all you need to get it to run is slap some flash memory on board for a hard-drive substitute, some DDR2, and a cheap DVD drive, and voila! Instant computer.
Imagine a dual-core Athy with a gig of RAM and a 20GB flash disk, all in a form factor about twice the size of an iPod.
Oh you could put a screen on it too, DGMS.
This could be a great thing. My only advice for AMD / ATI is: Dedicate some resources to drivers, or better yet, open source the GPU API.
Raydude
Re:Linux (Score:5, Interesting)
Re:Linux (Score:2)
Re:Linux (Score:5, Insightful)
They may become a market leader for Linux desktops (GPU's aren't needed in servers where Linux is popular).. but Linux desktops are only 1-2 percent of the desktop market...
so even if they gain all of it.. they still won't be a market leader in GPUs.
Re: (Score:3, Insightful)
Re:Linux Support ? (Score:4, Informative)
I'm running OpenSuSE 10.1 on my Thinkpad R51 with a pretty standard ATI Mobility Radeon, and can I get the ATI drivers working? Can I hell. Always "no device for screen" or some such error. So I'm stuck with the OSS drivers, which, although great for 2D, don't perform well enough for anything other than TuxCart.
On the other hand, the NVidia FX5900 in my desktop machine (also running OpenSuSE 10.1) was a breeze. Drop to run level 3, run installer, reboot, job done.
Bob
Re: (Score:2)
Re:Linux Support ? (Score:2)
Bob
Re:Linux Support ? (Score:4, Interesting)
NVidia seems to make better blobs than ATI, but it is still a blob [openbsd.org]:
Re:Linux Support ? (Score:2)
Bob
Re:Linux Support ? (Score:3, Insightful)
Another benefit would be that if nVidia's drivers were GPLd, they could be included with the Linux kernel and X(org|Free86) if they were to a high enough standard, completely eliminating the current issue of having to kill X to install the drivers, and reinstall with
Re:Linux Support ? (Score:3, Interesting)
Not everything has to be done now, now, now. At least not in my world.
Bob
Re:Should we welcome our new overlords? :D (Score:3)
Dear AMD:
We, your faithful processor purchasers (yes, we have many), have long been forced to buy nVidia hardware because of ATI's poor quality drivers under Linux. Please work the same magic you did with the AMD64 and give us something we can be proud of.
The undersigned.
Re:nVidia Counter Offer? (Score:3, Insightful)