It's Official - AMD Buys ATI

FrankNFurter writes "It's been a rumour for several weeks, but now it's confirmed: AMD buys ATI. What implications is this merger going to have for the hardware market?" In addition to AMD's release, there's plenty of coverage out there.
  • Tomorrow (Score:3, Funny)

    by glebd ( 586769 ) on Monday July 24, 2006 @06:53AM (#15768510) Homepage
    Intel buys Nvidia. Let the war continue!
    • Re:Tomorrow (Score:3, Insightful)

      by d3bruts1d ( 639027 )
      Good lord... I hope not.

      *shudder*
      • Re:Tomorrow (Score:4, Insightful)

        by vhogemann ( 797994 ) <victor@MOSCOWhogemann.com minus city> on Monday July 24, 2006 @08:30AM (#15769037) Homepage
        Consider for a moment that Intel does provide usable open-source drivers for their video chipsets.
        • Re:Tomorrow (Score:3, Insightful)

          by ivan256 ( 17499 )
          Ok, considered.... And dismissed.

          I hate how people write off ATI and Nvidia as Open Source scrooges since their drivers are closed. The reality is that their code isn't all home grown and they couldn't open source it even if they wanted to. The copyright and patent holders on their licensed technologies wouldn't let them.
          • Comment removed (Score:5, Insightful)

            by account_deleted ( 4530225 ) on Monday July 24, 2006 @11:09AM (#15770245)
            Comment removed based on user account deletion
    • Re:Tomorrow (Score:5, Interesting)

      by C0vardeAn0nim0 ( 232451 ) on Monday July 24, 2006 @07:27AM (#15768626) Journal
      IIRC there's a mutual deal between AMD and Intel (part of the settlement of a lawsuit) that allows one company to use the other's technologies. That's what allowed AMD to integrate SSE 1/2/3 in Athlons and Intel to integrate AMD64 in the Pentium 4/Xeon.

      If they both buy graphics chipset companies, does this mean nVidia's technology ends up in ATI GPUs and the other way around?

      Or will they shield the newly acquired tech from the settlement?
      • Re:Tomorrow (Score:5, Informative)

        by ichigo 2.0 ( 900288 ) on Monday July 24, 2006 @08:08AM (#15768871)
        http://contracts.corporate.findlaw.com/agreements/amd/intel.license.2001.01.01.html [findlaw.com]

        As far as I can tell this deal only covers patents made before 2001 (section 2). I could be wrong though; I'm not very good at legalspeak, and didn't read the entire contract. AFAIK they have another cross-licensing agreement as well, but it covers only x86 extensions and improvements. That is probably the deal you're thinking of, since SSE and AMD64 are x86 extensions. So to answer your question: no, they would not need to share tech acquired from ATI.
        • by default luser ( 529332 ) on Monday July 24, 2006 @11:04AM (#15770196) Journal
          This is because x86-64 is an open standard. AMD released it as open when they announced it, because it was the only way to gain industry acceptance.

          Once AMD got Microsoft's cooperation building support for x86-64 into Windows, they harped on about the open standard. This protected AMD from Intel, who were already secretly working on their own implementation of x86-64. Ordinarily, once Intel realized how potentially powerful x86-64 was, they would have been sure to create their own incompatible version (a la SSE and 3DNow!) to try and derail AMD.

          But the open standard stopped Intel from doing this. Microsoft pointed to the open standard, and told Intel flat-out that they were not going to support two versions of 64-bit x86.

          x86-64 is an open standard. AMD's copyrighted implementation of x86-64 is called AMD64. Intel's copyrighted implementation of x86-64 is EM64T.
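
          One practical upshot: because both implementations share the same x86-64 ISA, software detects 64-bit capability the same way on either vendor's chips, via the CPUID "long mode" bit. A minimal sketch, assuming GCC or Clang on an x86 machine (not from the poster above, purely for illustration):

          /* Detect x86-64 support and CPU vendor via CPUID.
             Works identically on AMD64 and EM64T parts, since both
             implement the same x86-64 ISA. Assumes GCC/Clang <cpuid.h>. */
          #include <stdio.h>
          #include <string.h>
          #include <cpuid.h>

          int main(void)
          {
              unsigned int eax, ebx, ecx, edx;
              char vendor[13] = {0};

              /* Leaf 0: vendor string ("AuthenticAMD" or "GenuineIntel") */
              __get_cpuid(0, &eax, &ebx, &ecx, &edx);
              memcpy(vendor + 0, &ebx, 4);
              memcpy(vendor + 4, &edx, 4);
              memcpy(vendor + 8, &ecx, 4);

              /* Extended leaf 0x80000001: EDX bit 29 is the long-mode (x86-64) flag */
              int has_lm = 0;
              if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
                  has_lm = (edx >> 29) & 1;

              printf("vendor: %s, x86-64: %s\n", vendor, has_lm ? "yes" : "no");
              return 0;
          }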
  • Don't really know.. (Score:5, Interesting)

    by Roy van Rijn ( 919696 ) on Monday July 24, 2006 @06:57AM (#15768524) Homepage
    ..if this is a good thing or not. It might be good for the development and cooperation. Better integration == better graphics/faster machines?

    But on the other hand, this could split the market and give us something like today's incompatible browsers. (Which is VERY annoying sometimes.)

    And we have a psychic [slashdot.org]
    • by babbling ( 952366 ) on Monday July 24, 2006 @07:07AM (#15768561)
      There's a slight chance that AMD might be smart and release hardware specs for ATI cards, or make the drivers Free Software. If either of those things happen, this would be a very good thing, in my opinion.
      • by Roy van Rijn ( 919696 ) on Monday July 24, 2006 @07:11AM (#15768572) Homepage
        Well, that could be a very good thing. Those specifications would also help driver-makers a lot. It might also help improve the Linux drivers, which are pretty poor for ATI at the moment.

        The AMD fans/nerds are more Linux-minded than Intel's (IMHO), and AMD probably knows this. They could really strike a business blow by releasing this and winning open-source mindshare.
          It's not just the nerds being more Linux-minded - AMD has, to some extent, bet the farm on the K8 being the king of the server room, since the entire core was designed from the off to be highly scalable across multiple CPUs. And now we're seeing that most of the big advances (new "enterprise" sockets, K8L stuff) are going to benefit the servers before they benefit Joe Public.

          AMD knows that, whatever market share it has in the desktop arena, Linux is a major player in the HPC and 2P+ spaces and knows that L
      • > There's a slight chance that AMD might be smart and release hardware specs for ATI cards, or make the drivers Free Software. If either of those things happen, this would be a very good thing, in my opinion.

        AMD has, so far, been very open source friendly by releasing hardware documentation. ATI used to be more open source friendly than they are now. Hopefully the merger will lead to more hardware documentation being released for ATI.
      • ATI was releasing specs and even publishing OSS drivers to XFree86, and the morons at XFree86 spurned ATI. So ATI took their high-end OpenGL drivers, tweaked them so that they would work on their other cards, and published them so that all Linux users would at least have something.
    • It's a bit of an odd buy, considering nVidia's chipsets contributed so much toward the rise of the Athlon and are still (afaik) the leaders in performance chipset-wise. But I guess if they merge their teams (or at least make them consult each other a lot) we could see some perf improvements.

      I wouldn't mind seeing a mini-gpu inside of the CPU dedicated to massive vector ops - it would certainly blow the pants off SSE2 for larger datasets (video/sound codecs?). Of course stuff can already use the GPU but I be

    • by sbrown123 ( 229895 ) on Monday July 24, 2006 @07:54AM (#15768782) Homepage
      Think multi-core CPU where one (or more) of the cores could serve as a graphics processor on demand. Oh, the power...
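
      For a sense of the baseline such a core would be replacing, here is a rough sketch of the data-parallel work a CPU does today with SSE/SSE2: four floats per instruction, versus the hundreds of lanes a GPU-style core would bring. Illustrative only; assumes a GCC/Clang toolchain with SSE enabled, and the function name is made up:

      /* y[i] += a * x[i] over a large buffer, 4 floats at a time (SSE). */
      #include <emmintrin.h>
      #include <stddef.h>

      void saxpy_sse(float a, const float *x, float *y, size_t n)
      {
          __m128 va = _mm_set1_ps(a);   /* broadcast a into all 4 lanes */
          size_t i = 0;
          for (; i + 4 <= n; i += 4) {
              __m128 vx = _mm_loadu_ps(x + i);
              __m128 vy = _mm_loadu_ps(y + i);
              vy = _mm_add_ps(vy, _mm_mul_ps(va, vx));
              _mm_storeu_ps(y + i, vy);
          }
          for (; i < n; i++)            /* scalar tail */
              y[i] += a * x[i];
      }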
  • could be good.. (Score:5, Interesting)

    by Tokin84 ( 919029 ) on Monday July 24, 2006 @06:59AM (#15768532)
    this could be real good if AMD's acquisition of ATI allows them to produce full chipsets in the same fashion Intel has with its Centrino line. let the competition begin!

    also, not official yet, as government regulatory bodies need to approve it.
    • It WILL Be Good! (Score:5, Insightful)

      by eldavojohn ( 898314 ) * <eldavojohn@gm a i l . com> on Monday July 24, 2006 @07:13AM (#15768576) Journal
      this could be real good if AMD's acquisition of ATI allows them to produce full chipsets in the same fashion Intel has with its Centrino line. let the competition begin!
      Yeah, the part that really sweetens the deal for us end consumers is that ATI will now get to benefit from the research that AMD inherits from IBM [nytimes.com] for chipsets. Hopefully ATI can make some better video cards with all the research that the other two have benefited from. I hope that the same chipmaking technologies AMD has been using can now be used to improve ATI's GPUs and chipsets.

      Since (in my opinion) NVidia has taken the lead in GPUs, I hope that ATI will be boosted back into a competitive state and price wars ensue.

      Again, to me this is nothing but great news for the end-consumer.
      • Outside of the top-end bleeding-edge 7x00 cards -- you know, the ones that cost more than a motherboard + CPU + memory -- ATI competes fairly well on function/price. It's a good time to remember that probably ~80% of the market is made up by the low and mid tiers.

        Two years ago it looked like Nvidia was dead meat; now they've come back strong. I only get worried when one of these companies can't get their sh*t together for 2 or 3 generations in a row... then you know they've stagnated.
  • by FinchWorld ( 845331 ) on Monday July 24, 2006 @07:01AM (#15768539) Homepage
    ... But hopefully they'll kick the ATI driver team up the arse and get a decent set of drivers out (for Windows and Linux).
    • by powerlord ( 28156 ) on Monday July 24, 2006 @07:57AM (#15768804) Journal
      The Linux crowd (or at least a vocal minority of them) don't want drivers; they just want documentation for the card, and they'll make their own drivers.

      On the other hand, releasing either open source drivers, or a combination of binary drivers, along with documentation (so those who want to write their own CAN), would certainly be the best of both worlds.

  • System on a chip or at least integrated GPU and CPU cool.
    I just wish it was Nvidia.
    • Re:Maybe (Score:5, Funny)

      by Linker3000 ( 626634 ) on Monday July 24, 2006 @07:27AM (#15768628) Journal
      "System on a chip or at least integrated GPU and CPU cool."

      A die holding an AMD core and an ATI GPU may be 'neat', 'fab', 'brill' or even 'ace' - but 'cool' - I think not!

  • But I did.
    http://www.theinquirer.net/default.aspx?article=32197 [theinquirer.net]

            -Charlie
  • AMD & ATI (Score:5, Funny)

    by digitaldc ( 879047 ) * on Monday July 24, 2006 @07:03AM (#15768547)
    AMD combines with ATI and has announced a new name for their company:

    DAAMIT!
  • by NXprime ( 573188 ) on Monday July 24, 2006 @07:04AM (#15768551)
    http://www.theinquirer.net/default.aspx?article=33219 [theinquirer.net]
    *head asploded*

    I'm getting the 'gist' of why this transaction needs to happen. AMD needs GPU functionality on the CPU. I think everyone kinda expected that to happen at some point. The Inq. then takes a left turn in the plot and mentions 'mini-cores', which are multi-cores with massive numbers of threads. Sort of, but not really, like Intel's hyperthreading times 32. Shitloads of threads.


    Bottom line?

    ATI will work on AMD's new cores. I don't know if they'll work on something that'll plug into a PCIe slot still like nVidia.

    nVidia will still be around making graphics cards for AMD. They just won't necessarily be anything remotely similar to what's out in the stores today. AMD doesn't like closed technology the way Intel does. So it'll still be an open platform, which is a 'good thing' (tm).

    Forget about GPU's and chipsets. The main innovation has to come from these new GCPU's.

    ATI was going to lose its Intel chipset business anyway with or without this takeover. So no big loss here.

    Intel has about a year's lead on this tech and will probably be first out to market with it.

    CPU cores change radically every 5 years or so. With GCPU's, think more in terms of GPU's and radical changes every year to 18 months. Crazy shit.

    Plenty of space at Fab 36 to build the new cores, plus the recently announced plant they are building in New York. So no more costly production runs in Taiwan.

    If AMD didn't do this, they'd be out of business in 5 years. Period.
    • Think of it more as adding instruction sets to the CPU, not adding a GPU to the CPU. MMX, not embedded graphics.

                  -Charlie (the author of the Inq article)
    • AMD needs GPU functionality on the CPU
      Eventually we'll all probably just have huge processor packages that we plug expansion cards into, instead of a motherboard.
    • by kriegsman ( 55737 ) on Monday July 24, 2006 @07:30AM (#15768638) Homepage
      AMD needs GPU functionality on the CPU.

      See the entry in the Hacker's Dictionary / Jargon File for "Wheel of reincarnation [catb.org]":
      wheel of reincarnation: [1968] Term used to refer to a well-known effect whereby function in a computing system family is migrated out to special-purpose peripheral hardware for speed, then the peripheral evolves toward more computing power as it does its job, then somebody notices that it is inefficient to support two asymmetrical processors in the architecture and folds the function back into the main CPU, at which point the cycle begins again.

      Several iterations of this cycle have been observed in graphics-processor design, and at least one or two in communications and floating-point processors. [...]


      -Mark
      • Graphics in software (Score:3, Informative)

        by gr8_phk ( 621180 )
        Processors are getting fast enough to do rendering in software again. GPUs are trying to become general purpose CPUs. People will soon have 2 cores as standard and 4 or more are on the way. What are people supposed to do with all those CPU cores? Replace the GPU with them of course. Why would a CPU maker need to buy a GPU maker? Not sure, but perhaps just to gain the graphics expertise to write the software, and possibly to make some suggestions for the instruction set and hardware. I certainly hope they do
        • by smallfries ( 601545 ) on Monday July 24, 2006 @02:32PM (#15771754) Homepage
          Ok, time to blow off the moderations that I've made so far, as you're missing something fairly obvious. Take a look at the processor on the 7800GTX: with 300 million transistors it is currently the most complex chip being shipped (I don't know how big Cell is). In exchange, the peak processing power is 320 Gflop/s (40 in the vertex processors, 280 in the fragment shaders). For comparison, the floating-point performance of a CPU is ~8 Gflop/s. That's a whopping 40 cores to break even on graphics processing.
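
          Spelling out that break-even arithmetic, taking the quoted 2006-era peak figures at face value (a rough upper-bound comparison, not a benchmark):

          \[
            \underbrace{40}_{\text{vertex}} + \underbrace{280}_{\text{fragment}} = 320\ \text{Gflop/s (GPU)},
            \qquad
            \frac{320\ \text{Gflop/s}}{\sim 8\ \text{Gflop/s per CPU core}} \approx 40\ \text{cores}
          \]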

          Once you can fab a processor large enough to contain 40 functional cores - how big a GPU do you think you could fab on the same process? The simple fact is that a GPU is completely crippled compared to a CPU. There are huge tradeoffs in the design to get that kind of performance. Stream processing is very limited compared to a von Neumann architecture if you care about latency in the slightest. But for graphics - it's perfect. Throwing completely independent parallel chunks of data through an array of vector processors is a much simpler challenge than attempting to extract parallelism from sequential code. The sequential code has pesky things like control-flow that is missing in the gfx shaders, and I don't mean the rubbish that ATi/Nvidia are selling as control-flow in their current designs. That is sheer marketing given the size of the shader batches and the depth of the pipelines.

          So I don't think the big 'ol wheel of reincarnation is going to move rendering back into software anytime soon. But what people forget is that AMD is not really a processor company. They are a fab company that just happens to design some kick-ass processors. Their main business is silicon, and buying ATi is the biggest chunk of vertical integration you can imagine...
    • This is where the technology is headed. I'm sure we'll see CPUs with an integrated GPU sans memory along with CPUs without GPUs for cheaper machines. We've already seen the Northbridge integrated onto the AMD CPUs.

      I would like to see the first 128MB of RAM brought into the CPU housing, along with the GPU and a minimal southbridge. This should bring motherboard prices down at the cost of a more expensive CPU; the overall cost should still be lower. Even better, it should allow for some serious speeds.

      At the minimum, the
      • Not at all a good idea, nor where things will be heading except in the embedded market perhaps. Not only would it reduce yields enormously because of the larger die size, but it would also put two points of failure into one chip, AND make it much harder to upgrade just one component.

        I could see this perhaps in the mobile/embedded market, but not in the server/workstation space. At least not for a LONG time. It's just not a good idea.
    • An interesting hypothesis that came to mind during and after the confirmed speculation, and in light of AMD's announced 4x4 platform: plug-in GPU modules on the motherboard. With the way 4x4 works, you would be able to dedicate determinable and upgradeable RAM to the GPU. And since ATI and nVidia have both been working on integrating a PPU core into future GPUs, there are interesting possibilities on the horizon.

      Having a bank of RAM slots on the motherboard dedicated to a socketed GPU has its drawbacks, I'

      • Here's what I don't get. If they do that, how do you upgrade the memory bus bandwidth so that it's futureproof to a degree? Memory on graphics cards changes all the time. It's not just a GPU and memory; it's everything in between as well. Power voltages... etc.
    • by 0123456 ( 636235 ) on Monday July 24, 2006 @07:48AM (#15768731)
      Seems highly unlikely to me that they'd stick a GPU into the CPU. Modern GPUs are a similar size to CPUs (if not larger) and need much higher memory bandwidth... so you'd be doubling the size of your CPU and you'd need a 256-bit 1GHz+ memory interface. And then the 'high end' users would just go and buy a PCI-Express card when the next generation came out, making the whole thing a total waste.

      I could see perhaps that they'd stick a cheap and crappy GPU into a cheap and crappy CPU for the low end of the market, but with Vista coming out with all its eye-candy that may not even be viable for rendering the Vista desktop, let alone games.
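
      For scale, a back-of-the-envelope figure for that hypothetical 256-bit, 1 GHz-effective interface, next to the dual-channel DDR2-800 a 2006 desktop actually has (rough peak numbers, not measurements):

      \[
        \frac{256\ \text{bit}}{8\ \text{bit/byte}} \times 1\ \text{GT/s} = 32\ \text{GB/s}
        \qquad \text{vs.} \qquad
        \frac{128\ \text{bit}}{8\ \text{bit/byte}} \times 0.8\ \text{GT/s} = 12.8\ \text{GB/s}
      \]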
    • One correction. (Score:4, Informative)

      by LWATCDR ( 28044 ) on Monday July 24, 2006 @10:20AM (#15769835) Homepage Journal
      "AMD doesn't like closed technology like Intel does. So it'll be an open platform still which is a 'good thing' (tm). "
      Actually, Intel has been a big supporter of OSS. They helped port Linux to the Itanium and have provided all the documentation for their video chips.
      I think you are confusing Intel with Microsoft. Intel has been one of the most open hardware companies.
      AMD has also been very good. ATI, like nVidia... well, let's say not so good.
      I really don't get this.
      AMD could use some good chip-sets but they have made their own for the Opteron so I don't see what they gain from ATI.
      AMD could use a good low-end integrated video solution for low-end desktops and servers. Yes, it's true: servers almost never use nVidia or ATI graphics cards. When I set up a server I only plug the monitor in when I do the install and if something really bad happens.
      I have to think this comes down to laptops. AMD has not done well in that market and a one stop shop for a laptop solution like Intel offers might be a good solution.
      I wouldn't hold my breath on the good open source ATI drivers for Linux. Of course if it happens I might dump my nVidia based motherboard and Video card. I have been buying nVidia just because of their better Linux support for years.
  • This will likely put a functional end to the Intel processor + Intel chipset + ATI video card systems I like to build.

    It wouldn't be so bad if every nVidia based product I have ever tried to use hadn't been DOA.

    Well... at least I can still stick with Intel chipsets... there is no way I am using a third-party northbridge/southbridge; I don't care if I can't use SLI.

  • Makes me uneasy (Score:5, Interesting)

    by Mad Merlin ( 837387 ) on Monday July 24, 2006 @07:08AM (#15768564) Homepage

    I can't see this being good for customers. As we all know, ATI's products tend to be miserably supported, though this hasn't been the case for AMD thus far. How will this affect the nForce line of chipsets? Given ATI's past I'd much rather have an nForce than whatever ATI kicks out.

    On the other hand, perhaps AMD will drag ATI out of its rut, but I think it's just as probable that ATI will drag AMD down, and that's good for nobody.

    • I am a bit more pessimistic. I don't think AMD needs to support much at this moment. CPU manufacturers will have to show you the instruction set. There is not much to hide. They don't have an Intel-style chipset at this moment. In other words, they don't really need to write, support or open their device driver...

      But once ATI gets into the picture, the terrain changes. Arguably, ATI is bought by AMD, not the other way round. But the expertise in supporting the chipsets, graphics cards etc. is in ATI. I b
    • "ATI's products tend to be miserably supported"
      Oh, tell me more, NVIDIA fanboy! Tell me an opposing tale of the wonderful NVIDIA happy land, with a gumdrop house on lollipop lane!
      • Re:Makes me uneasy (Score:3, Informative)

        by moosesocks ( 264553 )
        No. The grandparent poster has a point.

        nVIDIA came out of nowhere about 5-6 years ago, whilst ATI has been firmly entrenched in the marketplace for a much longer time.

        nVIDIA was able to grow so quickly, because their products were faster, less buggy, and better supported than anything on the market at the time. ATI was just barely able to keep up, and everyone else bit the dust.

        The consumer-end graphics industry has been known for buggy drivers for almost its entire existence. nVIDIA's biggest innovatio
  • AMD designs (Score:5, Insightful)

    by bjb ( 3050 ) * on Monday July 24, 2006 @07:08AM (#15768566) Homepage Journal
    Interesting possibility:
    • Today: AMD has integrated memory controllers to get good memory performance.
    • Tomorrow: AMD has integrated video controllers to get good 3D performance.

    OK, so not very close to reality considering what would be involved. AMD bought into ATI because it wants to focus on CPUs, not chipsets.

    However, it does raise an interesting point: the three primary components of PC architecture today are the CPU, the GPU, and the chipset that binds the two together. AMD had two parts of the equation, and ATI has two parts as well, though one of those parts overlaps. Now AMD is one company with end-to-end solutions? There's got to be something interesting coming out of that marriage.

    • Re:AMD designs (Score:4, Insightful)

      by PFI_Optix ( 936301 ) on Monday July 24, 2006 @07:22AM (#15768606) Journal
      One thing that Intel has always done better than AMD is provide the "whole package".

      What I can buy from Intel:

      Server chassis + power supply
      Motherboard
      CPU(s)
      NIC
      RAID

      What I can buy from AMD:

      CPU(s)

      Small-medium OEMs are going to like Intel because it gives them one point of support for most of their major components. It also gives them a single "partner" with which to negotiate pricing; the larger volume of product means they can get overall better pricing.

      Taking on ATI might be AMD's move to start fixing that shortfall in their business model. If they put a solid OEM-friendly motherboard on the market, it will be a huge step in the right direction. With Conroe presently beating the pants off AMD's offerings, this is well-timed.
    • Comment removed based on user account deletion
  • Goodbye ATI? (Score:5, Interesting)

    by XxtraLarGe ( 551297 ) on Monday July 24, 2006 @07:13AM (#15768578) Journal
    "It's been a rumour for several weeks, but now it's confirmed: AMD buys ATI. What implications is this merger going to have for the hardware market?"

    I wonder if this means no more ATI cards in Macintosh computers, seeing as how Apple uses Intel now? Or, even more interesting, could it mean Apple switching over to AMD?

    • Re:Goodbye ATI? (Score:3, Insightful)

      Why would AMD do that?

      No company would kill off a profitable product line just to spite their opposition. Undoubtedly ATI's deal with Apple is profitable, and just because Apple uses Intel processors doesn't mean that such a transaction is any less profitable than it was before.

      Companies don't act in that way, they look out for their bottom line. Unless there's something that would cause that business to become less profitable, ATI is unlikely to give up the block of sales they get from Apple. Is it bet

    • Apple first began offering ATI graphics, then Nvidia, and most recently Intel graphics.

      iBooks always used only ATI graphics.
      iMacs have used both ATI and Nvidia graphics.
      PowerBooks have used both ATI and Nvidia graphics.
      PowerMacs have used both ATI and Nvidia graphics.

      The Mac mini and MacBook are currently using Intel integrated graphics (high volume products)
      The MacBook Pro and iMac both currently use ATI graphics (high volume products)
      The PowerMac currently uses Nvidia graphics (low volume product)

      Apple
  • This will sorta relieve some of the high-stress factor from the "Intel has killed gaming" theory, in which most business and "consumer" machines that come with fast Intel processors but crappy integrated Intel graphics are a joke. These users think "hey, I got a Pentium 4 at 3.2GHz, I am going to go play Half-Life 2," only to not meet the minimum requirements. With AMD releasing PCs combined with low-cost ATI chips embedded into their "consumer grade" PCs, this could have a strong uproar towards the PC gami
  • by Glock27 ( 446276 ) on Monday July 24, 2006 @07:19AM (#15768594)
    NVIDIA's response. Will NVIDIA no longer support AMD processors to the same level? The shocking thing to me about this announcement is that nForce chipsets are the best chipsets for AMD64. Also, NVIDIA driver quality across the board is better than ATI's.

    So, we'll see how this shakes out. If, as others have said, AMD forces ATI to produce better drivers, and good Linux drivers, that may be a good outcome...

    The other interesting aspect is (as it often is) Apple. Now AMD gets an instant slice of the Apple pie (sorry) since ATI makes most current Apple graphics chips. Interesting development there... Intel can't be happy.

    I suspect the tension level just notched up at NVIDIA's headquarters as well.

  • Worst case (Score:2, Interesting)

    by botik32 ( 90185 )
    Now that ATI is part of AMD, the worst case is that the ATI division is given little attention, developers move to CPU core development, NVidia remains the only serious GPU vendor, and things go downhill from there.

    A second-worst outcome is that Intel enters a pact with NVidia, so next-gen NVidia cards are so integrated with Intel chipsets that they do not run well on AMD. If you buy an AMD platform, you can only buy an ATI video card. If you buy an Intel platform, you are bound to NVidia. This would suck badly as well.
  • Ugggh (Score:5, Insightful)

    by LaughingCoder ( 914424 ) on Monday July 24, 2006 @07:24AM (#15768615)
    I think the marketplace has been very well served by the two dualities that existed before this move: ATI and NVidia beat each other's brains out, as did Intel and AMD. This new dynamic with 3 players does not seem, to me, to promise anywhere near as many benefits for us, the customers. Will ATI become more AMD-centric? Undoubtedly. Will NVidia (which has been a great AMD booster) become less supportive of AMD processors? Probably. As this plays out, it seems to me that NVidia will basically be an Intel graphics house (including Macs), and ATI will melt into AMD, becoming mostly an internal chipset house. In the end we lose a very healthy competition between NVidia and ATI. We gain, perhaps, a stronger AMD to keep Intel honest.
  • Well don't expect the same level of cooperation between AMD and nVidia that we've seen these past few years. For some reason I don't see nVidia getting terribly excited about making chipsets for their number one competitor.
  • ...for people like me who were in the AMD/nVidia fan club? I've always had countless problems with ATI cards both at home and at work, generally down to driver issues, so I really don't want to switch to ATI; I'd personally rather go the Intel/nVidia route if this has some adverse effect on using nVidia kit with AMD kit. I'm not sure this is good for the market either if there is some kind of lock-in to ATI if you used Intel; it was kind of nice knowing you could choose between 2 processor manufacturers a
  • Wishful thinking (Score:2, Interesting)

    by Alioth ( 221270 )
    The wishful thinking is that now that ATi are owned by AMD, they might produce 3D hardware for which they publish the hardware interface, so we can have open source graphics drivers. But I'm sure it'll never happen.
  • The Wheel of time [clueless.com] has turned again. GPUs are now general-purpose massively-parallel computers; they will be folded back into the CPU core, so that the general purpose CPU gains massive parallelism. Kind of like SIMD, but on the order of a million operations per instruction instead of 8.

    The next 10 years will consist of a new type of external graphics hardware being built, which will of course, be folded into the CPU at the end of it.
  • by Miros ( 734652 ) *
    So, AMD's stock, up or down in the short-term? They've been taking a bit of a beating over the past 3 months.
  • I'm looking at my AMD/nVidia dev machine and my Intel/ATI laptop and I'm thinking... oh crap.
  • by neersign ( 956437 ) on Monday July 24, 2006 @07:53AM (#15768773)

    I read through most of the comments on this page, and several people came close to what I think the real reason for this deal is, but no one nailed it. To me, this is a simple example of business 101. AMD has always been a niche vendor. Recently they have begun to spread out, but it is obvious from all the comments on this page that they are still a "gamers" chip. Where Intel and Dell made it big was low-end, mass-sale business computers. Intel has their crappy-but-good-enough integrated video chipset, which is a part of the vast majority of motherboards. In order for AMD to really be a big player, it needed to a) build its own integrated chipset from scratch or b) buy a company that already makes integrated video chipsets. Option b won, and while it might cost more initially, it should pay off in the long term.

    I believe this will not stop nVidia from making nForce boards, and it would be stupid of AMD to stop production of ATI 3d cards. I think this may increase the quality of ATI's support for Linux, but I don't think it will be anything drastic.

    • "AMD has always been a niche vendor."

      Are you smoking crack? AMD has most certainly NEVER been a niche vendor...

      CPUs
      FLASH
      SRAM
      PLDs
      Embedded Processors
      Microcontrollers
      Ethernet Controllers and PHYs

      What niche exactly are you talking about here?
  • Just one question (Score:3, Insightful)

    by martinultima ( 832468 ) <martinultima@gmail.com> on Monday July 24, 2006 @08:04AM (#15768840) Homepage Journal
    From what I've heard, AMD tends to be pretty Linux-friendly, and very helpful to open-source developers who want to, say, implement AMD64 support and that kind of thing – so will this mean that ATI might start giving a damn about us too? I dunno, probably way too far-fetched, although I can't stand how my brand-new Athlon 64 box can't run 3D because ATI's stupid drivers pretty much don't work on my distribution... either way, though, so long as at least one of them keeps churning out good chips, more power to 'em!
  • by rfunches ( 800928 ) on Monday July 24, 2006 @08:20AM (#15768945) Homepage

    AMD is covering the remaining $2.5b of the deal with a commitment letter from Morgan Stanley Senior Funding, with the debt secured by "a pledge of the capital stock of certain material units of the company, accounts receivable and proceeds from any sale by Advanced Micro of its equity interest in Spansion Inc." The CFO is overly optimistic that the company can get rid of that debt "quickly," without layoffs, and with savings of $75m and $125m over the next two years. DJ Newswires says ATYT will no longer work with Intel, and the execs say that they can make up the sales lost by severing the Intel-ATI ties. Pretty lofty goals, I'd say.

  • by edxwelch ( 600979 ) on Monday July 24, 2006 @09:05AM (#15769304)
    They borrowed $2.5 billion to pay for ATI. This is on top of all the other debt that they owe: they still haven't paid off the massive cost of the 2 fabs in Germany, and they also own a lot of stock in Spansion, which itself is heavily in debt.
    AMD had been losing money for years (only in the last 2 have they started making a profit).
    Now they have a price war with Intel and they have to compete with Conroe, so they can't even count on making any profit in the next few quarters.
    Looks like they are living on a knife edge.
  • by Alzheimers ( 467217 ) on Monday July 24, 2006 @09:18AM (#15769385)
    Taking into account all the fanboi anguish, let me point out the very simple fact that now ATI no longer directly competes with NVidia. You could say that the competition would be between AMD and NVidia now, but that's not quite right either. The fact is that the market has become so diverse that all these companies were already competing with each other, despite partnerships and deals.

    AMD, ATI, NVidia and Intel *all* make motherboard chipsets.
    ATI, Nvidia and Intel all make video processors.
    So do SIS, S3, and VIA.

    Yet they all work (relatively) well with each other.

    This isn't about marketshare, it's about technology. ATI does something that AMD wants, so AMD is acquiring the company for the tech. The market won't feel a thing, I promise you. Competition will continue, just like it did when Micron acquired Rendition (wipes a tear for his Verite v2200) and when NVidia bought out 3dfx (wipes another for his Banshee).

    Since everyone's got their prognosticator's caps on today, I'm going to come out and say that, within 5 years, we'll be seeing GPU processors integrated into the motherboard, accessible to both ATI and NVidia (and Matrox, and S3, and ...). The power and bandwidth demands for next-gen GPUs are becoming more than expansion boards can handle. Instruction sets are becoming extremely CPU-like. Since the whole universe seems to be moving to multi-processor designs anyway, perhaps we'll even see some kind of GPU-MMX-style expansion of the x86 instruction set (call it v86 for now).

    I think we're seeing a move back to specialization. We've already got separate audio chips, separate networking chips, even chips to handle I/O for RAID and such. With the new market for physics co-processors, I'm sure we'll only see more for tasks such as AI, and when the next big UI design is unleashed (either some kind of brain-reading technology, or a true 3D input system -- the Wii is just the tip of the iceberg!) another co-processor will be made to handle that. With AMD's focus on integrating external processors with technologies such as HyperTransport, undoubtedly they'll be able to compete for a long time.

    And the best part is, we get to choose from strong market competitors. As long as there is innovation, we win.
  • Hmmmm, Consoles (Score:4, Insightful)

    by MrCopilot ( 871878 ) on Monday July 24, 2006 @09:27AM (#15769449) Homepage Journal
    Q.) How many next-gen consoles have AMD in them now?

    A.) Xbox, Nintendo

    Analysis.....Good move.

  • by chris_7d0h ( 216090 ) on Monday July 24, 2006 @09:44AM (#15769571) Journal
    According to the merger conference call, the only substantial argument for the merger from the company's side is that they want to get into the embedded-device business. They hope to provide a platform for media processing on cell phones, TVs and the like.

    The Q&A session is apparently already up at The Pirate Bay (though I didn't manage to download it yet): http://thepiratebay.org/details.php?id=3506714 [thepiratebay.org]

    Interesting that they think they'll be able to continue having a good relationship with nVidia. I'd guess it's just PR speak though for "as soon as the merger is complete, you're unimportant to us".

    The CEO, Hector Ruiz, went on and on like a drone, repeating the same fluff over and over (like background noise), and it wasn't until those few moments when his minions were allowed to speak that anything intelligible was said.
  • by HavokDevNull ( 99801 ) <.ten.smetsysxunil. .ta. .cire.> on Monday July 24, 2006 @09:53AM (#15769638) Homepage Journal
    Wonder if AMD will force ATI's hand to finally release a decent ATI video driver for Linux now?
  • by RayDude ( 798709 ) on Monday July 24, 2006 @12:43PM (#15770985)
    There is one obvious reason for the purchase, already stated by others. I'm just reiterating.

    Next year, AMD will be shipping quad-core Athlons and Opterons. But if they wanted to, they could replace one core with a GPU and have video on-die. And if they wanted to, they could replace a second core with sound, USB, SATA, Gigabit, wireless, etc., and have an entire computer on a chip.

    VIA has been trying to do this for years. AMD has the fab capacity to pull it off.

    AMD could be the first company to enable the $150.00 PC to exist (by maybe 2009). Smaller than a Mac mini, dual-core, and all you need to get it to run is slap some flash memory on board for a hard drive substitute, some DDR2, a cheap DVD drive and voila! Instant computer.

    Imagine a dual-core Athy with a gig of RAM and a 20GB flash disk, all in a form factor about twice the size of an iPod.

    Oh you could put a screen on it too, DGMS.

    This could be a great thing. My only advice for AMD / ATI is: Dedicate some resources to drivers, or better yet, open source the GPU API.

    Raydude
