
Samsung to Produce Faster Graphics Memory

Posted by ScuttleMonkey
from the faster-and-cheaper dept.
Samsung has announced a new line of GDDR5 chips that will supposedly be able to deliver data at speeds of up to 6 Gbps. In addition to faster data delivery, the new chips also claim to consume less energy than previous versions. "Samsung said the new chips consume 1.5 volts, making them about 20 percent more efficient than GDDR 3 chips. Samples of the GDDR 5 chips began shipping to graphics-processor makers last month, and Samsung plans to begin mass production of the chips during the first half of next year. GDDR 5 memory should first appear in high-end gaming systems where users are willing to pay a premium for better graphics. Samsung did not disclose pricing for the chips."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Monday December 03, 2007 @03:18PM (#21562917)
    Memo From Ki-Tae Lee, CEO and President, Samsung
    To: All Samsung Employees
    December 3rd, 2007

    Would someone tell me how this happened? We were the fucking vanguard of graphics memory in this country. Samsung's GDDR3 was on the card to own. Then the other guy came out with a GDDR3 graphics chip. Were we scared? Hell, no. Because we hit back with a little thing called XDR. That's GDDR3 on crack. For cokehead gamers. But you know what happened next? Shut up, I'm telling you what happened--the bastards went to GDDR4. Now we're standing around with our cocks in our hands, selling XDR & GDDR3. Cokehead gamers or no, suddenly we're the chumps. Well, fuck it. We're going to GDDR5.

    Sure, we could go to GDDR4 next, like the competition. That seems like the logical thing to do. After all, three worked out pretty well, and four is the next number after three. So let's play it safe. Let's make a more crackhead gamer RAM and call it the XDR3SuperTurbo. Why innovate when we can follow? Oh, I know why: Because we're a business, that's why!

    You think it's crazy? It is crazy. But I don't give a shit. From now on, we're the ones who have the speed in the memory game. Are they the best a man can get? Fuck, no. Samsung is the best a man can get.

    What part of this don't you understand? If GDDR2 is good, and three is better, obviously five would make us the best fucking memory that ever existed. Comprende? We didn't claw our way to the top of the memory game by clinging to the GDDR2 industry standard. We got here by taking chances. Well, GDDR5 is the biggest chance of all.

    Here's the report from Engineering. Someone put it in the bathroom: I want to wipe my ass with it. They don't tell me what to invent--I tell them. And I'm telling them to stick enough transistors on there to call it GDDR5. I don't care how. Make the chips so thin they're invisible. Put some on the handle. I don't care if they have to make the ram hang halfway off the motherboard, just do it!

    You're taking the "safety" part of "safety electronics" too literally, grandma. Cut the strings and soar. Let's hit it. Let's roll. This is our chance to make memory history. Let's dream big. All you have to do is say that GDDR5 can happen, and it will happen. If you aren't on board, then fuck you. And if you're on the board, then fuck you and your father. Hey, if I'm the only one who'll take risks, I'm sure as hell happy to hog all the glory when the GDDR5 card becomes the gaming video card for the U.S. of "this is how we game now" A.

    People said we couldn't go to three. It'll cost a fortune to manufacture, they said. Well, we did it. Now some egghead in a lab is screaming "Five's crazy?" Well, perhaps he'd be more comfortable in the labs at Sony, working on fucking electrics. Cell processing chips, my white ass!

    Maybe I'm wrong. Maybe we should just ride in SanDisk's wake and make flash USB drives. Ha! Not on your fucking life! The day I shadow a penny-ante outfit like SanDisk is the day I leave the silicon game for good, and that won't happen until the day I die!

    The market? Listen, we make the market. All we have to do is put her out there with a little jingle. It's as easy as, "Hey, shaving with anything less than GDDR5 is like playing Warcraft on a Commodore 64." Or "It'll be so smooth, I could snort lines off of your monitor." Try "Your frame rate is going to be so friggin' fluid, someone's gonna walk up and confuse it with a urinal."

    I know what you're thinking now: What'll people say? Mew mew mew. Oh, no, what will people say?! Grow the fuck up. When you're on top, people talk. That's the price you pay for being on top. Which Samsung is, always has been, and forever shall be, Amen, GDDR5, sweet Jesus in heaven.

    Stop. I just had a stroke of genius. Are you ready? Open your mouth, baby birds, 'cause Mama's about to drop you one sweet, fat nightcrawler. Here she comes: Put another microcontroller on that fucker, too. That's right. GDDR5, two microcontrollers.
    • My God, that was awesome.
      • by empaler (130732) on Monday December 03, 2007 @05:27PM (#21564487) Journal
        It's an old Onion article rewrite... I'd completely forgotten about it, so it was good to see again... :)
        Article [theonion.com]. He missed a few words, but it was good.
        • by empaler (130732)
          Agh. I hadn't seen that there was an article link right below the 'Read Full Comment Text' link. Previous post redundant... :-S
          • Agh. I hadn't seen that there was an article link right below the 'Read Full Comment Text' link.
            Change your Max Comment Size in the Comment preferences to 65535 -- problem solved.
             
            • by empaler (130732)
              I had that a couple of years ago, but a slew of trolls had me shorten the fold. Agh. Might as well, haven't seen that many extra-long trolls lately. Well, save for the shit-eating stories.
    • I'm pretty sure that just gave me a hard on.
    • by calebt3 (1098475)
      Now why can't all First Post AC's be this creative?
    • ... never use Google's Korean->English translation for company memos/press releases/important documents...

      Seems like execs keep getting burned this way.
    • by 4D6963 (933028)

      "Hey, shaving with anything less than GDDR5 is like playing Warcraft on a Commodore 64."

      Does it imply that playing games with less than 5 blades is like scraping your beard off with a dull hatchet?

    • My chips go all the way to 11.
    • by iamhassi (659463)
      Anyone notice the author of the article looks like Mr. Bean? [idg.com]
  • Qimonda (Score:5, Informative)

    by imstanny (722685) on Monday December 03, 2007 @03:19PM (#21562937)
    Qimonda already released GDDR5. Article from November 2: http://www.pclaunches.com/other_stuff/qimonda_gddr5_memory_now_available.php [pclaunches.com]
  • by Anonymous Coward on Monday December 03, 2007 @03:20PM (#21562957)
    How does it consume volts?
    • by EmbeddedJanitor (597831) on Monday December 03, 2007 @03:50PM (#21563351)
      By opening the packaging and chewing?

      Damn... I logged on just to respond to this. "Consuming Volts", "travelling at 5 knots per hour", "uses 4 kW per hour" and similar flagrant misuse of units really winds my shorts (to a torque of 5 Nm). You can forgive USA Today, but a Geek rag should get this right.
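For the record, the distinctions the comment is pedantic about can be sketched in a few lines of Python (figures invented for illustration):

```python
def energy_joules(power_watts, seconds):
    """Energy = power x time. A watt is already one joule per second,
    so "uses 4 kW per hour" confuses power with energy."""
    return power_watts * seconds

# A 4 kW load running for one hour consumes 4 kWh of energy:
joules = energy_joules(4000, 3600)
print(joules / 3.6e6)  # 4.0 (1 kWh = 3.6e6 J)

# Likewise, a knot is already one nautical mile per hour, so
# "5 knots per hour" would describe an acceleration, not a speed.
```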

    • Maybe it has extremely high resistance so it "consumes" volts lol. The real question is WTF happened to GDDR4? I bet some other company is in the process of making 4 so they named theirs 5 just to be dicks lol. Well I'm gonna go make some Super Ultra GDDR3000 with hypthreadingtransport bus speedination! It's actually not ram at all, it's a cologne lol.
    Get a voltmeter, place the black lead on ground and the red lead on the power lead of the chip. 1.5 Volts going in. Check.

      Now move the red lead to the other power lead on the chip - zero volts coming out! Where are these volts going? But it's not accurate to say they are being *consumed* per se. The volts are stored in the form of magic smoke. When the chip is full of volts (as magic smoke), it stops working. Be careful: too many volts going in and you can burst the smoke tank, then you're really screwed.
    • by PWNT (985141)
      Volts can be consumed because they are the potential energy in a circuit. A high voltage is always in reference to 'something', typically the ground (wherever that is). Therefore if you have +15 V at node A, and node A connects to node B through a resistor, the voltage at B will become less than A by moving through the resistor (as energy is converted to heat over time; power output) if all the potential energy (if it's not used it's potential) exists up until the point at which it passes through the resistor.
      • The voltage at node A and B will be the same assuming no current draw. The resistor limits current, not voltage.
        -nB
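The correction above can be checked with a minimal Ohm's-law sketch (values are invented for illustration):

```python
def drop_across_resistor(current_amps, resistance_ohms):
    """Ohm's law: V = I * R. With zero current draw there is no drop
    across the resistor, which is the parent's point: a resistor
    limits current, not voltage."""
    return current_amps * resistance_ohms

v_a = 15.0  # +15 V at node A, relative to ground

# No load attached: no current flows, so node B sits at the same potential.
print(v_a - drop_across_resistor(0.0, 1000))    # 15.0

# With 5 mA flowing through a 1 kOhm resistor, B drops by I*R = 5 V.
print(v_a - drop_across_resistor(0.005, 1000))  # 10.0
```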
    • by iamhassi (659463)
      How does it consume volts?

      the same way bits are more than bytes:
      "Called GDDR (Graphics Double Data Rate) 5, the new chips can transfer data at speeds up to 6G bps (bits per second)....GDDR5 is able to move more data, up to 24G bytes per second..."

      so apparently it can move 6G bits per second, but if it really feels up to it, it can kick it up a notch and move 24G bytes per second (8 bits in 1 byte = 192G bits per second).

      Comprende?

      Something tells me the author can't tell his volt from his watt.
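The parent's arithmetic checks out, as a quick sketch shows:

```python
BITS_PER_BYTE = 8

def gbytes_to_gbits(gb_per_sec):
    """Convert GB/s to Gbit/s; mixing the two up is an 8x error."""
    return gb_per_sec * BITS_PER_BYTE

# The article's 24 GB/s aggregate figure really is 192 Gbit/s,
# a factor of 32 above the 6 Gbit/s it also quotes.
print(gbytes_to_gbits(24))  # 192
```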
  • Inaccurate summary (Score:4, Informative)

    by Khyber (864651) <techkitsune@gmail.com> on Monday December 03, 2007 @03:21PM (#21562975) Homepage Journal
    It should be noted that it's 6 Gbit/s per pin, not per chip.
    • by Intron (870560) on Monday December 03, 2007 @03:47PM (#21563289)
      That would be implied by the aggregate 24G bytes/sec later in the article. So I guess they are still keeping the 32 I/O pins that the previous generation used and not quite doubling the speed. The article is also missing the size, which is a spec that hardware designers frequently wish to see, but I think it's probably still 512 Mbit or we would have heard about it.
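A quick sketch reconciling the two figures, assuming the 32-pin guess above (the article does not state a pin count):

```python
per_pin_gbps = 6   # Gbit/s per I/O pin, from the article
io_pins = 32       # assumed: carried over from the previous generation

aggregate_gbits = per_pin_gbps * io_pins  # 192 Gbit/s per chip
aggregate_gbytes = aggregate_gbits / 8    # 24.0 GB/s, matching the article
print(aggregate_gbytes)  # 24.0
```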
  • Rumors (Score:5, Interesting)

    by Dr. Eggman (932300) on Monday December 03, 2007 @03:32PM (#21563103)
    This lends a bit of credence to the rumored NVidia G9 series launch [beyond3d.com], although I still think February is unlikely.
    • My older ATI x1950 XTX card came with 512MB of GDDR4 while the newer nVidia 8800GT uses GDDR3. Obviously the 8800GT is the faster card in every way by comparison. That said, what do you bet there will be another G8 series using the new GDDR5?
  • by UnknowingFool (672806) on Monday December 03, 2007 @03:32PM (#21563111)
    Will this make my porn look better? On second thought, I don't want to see Ron Jeremy any better. [shudders] :P
  • by randyest (589159) on Monday December 03, 2007 @03:35PM (#21563129) Homepage
    Samsung said the new chips consume 1.5 volts, making them about 20 percent more efficient than GDDR 3 chips.

    What poor science reporting. Nothing "consumes volts." Volts measure voltage -- difference in potential. Devices consume Joules -- units of energy. Also acceptable would be Watts -- energy per unit time. It would have been really nice to be given the Watts per Bandwidth per Size (W/Gbps/bits), but I realize that's asking way too much of the Times.
    • Samsung said the new chips consume 1.5 volts, making them about 20 percent more efficient than GDDR 3 chips.

      What poor science reporting. Nothing "consumes volts."

      No no, you misunderstand. The author of the article meant that this chip can eat one standard AA battery. After that, it gets full.

      *Results may vary for AAA, C, and D batteries. B batteries are only a myth, anyone who believes in them probably also believes that P=IE is some EE's recipe for a tasty tart.

      • by plague3106 (71849)
        Actually, they did make B batteries [wikipedia.org].
        • "B" batteries in the context of vacuum tube circuits refer to their function, not their form, so they are not contemporaries of AAA-, C- and D-sized cells
          • by plague3106 (71849)
            If you read some of the other articles on the subject, it sounds like A and B at least were almost always of a certain size as well.

            The sizes today don't just dictate form, they also tell you other characteristics such as voltages.
    • by epine (68316) on Monday December 03, 2007 @05:06PM (#21564209)
      This has been understood in the industry for decades: in a given silicon process, power consumption fits roughly within the envelope V^2 * F, where F represents frequency. Given a process shrink, this relation might or might not hold true. For a long time it was a good rule of thumb; then, in the era of excessively high leakage current, it did not hold true; more recently, with better control over leakage, the relationship is again a good rule of thumb. The upshot is that, over two decades, almost every reduction in voltage for a given class of part corresponds to a significant increase in power efficiency.

      What the article failed to explain is this long history of voltage serving as a proxy for power efficiency.

      The other relationship is that a given part will usually demonstrate a relationship where lower frequencies are stable at lower voltages. If increasing the voltage by 20% allows you to overclock a processor from 2GHz to 3GHz, you can estimate your increased power draw as 1.2^2 * 1.5, about double where you started.

      It's almost pointless to convert this measure into watts, as so many other variables change in tandem. The new part has different bandwidth, different latency, different leakage, different dynamic consumption. There's no simple number that gets you apples vs apples. Most of the time, however, voltage is a fair proxy. Peak consumption figures are mostly worthless from an efficiency perspective, except for sizing your power and cooling requirements.

      On a side note, I'm wondering when we hit the floor on practical CMOS voltage levels. Surely the band-gap will come into play in the near future, and then what? Does the efficiency graph suddenly develop a crimp and stagger forward on a much reduced slope? This happened with hard drives, where there was a period of accelerated capacity increase (PRML/GMR/pixie-dust era) only to return to the more sedate curve once again later on. It wasn't long ago that F hit thin air (due to thermal issues) and now F is increasing at half the rate it sustained for at least a decade prior.

      Long ago, apparently respectable sources used to proclaim "silicon will hit the brick wall at 0.1um". It turns out S-curves hardly ever play out that way. The curve begins to taper downward when the easy gains are exhausted. The phrase "peak oil" is another one of those conceptual nightmares, much like the chimeric brick wall on photolithography. It's not going to be a peak, is it? It's going to be a wavy plateau. On any particular graph, you can point to a "peak" (though none of the graphs will agree); it's just that there won't be a momentous Alderaan-disturbance that ripples through planet Earth as the precocious metaphor suggests. Much like the silicon people had to finally confess, driving F higher and higher as your primary performance metric (at the cost of absolute efficiency) makes about as much sense in the long run as a single-occupancy air-conditioned Hummer in rush hour traffic.

      Speaking of which, engine displacement is roughly as fair a measure in the automotive sector as voltage is in silicon. It's the nature of the internal combustion engine that these engines are far from their peak efficiency at low to medium throttle, which is why having a lot of power you rarely use is no free lunch. If you accept that a typical 2 liter engine is more efficient than a typical 3 liter engine, why would voltage as a proxy for power be any different?
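The V^2 * F rule of thumb and the overclocking arithmetic above can be sketched as follows (a rough proxy, as the parent stresses, not a real power model):

```python
def relative_power(v_ratio, f_ratio):
    """Dynamic power scales roughly as V^2 * F, so the relative change
    is (V_new/V_old)^2 * (F_new/F_old)."""
    return v_ratio ** 2 * f_ratio

# +20% voltage to overclock from 2 GHz to 3 GHz:
print(relative_power(1.2, 3 / 2))  # about 2.16, i.e. roughly double the draw
```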
      • by randyest (589159)
        It's almost pointless to convert this measure into watts, as so many other variables change in tandem. The new part has different bandwidth, different latency, different leakage, different dynamic consumption. There's no simple number that gets you apples vs apples. Most of the time, however, voltage is fair proxy. Peak consumption figures are mostly worthless from an efficiency perspective, except for sizing your power and cooling requirements.

        Why does it have to be just one "simple" number? How about
      • by dkf (304284)

        On a side note, I'm wondering when we hit the floor on practical CMOS voltage levels. Surely the band-gap will come into play in the near future, and then what?

        Actually (IIRC) you start to get into trouble at about 1V as you hit problems getting the signal off the chip. On the other hand, if they can run different parts of the chip at different voltage levels, they may be able to reduce things quite a bit further. That would probably require a redesign of the motherboard/memory interface though, since putting power regulators on the memory modules seems like a poor idea to me, so that may be a while coming.

        • by randyest (589159)
          They can (and do) create different voltages on a chip from one higher supply voltage. No "power regulator" required.
    • Hey, if people couldn't abuse units of measurement then Han Solo would never have been able to do the Kessel Run in less than 12 parsecs.
  • But what exactly about this story is novel? The article was pretty bland. With Moore's law, anybody can pretty much predict that chipsets and RAM are going to get faster. Now what would have been interesting is to see price comparisons, predictions on what this will mean for the consumer, testing of the product, discussion of the architecture, insight into the manufacturing process, etc. I guess it's just the fact that it was from a mainstream source, but this article was about as useful as a press release.
    • by BPPG (1181851)
      I agree.
      Incidentally, I found reading about Colossus, and its older hardware, more interesting.
  • How many people here keep up with the latest and greatest graphics card? I know that the newest may get you more fps in some game, but are there really that many people who regularly go spend $400+ for the latest and greatest in graphic cards?
    • by ThreeGigs (239452)
      NVidia 8800 GT can be had for about $200 to $250 depending on the bundle, and is currently rated the best card to have, beating out even the 8800 GTS and other $400+ cards.
    • I don't, but mostly because I've had an AGP computer for the last few years.

      But now with my new system, I do intend to upgrade the graphics card every 2 years or so. I won't get the latest and greatest though; that's usually not worth it. It's often just factory-overclocked cards anyway.
    • by WK2 (1072560)
      I paid over $100 for a graphics card in 2003. I paid $30 for my graphics card about a year and a half ago. It is a Nvidia 5500 FX 128 MB.

      I'm not a hard core gamer though.
    • by dh003i (203189)
      I might get the ATI Radeon 3850 due to its Avivo video processing, which upscales 1080p to as high as 1600p. Of course, for that to be useful, I'd have to get one of the 30" 2560x1600 monitors.

      But as for keeping up with the latest gaming performance GPU, that's not much of a concern (hence why I go for the lower 3800 series card). I would even go with a 2600 if not for the poor reviews on HDTV upscaling, noise-reduction, sharpness etc. But as far as games go, I play the Descent series, the Tomb Raider series, and
    • by jsoderba (105512)

      Cards of the same generation often use the same memory tech across the entire price range. All GeForce 8xxx cards use GDDR3, for example. Even when this is not the case (GeForce 7 used both GDDR2 and GDDR3), at the speed graphics tech moves, high-end tech tends to trickle down to the mid-range in less than a year, so it's still interesting for more frugal readers.

  • What is the CAS Latency of said chips? The article didn't say.
  • In addition to faster data delivery the new chimps also claim to consume less energy than previous versions. Could be. Chips Outscore College Students on Memory Test. Most likely. I'm getting tired.
  • by srealm (157581)
    Missing Floppies?
  • How come we never see the insides of graphics cores the way that CPU manufacturers release pics of the internals of their CPUs? I looked and couldn't find any. Now that AMD makes both CPUs and GPUs, will it start to do so? What gives?
    • Probably because CPUs are devices that are meticulously planned and designed over the course of years by a team of humans (i.e. nice looking dies) whereas GPUs have a tendency to be computer-generated in months (i.e. messier, bigger, less efficient) and most likely not pretty to look at...

      Only when GPU manufacturers start to worry about efficiency do I figure we'll start seeing prettier dies... And graphics cards that don't consume hundreds of watts.
  • The summary reminds me of a commercial from several years back that purportedly allowed you to give your own car a jump start through the cigarette lighter. One of the selling points for the device was how it was supposedly better than the actual battery in your car. To prove it, they hooked a multimeter up to each one, while the voiceover said, "Your normal car battery has only 12 volts of energy. But {our product} has 48 volts of energy!"

    This was almost as good as the one where one of those ionic air filters...
