New Thunderbolt Revision Features 20 Gbps Throughput, 4K Video Support 301

hooligun writes "The next-gen Thunderbolt tech (code-named Falcon Ridge) enables 4K video file transfer and display simultaneously in addition to running at 20 Gbps. It will be backward-compatible with previous-gen Thunderbolt cables and connectors, and production is set to ramp up in 2014. An on-stage demo with fresh-off-the-press silicon showed the new Thunderbolt running 1,200 Mbps, which is certainly a step up from what's currently on the market."
  • by AtomicSymphonic ( 2570041 ) on Monday April 08, 2013 @09:07PM (#43397421)

    So, will we see OEM Windows PCs come by default with Thunderbolt ports? Or is this another fantastic, magical, extraordinary Apple Inc. exclusive?

    • by OhANameWhatName ( 2688401 ) on Monday April 08, 2013 @09:11PM (#43397449)

      So, will we see OEM Windows PCs come by default with Thunderbolt ports? Or is this another fantastic, magical, extraordinary Apple Inc. exclusive?

      You wouldn't seriously risk upgrading to Windows 8 just to be able to use 20 Gbps external connections, would you???

    • by UnknowingFool ( 672806 ) on Monday April 08, 2013 @09:27PM (#43397537)
      I don't know what you've been smoking, but Thunderbolt is an Intel invention. They worked with Apple on implementation, with Apple's most obvious contribution being the VESA-compliant Mini DisplayPort connector. For their efforts, Apple got a good six-month lead on their competition, as they had products the day Intel released the specs. Incidentally, Apple got the Thunderbolt trademark and then transferred it to Intel.
      • Apple has confusingly named their new iDevice connector "Lightning," so I think people can be forgiven for assuming Thunderbolt and Lightning are from the same company.

    • by Guppy06 ( 410832 )

      Would even that be enough? I have a motherboard with Firewire ports, but the only use I get out of them is when I need to connect my sister's MacBook for some reason.

      I've got more USB 3.0 devices than I do Firewire.

      • A counter-example: Last time I bought a desktop computer, I went out of my way to buy one with FireWire (FireWire 400 was cutting edge at the time). I then proceeded to buy an external hard drive for routine backups, and later daisy-chained an iPod 4G off it. It beat the pants off the USB cable that the iPod came with, at the time.

        I had no idea what I was going to do with it at the time, though.

        This time, I went with one of the Gigabyte Thunderbolt motherboards. God only knows what I'm going to do with it, but...

        • You're saying FireWire was great because you could plug a whopping 2 devices into it? And both those devices had USB connectors, which everything else used?

          I'm not saying that FireWire isn't technically better than USB for several things.
          It's just a poor reason to say it's great because you managed to find something to plug into it.
          FYI, I have never used my FireWire ports - nothing I have uses them.

          Is a separate technology really required just for hard drives? Not really, which is why USB 'won'.

          • by dbIII ( 701233 )
            FireWire can do a lot more than two devices, was initially vastly faster, and could do reliable streaming (that used to be VERY important with video cameras, for one thing, and even USB CD-ROM burners used to be shit while FireWire ones worked, thanks to the reliable streaming). USB won because it was a lot cheaper and usually good enough.
          • FireWire was great because USB2 had a problem of shared bandwidth, and the sliver of my total 480 Mb/s allotted to my backup drive and iPod made FireWire's 400 Mb/s of intelligently allotted bandwidth a lot more compelling.

            On top of that, by the time I got the iPod, the Pentium 4 was showing its age; anything that made the whole experience more responsive was appreciated.

            Plus, I was quite an impatient little tyke at the time. I haven't gotten all that much better, but the technology sure has.

        • The next generation of graphics cards will want at least a PCIe 2.0 x16 link or a 3.0 x8 one. Thunderbolt is way under that; it's not even PCIe 2.0 x8.

          • I'm throwing science at the wall to see what sticks.

            I actually wouldn't be entirely surprised if the oil-immersion GPU happens eventually, though.

      • by jtownatpunk.net ( 245670 ) on Monday April 08, 2013 @10:09PM (#43397735)

        I used to use firewire all the time back when I used to do a lot of video editing around the turn of the millennium. The first generation of USB was so bad that I didn't even consider USB2 for my external storage. Firewire, OTOH, was a rock. Never had a device just disappear for no reason. Throughput was better, CPU load was lower, isochronous transfer was possible. Night and day. Like comparing a Lexus to a Yugo.

        Of course, now all my stuff is USB because firewire components are so rare and I have no need to move devices between computers. I've got gigabit ethernet to move files and I don't need to move a single optical drive between multiple machines. And USB is much more reliable than it used to be. My new gaming rig has two firewire ports but I haven't used them. Neither of my laptops has a firewire port and I haven't missed them. Thunderbolt seems like a solution to a problem that no longer exists [in my world].

    • Magical and exclusive? Do you mean like when Windows PCs started to ship with FireWire ports?

    • by smash ( 1351 )
      It's not a mass-market consumer technology. It's the modern-day equivalent of SCSI.
    • by tlhIngan ( 30335 ) <slashdot&worf,net> on Monday April 08, 2013 @11:59PM (#43398317)

      So, will we see OEM Windows PCs come by default with Thunderbolt ports? Or is this another fantastic, magical, extraordinary Apple Inc. exclusive?

      There are laptops coming with Thunderbolt.

      Sony's integrated one, where the "mobile" mode uses standard Intel HD 4000 graphics for low power, but you can then dock it (via a proprietary USB connector - grr...), which adds a Blu-ray optical drive AND a decent GPU to the mix. Some company is actually making a PCIe enclosure so you can drive an external card through Thunderbolt.

      Heck, that's one of Thunderbolt's interesting applications - you can wire up a PCIe video card to it and have powerhouse graphics that suck down the watts, but easily unplug when you don't need it. Essentially, it's a form of hot-pluggable PCIe. And it lacks all the funkiness that USB adapters typically entail.

      A Thunderbolt serial port, while overkill, will present itself to your laptop as a NATIVE serial port - no messing with iffy USB serial adapters - this works just like a built-in serial port because it is using the standard buses your PC expects. As far as anyone is concerned, it hooks straight to the PCIe bus, and does normal PCIe things, and other than some minor hardware bridging, acts like it's plugged into an internal PCIe bus.
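      To make the "native PCIe" point concrete, here is a minimal sketch (my own illustration, not from the parent post; it assumes a Linux machine with sysfs, and the device address in the comment is hypothetical): a Thunderbolt-attached controller enumerates like any other PCIe function, so a plain walk of the PCI bus lists it alongside the built-in ones.

        import os

        # Every PCI/PCIe function -- Thunderbolt-attached or not -- appears
        # under the same sysfs tree; there is no separate "external" bus.
        PCI_ROOT = "/sys/bus/pci/devices"

        for dev in sorted(os.listdir(PCI_ROOT)):
            with open(os.path.join(PCI_ROOT, dev, "class")) as f:
                dev_class = f.read().strip()
            # e.g. "0000:07:00.0 0x070002" -- a serial controller, looking no
            # different from one soldered to the motherboard.
            print(dev, dev_class)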

  • Apple has pissed off all the other CE manufacturers. There will be nothing to plug the other end into.

    Without general support great features are worthless. Apple is repeating Sony's mistake with betamax. They won't share, thus it will fail.
    Great technology without support is worthless.
    • by Radagast ( 2416 ) on Monday April 08, 2013 @09:24PM (#43397505) Homepage

      That's simply false. There's a large number of Thunderbolt accessories, including video gear, PCIe expansion chassis (very useful for laptops), and docks. Sonnet just announced this Thunderbolt dock [sonnettech.com], which seems to be a pretty great deal for laptops.

      • by jedidiah ( 1196 )

        ...for differing values of large.

        I am sure that you would also claim that Firewire rules the world too.

        With all of the costs and extra gear involved, you might as well just have another PC. The real problem here isn't that Apple laptops are lame, but that there isn't a seamless experience between different devices on an Apple network.

        The Cloud concept there doesn't quite live up to the hype.

        • It was released as a spec just over 2 years ago. It requires a different connector. Considering most motherboards came with LPT ports up until a few years ago and I can't remember using one in almost twenty years, I would say that rapid adoption of new technology may be lacking for some manufacturers.
        • by Radagast ( 2416 )

          No, Firewire is pretty much dead, although it was good for a while.

          It seems to me that Thunderbolt has had faster and wider adoption than Firewire did over the same time after introduction. Thunderbolt is also a lot more useful than Firewire, since it's essentially PCIe over a serial cable. It's fairly trivial to adapt existing PCIe drivers to run the same hardware as an external TB box (or the PCIe card in a TB PCIe chassis), so it's very flexible.

          Basically, TB finally delivers on the ancient promise of a...

      • I want one but, srsly, the cost of that thing is ridiculous. $400 for the base option, which is essentially an extravagant USB 3.0 hub and DVD-RW with some bells on it.

      • by putaro ( 235078 )

        Unfortunately it doesn't "just work".

        I have a MacBook Pro 17" with Thunderbolt that I mainly use to hook up an external monitor (Thunderbolt->DVI with a KVM switch in between).

        I picked up a LaCie Thunderbolt-SATA adapter to mess with. Plugged it in between the laptop and the KVM switch. Oops. Video quality goes to hell. If I pull the KVM it works better, but that kind of screws up my desktop.

        It would have been nice if Apple had put two (or more) Thunderbolt ports on the machine but, hey, all you need is one.

    • by SuperKendall ( 25149 ) on Monday April 08, 2013 @09:30PM (#43397541)

      What could I connect this to?

      Several RAID arrays, gigabit ethernet, multiple monitors, misc external storage (like single disks or a DROBO).

      All with one connector...

      Yes Thunderbolt stuff was slow to come out, but the rate of arrival has picked up.

      • by ADRA ( 37398 )

        Does each of the devices get its own DMA signalling, or are you crippled to only one device being fast at a time? How does sharing of the pipe work? I imagine that this -could- be a great step in the right direction, but they need a lot more than just a raw fat pipe to make multiple peripherals fast and responsive.

      • Also, external graphics cards.
      • What could I connect this to?

        Several RAID arrays...

        I wouldn't suggest it. It'll only take two SSDs to saturate a Thunderbolt bus (or 4 SSDs with 20 Gbit Thunderbolt).

      • by Sycraft-fu ( 314770 ) on Tuesday April 09, 2013 @03:28AM (#43399019)

        Thunderbolt has 2 lanes of PCIe 2.0 (this new version changes that to 3.0): a 10 Gbps raw data rate, around 8 Gbps effective. It also has one channel of DisplayPort 1.1a.

        So in terms of non-display devices, that means one RAID array of reasonably fast drives can easily overload it. I dunno about you, but my RAID controllers usually hang off of x4-x8 slots. One good SSD can kill half of that on its own. A 10-gig NIC is more than it can handle (look in the thread for a post by someone who implements those). In terms of display, DP 1.1a has enough bandwidth to get you 2560x1600@60fps. Tack on a second display at that rez? Well, you don't have enough bandwidth anymore, so you are going to have to reduce rez or framerate.

        Or you could always, you know, have more than one connector and not bitch.

        Seriously, the one-connector thing seems a little silly to me. A marketing solution looking for a problem. Yes, it'll work fine for the kind of stuff Apple likes to do: a laptop connected to a monitor, which then provides USB ports n' such, all over one connector. Ya. Great. Not really that big a deal.

        It is not something, at least at present, where you can effectively hang a bunch of shit off one connector and get high performance.
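        To put rough numbers on that, here is a back-of-the-envelope sketch of the figures above (my own arithmetic; the 8b/10b encodings are the standard ones, display blanking is ignored, and the ~500 MB/s SSD figure is an assumption):

          GBPS = 1e9

          # PCIe side of first-gen Thunderbolt: 2 lanes of PCIe 2.0 at 5 GT/s,
          # with 8b/10b line encoding (20% overhead).
          pcie_usable = 2 * 5 * GBPS * 8 / 10           # ~8 Gbps effective

          ssd = 500e6 * 8                               # one fast SATA SSD, ~4 Gbps
          print(f"PCIe usable: {pcie_usable / GBPS:.1f} Gbps")
          print(f"SSDs to saturate it: {pcie_usable / ssd:.0f}")

          # Display side: DP 1.1a is 4 lanes at 2.7 Gbps, also 8b/10b encoded.
          dp_usable = 4 * 2.7 * GBPS * 8 / 10           # ~8.6 Gbps
          one_display = 2560 * 1600 * 24 * 60           # 2560x1600 @ 60 Hz, 24 bpp
          print(f"One 2560x1600@60 display: {one_display / GBPS:.1f} of "
                f"{dp_usable / GBPS:.1f} Gbps")
          # Two such displays (~11.8 Gbps) don't fit -- hence the rez/framerate cut.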

    • Repeat after me: Thunderbolt is an Intel technology. It is the copper version of Light Peak. Apple was the first to have it in their products because they worked with Intel on a number of things, like the Mini DisplayPort connector. Thunderbolt == Intel.
    • by tyrione ( 134248 )

      Apple has pissed off all the other CE manufacturers. There will be nothing to plug the other end into. Without general support great features are worthless. Apple is repeating Sony's mistake with betamax. They won't share, thus it will fail. Great technology without support is worthless.

      Don't own any professional equipment or work within the NAB world? Do some more research. More and more manufacturers are jumping on-board. https://thunderbolttechnology.net/products [thunderbol...nology.net]

    • by jo_ham ( 604554 )

      Apple has pissed off all the other CE manufacturers. There will be nothing to plug the other end into.

      Without general support great features are worthless. Apple is repeating Sony's mistake with betamax. They won't share, thus it will fail.

      Great technology without support is worthless.

      How are Apple "not sharing"?

      The technology is owned by Intel, and was developed with the assistance of Apple. Apple also licensed (for free, in perpetuity) the Mini DisplayPort connector and port that TB uses. Intel have been promoting it since launch.

      Not really seeing how this is Apple not sharing, given that it's not Apple who actually controls the technology, but such is the way with /. posts - the barest tissue of lies covering the bare bones of the truth. The exclusivity deal was over more than a year ago.

  • by iamwhoiamtoday ( 1177507 ) on Monday April 08, 2013 @09:25PM (#43397515)

    But in the end, it all comes down to cost. Current Thunderbolt displays are rather expensive. Heck, I picked up a dual-link DVI monitor of the same resolution for $275 on eBay! Why pay three to four times as much for something with only a few extra bells and whistles added on?

    Thunderbolt, overall, is great in terms of performance, but it just seems to be well beyond what most folks are willing to pay. It's like that guy who brags about how "My car has a Turbo Kit option from the dealer" but he NEVER SPENDS THE MONEY TO GET IT.

    The external drives, the only situation that I'd actually be interested in, are also stupid expensive. In the long run, you're just better off either using eSATA or USB3, or internalizing the drives. Same goes for daisy-chaining monitors. Want to run tons of monitors? Install more video cards! woo.

    no more coffee for me after 5pm, k? ._.

    • You're looking at this from the narrow viewpoint of a monitor connector. Thunderbolt can replace a laptop dock with a single cable. So instead of hooking up USB, Ethernet, and monitor cables, just hook up one TB cable. It's not there yet, as the specifications were released just over two years ago.
    • You can't connect a 4K monitor to a dual-link DVI connector. You need 4 DVI channels for that.

    • The external drives, the only situation that I'd actually be interested in, are also stupid expensive. In the long run, you're just better off either using eSATA or USB3, or internalizing the drives. Same goes for daisy-chaining monitors. Want to run tons of monitors? Install more video cards! woo.

      I'll get right on that, as soon as I figure out how to stuff more hard drives and more video cards into my smart watch.

      The impetus for Thunderbolt isn't to do existing jobs for today's technology. Rather, I get the distinct impression that it's being over-engineered to be the standard of choice for the next few decades. No, we don't need 20 Gbps throughput and 4K video now, but when you walk into a friend's living room and plug your watch into his video game display, it's going to need the bandwidth to stream it.

      • by smash ( 1351 )
        Depends what you're doing. If you want to hook up to a fibre-channel SAN or a 10-gig network port (1GbE will get nowhere near saturating your SSD), nothing else will cut it. There already exist use cases for Thunderbolt today. They just aren't home-user scenarios.
    • by Sycraft-fu ( 314770 ) on Tuesday April 09, 2013 @03:16AM (#43398983)

      Originally Light Peak was supposed to basically just be an external PCIe bus (and it could be internal too). The idea was a connector for things that need lower overhead than USB, and also, hopefully, eventually a single connector for all kinds of things. With the original goal of 100 Gbps, that would have been realistic (optical was the original interface design).

      However, things got changed pretty quickly, in part for cost reasons, but also because Apple got involved (meaning gave Intel money). Apple is obsessed with fewer cables, because cables = evil in their mindset. So it got changed to be display + PCIe on one cable.

      That had negative implications for the bandwidth, but also for the cost and ability to implement it. If it was just PCIe, then a PCIe-to-Thunderbolt card would be really feasible, and you could add a Thunderbolt port by hanging it off the PCIe bus. However, with display integrated, it needs to work with the integrated display adapter and all that jazz.

      Ultimately more cost, and thus less interest. While some Apple types might salivate over the prospect of one cable that goes from a laptop to a monitor, and then a bunch of non-monitor ports on that monitor, most people don't care.

    • by smash ( 1351 )

      Except for those use cases where it isn't good enough. Which is where Thunderbolt is used. People seem to be expecting it to be as ubiquitous as USB or SATA, which is not ever going to happen, because it's not a cheap, CPU-driven, consumer-oriented bus.

      It exists so that users of portable machines can plug in high speed peripherals. Not single external hard drives, but arrays, fibre channel, external GPUs, etc.

      Most users don't do that. And that's fine. But if you DO need to do that, then USB3 just won't cut it.

      • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday April 09, 2013 @07:52AM (#43399997) Homepage Journal

        It exists so that users of portable machines can plug in high speed peripherals. Not single external hard drives, but arrays, fibre channel, external GPUs, etc.

        Most users don't do that.

        I feel you have misstated the case by way of apologia. Most portable machines powerful enough to be worth plugging in anything more than a single storage device have more than just one port, so users don't need to plug everything into one port. Using a single cable would be a cool feature, but of the vanishingly few people plugging arrays into laptops, vanishingly few of them need a single cable. Slightly more are willing to pay for the privilege and save a few seconds they weren't going to use anyway, but still not enough to justify the cost of the feature.

        The simple truth is that you can build a whole storage server on GigE for less than the cost difference to buy a machine with thunderbolt and external devices. That means it's overpriced, full stop.

  • watch units please (Score:5, Insightful)

    by v1 ( 525388 ) on Monday April 08, 2013 @09:32PM (#43397561) Homepage Journal

    Mbps != MBps

    Please stop doing that in article summaries. When you start getting up into large numbers like that, you can't just expect everyone to "read what you meant to say."

  • Is it me, or does it look like the FireWire scenario playing out all over again? Only this time it's not only USB, but also the upcoming WiGig jointly locking it into a small niche...
  • by GrumpyOldMan ( 140072 ) on Monday April 08, 2013 @09:59PM (#43397679)

    I do 10GbE drivers, and the previous generation of tbolt did not really offer 10Gb/s of usable bandwidth to PCIe devices; it was more like 8Gb/s:

    If you recall, tbolt muxes PCIe and DisplayPort. On the PCIe side, the thunderbolt bridge passed 2 lanes of Gen2 PCIe through to devices. Since Gen2 is "5GT/s" per lane, you'd think you'd have 10Gb/s. But not really, as "10Gb/s" does not take into account PCIe overhead, which can be about 20% of the data transfer rate. So on the original "10Gb/s" thunderbolt, you were lucky to get a 7Gb/s transfer rate from a 10GbE NIC, once you also add in network protocol overheads.

    Having a bus-constrained NIC leads to all sorts of weird problems when receiving data. With flow control disabled in combination with bursty transfers, you often see far less than the 7Gb/s peak, as TCP hunts around to find the constraint and recovers from frequent packet loss events.

    It sounds like they've built the new part from 2 lanes of Gen3 PCIe, which should be good for ~16Gb/s of usable bandwidth. This is a very welcome change, as 16Gb/s should be enough for a single-port 10GbE NIC running at full speed, and a disk controller talking to a fast SSD or an external RAID array that can deliver ~750MB/s (bytes) of I/O.

    Just don't try to use a bonded 2-port 10GbE NIC, or you're back at the bandwidth-constrained problem.
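    As a sanity check on those figures, here is a minimal sketch of the line-encoding math (my own illustration; Gen2's 8b/10b and Gen3's 128b/130b encodings are standard, TLP/header overhead is ignored, and "2 lanes of Gen3" for the new part is the parent's inference, not a confirmed spec):

      def usable_gbps(lanes, gt_per_s, enc_num, enc_den):
          """Payload bandwidth after line encoding; packet overhead ignored."""
          return lanes * gt_per_s * enc_num / enc_den

      gen2 = usable_gbps(2, 5, 8, 10)       # original Thunderbolt: ~8 Gb/s
      gen3 = usable_gbps(2, 8, 128, 130)    # assumed new part: ~15.8 Gb/s
      print(f"2x Gen2: {gen2:.1f} Gb/s, 2x Gen3: {gen3:.1f} Gb/s")

      # A single-port 10GbE NIC (10 Gb/s) or a ~750 MB/s array (6 Gb/s) fits
      # in ~15.8 Gb/s; a bonded 2-port 10GbE NIC (20 Gb/s) does not.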

    • Do you have any insight into why they even bother with TB when 10Gb Ethernet already exists and has been deployed for ages? I.e., why not just use 10GbE instead?
      It seems like reinventing the wheel for no real gain.

      • Re: (Score:3, Informative)

        by Strider- ( 39683 )

        Do you have any insight into why they even bother with TB when 10Gb Ethernet already exists and has been deployed for ages? I.e., why not just use 10GbE instead?
        It seems like reinventing the wheel for no real gain.

        When all you have is a hammer...

        The main reason for using Thunderbolt over 10Gb Ethernet is that one has a fairly significant protocol overhead (Ethernet), while the other is primarily a bus protocol and operates at a much lower level than Ethernet does. Each has its strengths and weaknesses; each has its application.

      • by smash ( 1351 )
        Because 10GbE doesn't expose PCIe to your peripherals. Thunderbolt isn't JUST for 10GbE NICs, although that is a popular high-bandwidth application that no other external connector can currently provide.
    • Using PCIe 3.0 on the QM77 chipset for Thunderbolt will cut video down to x8 3.0, and some boards may use switches to get full use out of the PCIe lanes.

      http://www.intel.com/content/www/us/en/chipsets/performance-chipsets/mobile-chipset-qm77.html [intel.com]

      But only 2 lanes on the TB side still makes it useless for video cards.

  • Where are the data-only cards? A Mac Pro may need some kind of voodoo like a loopback cable, as the dual-Xeon systems don't have onboard video as part of the CPU/chipset.

    Some boards do have onboard PCI-33-based video, mainly server boards.

  • by fahrbot-bot ( 874524 ) on Monday April 08, 2013 @10:53PM (#43397979)

    ... Thunderbolt tech enables 4K video file transfer and display simultaneously in addition to running at 20 Gbps. It will be backward-compatible with previous-gen Thunderbolt cables and connectors ...

    And even faster with gold-plated Monster cables/connectors!

  • by bertok ( 226922 ) on Tuesday April 09, 2013 @01:48AM (#43398679)

    Why are manufacturers coming out with almost-but-not-good-enough connector standards, one after another?

    Both tablets and TVs are leaving PC displays in the dust, and new PC connector standards that aren't even available yet already don't have the required bandwidth to support displays that are coming to market now, let alone in the future!

    For example, support for full 4K video over 20 Gbps is bullshit, because some aspect of the full spec has to be abandoned:

    Resolution: 3840 x 2160
    Bits per pixel: 30 or 36 (10 or 12 bits per color channel)
    3D or High Framerate: 120 fps

    This adds up to: 3840 * 2160 * 30 * 120 = 29.8 Gbps.

    Sure, you can drop the framerates, but then expect to have a headache viewing 3D. The bit-depth can be lowered, but then expect visible banding when using gamuts that are wider than sRGB. The resolution can't be lowered, because calling 3840 pixels "4K" is already a stretch.
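    Checking the parent's arithmetic (a sketch; blanking intervals and link overhead are ignored, so real requirements run a bit higher):

      width, height = 3840, 2160
      bpp = 30                  # 10 bits per color channel
      fps = 120                 # stereo 3D (2 x 60) or high-framerate 2D

      print(f"{width * height * bpp * fps / 1e9:.1f} Gbps")   # ~29.9, over 20 Gbps

      # 8-bit color at 60 fps fits with room to spare:
      print(f"{width * height * 24 * 60 / 1e9:.1f} Gbps")     # ~11.9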

    • 3D or High Framerate: 120 fps

      Sure, you can drop the framerates, but then expect to have a headache viewing 3D.

      Only 60 fps is required for "3D" at 30 fps.

      72 fps is about the maximum rate that enhances the user experience; above that, the eyes and brain don't perceive a difference (for a 2D display).

      20 Gbps is enough for this.
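      Plugging those rates into the grandparent's own formula (same assumptions: 30 bits per pixel, overheads ignored):

        pixels = 3840 * 2160
        for fps in (60, 72):
            print(fps, "fps:", round(pixels * 30 * fps / 1e9, 1), "Gbps")
        # 60 fps: ~14.9 Gbps, 72 fps: ~17.9 Gbps -- both under the 20 Gbps link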

    • 3D or High Framerate: 120 fps

      Huh? Most video is at 24 fps. Even if we generously triple that, we're nowhere near 120 fps. Hobbit 3D's highest encode was 48 fps, which was considered super high quality, and most theaters were still at 24.

      • 3D is made of two frames, one for each eye, so to display 48 fps 3D stereo you usually need as much bandwidth as for displaying 96 fps.
