Hardware Technology

USB 4 Will Fully Support DisplayPort 2, Including 8K HDR Monitors (engadget.com) 78

VESA has announced that USB 4 will fully support the massive bandwidth available for the DisplayPort 2.0 standard, including support for 8K 60Hz HDR or even 16K 60Hz monitors. Engadget reports: Since USB 4 works at 40Gbps and DisplayPort 2.0 supports 80Gbps speeds, how will this work? USB 4 can actually send and receive at 40Gbps at the same time, so VESA took advantage of that with a new spec called DisplayPort Alt Mode 2.0. Since DisplayPort is primarily used for video, which only sends data one way from your PC to a monitor, the Alt Mode 2.0 standard remaps USB-C's data pins to work in one direction only -- giving you double the speeds.

According to Anandtech, Alt Mode 2.0 will support regular USB 4 cables. At the same time, monitors won't need to have USB 4 controllers, which should simplify display designs. Since it also supports the Thunderbolt 3 standard, USB 4 will become a universal connection standard for both smartphones and PCs, supporting things like "docking, gaming, AR/VR HMDs, and professional HDR displays," VESA said.
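
As a rough sanity check on those headline numbers, here is a back-of-the-envelope calculation of the raw pixel data rate for the modes mentioned (a sketch that ignores blanking intervals, link-encoding overhead, and Display Stream Compression, all of which matter in practice):

```python
# Raw, uncompressed pixel rate for a given display mode.
def raw_gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(f"8K  @ 60 Hz, 10-bit HDR: {raw_gbps(7680, 4320, 60):6.1f} Gbit/s")   # ~59.7
print(f"16K @ 60 Hz, 10-bit HDR: {raw_gbps(15360, 8640, 60):6.1f} Gbit/s")  # ~238.9
```

So 8K 60Hz HDR fits comfortably within the one-directional 80Gbps link (roughly 77 Gbit/s of usable payload after encoding overhead), while 16K at 60Hz only works with Display Stream Compression.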


Comments Filter:
  • by Burdell ( 228580 ) on Thursday April 30, 2020 @08:07PM (#60009440)

    Wasn't USB 3 supposed to be the universal standard? Then they called it USB 3.0, gave us USB 3.1, renamed USB 3.0 to USB 3.1 gen1 and made the real USB 3.1 as USB 3.1 gen2, then gave us USB 3.2 and re-renamed 3.0/3.1 gen1 to USB 3.2 gen1x1, added USB 3.2 gen1x2, renamed USB 3.1 gen2 to USB 3.2 gen2x1, and made what should have been USB 3.2 as USB 3.2 gen2x2.

    Sowing confusion? I guess that's as universal as anything.

    • If there's one thing many of us learned over the years, it's that if something can do anything, it isn't good at any of them.

      • by Somervillain ( 4719341 ) on Thursday April 30, 2020 @08:52PM (#60009542)

        If there's one thing many of us learned over the years, it's that if something can do anything, it isn't good at any of them.

        What are you talking about? That sounds like an old man comment, but is so naive. Did you not use computers 23 years ago when you had to deal with Parallel Ports, Serial Ports, PS2, SCSI, etc? USB was one of the greatest things that happened to the personal computer, peripheral-wise. Now they're including the ability to handle displays as well. I personally like the notion of a small reliable, reversible cable connector handling everything I need from charging my phone/laptop to powering my display.

        • Re: (Score:3, Interesting)

          by OldMugwump ( 4760237 )
          Yes, I was (not just 23 years ago but more than 40). And while USB is nice, it still, after 26 years (USB 1 is from 1994), doesn't work reliably much of the time. Lots of cables are still incompatible with some devices; even many USB-C cables, which are supposed to work plugged in either way, don't.

          And still, much of the time, we have to go searching for the proper drivers for things.

          It's not all that much better than it was in the days of RS-232. (Altho we have MUCH more bandwidth than back then, and i
          • by SDedaluz ( 6818404 ) on Thursday April 30, 2020 @10:16PM (#60009672)

            It's not all that much better than it was in the days of RS-232.

            You and I remember RS-232 very differently. Is it 8-N-1 or 7-E-1? I have to convince this guy to set his bit rate on COM2 to 19,200 even though the box his modem came in says 14,400 bps in big letters - but I'd have him set it even higher (blazing 115,200) if he didn't have a sluggish 8250 UART. What command set does my modem use? Looks like my dialer doesn't support non-Hayes configuration so I have to download a profile using Zmodem (with my old modem - hope autostart will work today) from a local BBS. Failing that, I will have to decrypt the handy command set reference at the back of the manual. But wait, is the serial port sharing an IRQ level with another peripheral that will cause the connection to seize or the OS to lock up? Looks like I'll have to check the BIOS settings and the jumper settings on the add-in ISA cards that are already installed. Remind me again how little has changed?
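
For anyone who never lived through it, here is a minimal sketch (using the pyserial library; the port name and settings are purely illustrative) of the kind of configuration that had to be agreed on by hand at both ends, because nothing was negotiated for you:

```python
import serial  # pyserial

# Classic "8-N-1" at 19,200 bps: every parameter had to match what the modem
# and the far end expected, or you got garbage on the line.
port = serial.Serial(
    port="COM2",                    # or e.g. /dev/ttyS1 on Linux
    baudrate=19200,
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    timeout=1.0,
)
port.write(b"ATZ\r")                # Hayes-style reset, if the modem speaks Hayes
print(port.read(64))                # hope for "OK"
port.close()
```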

            • All the problems you describe can be fixed by RTFMing. USB didn't change that, Plug and Play/Pray drivers did.

              • How would plug and play have worked over any other external interface on a PC? How would you identify the device that you plugged in?

                Go ahead and detect the printer I plugged into the parallel port. Or my flight stick on the midi port, it dares you to auto detect its configuration. Oh, or the scanner with parallel port pass through, auto load drivers for that shit, please.

                USB is what enabled plug and play for external devices, and for common device classes like mice, keyboards, joysticks, gamepads, etc.
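
That identification step is easy to see today; a small sketch with the pyusb library (assumes libusb is installed) prints the descriptor fields the OS matches a driver against:

```python
import usb.core  # pyusb

# Every USB device reports vendor/product IDs and a class code at enumeration;
# that is what plug and play keys on to pick a driver.
for dev in usb.core.find(find_all=True):
    print(f"VID {dev.idVendor:04x}  PID {dev.idProduct:04x}  "
          f"device class 0x{dev.bDeviceClass:02x}")
```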

                • In fairness to sexconker, Plug 'n' Play worked over serial. Devices would respond to a pre-defined query with a vendor and device ID that were used by the OS to install the correct driver.
              • I worked in technical support and in Applications Engineering at one of the major modem manufacturers in the 1990s. I can assure you that Plug'n'Play did not address the RTFM issues. Still had hardware level problems like address/IRQ conflicts with PnP. Still had dumb customers trying to force invalid data rates. An interface that (theoretically) juggles 127 nodes without the need for additional assignment of hardware resources is better than an add-in card with DIP switches or jumpers. A device that automa
          • by slack_justyb ( 862874 ) on Thursday April 30, 2020 @10:43PM (#60009710)

            even many USB-C cables, which are supposed to work plugged in either way, don't.

            Stop buying shitty cables.

            And still, much of the time, we have to go searching for the proper drivers for things.

            Go get a real OS

            It's not all that much better than it was in the days of RS-232

            The hell it isn't. DIP switches and jumper pins. Not to mention having to setup the correct protocol between the devices.

            worked harder to make things Just Work

            Okay, but a lot of your rant comes off as you just being lazy as fuck.

            And imposed sufficient penalties on vendors who build almost-fully-compliant devices and foist them on the public, to make them Stop Doing That.

            Or stop buying shitty things. Cheap shitty things do that to you and no amount of "penalty" is going to stop people from buying shitty shit. So just stop doing that. Get yourself a real OS. Actually take care of your system like you give a damn and the bulk of your problems will go away.

          • by Anonymous Coward

            You're full of shit. USB works flawlessly MOST of the time. Any time it doesn't actually work is a surprise.

            • by grumbel ( 592662 )

              USB works well when you just want to connect a mouse or keyboard, but try to connect something more demanding, like say a VR headset, and all hell breaks loose. The original Oculus Rift was especially fun as it needed at least three USB3.0 ports all working at the same time and that rarely worked with most PCs. Once you add some powered USB Hubs, active extension cords or USB PCIe cards, you can generally get a working configuration going, but Plug&Play it was not, more like classic Plug&Pray.

          • by AmiMoJo ( 196126 )

            One of the great things about USB is how it eliminated the need for drivers in most cases. You can plug in a mouse or keyboard or gamepad and it just works, not like back when you needed a driver to support extra buttons or keys. Most video capture devices (webcam, analogue to digital converter) don't need a special driver because there is a USB standard for that. Same with audio devices, mass storage etc.

            Even special devices generally don't need a driver if the manufacturer is half way competent. For Win

            • by N1AK ( 864906 )
              I might be misremembering, but I'm not sure that it was USB that eliminated drivers, as I can remember having to install drivers for USB devices beyond keyboard/mouse being standard for a considerable time. Don't get me wrong, I'm strongly in the thank-god-for-USB-vs-how-things-were-25-years-ago camp, and I'm sat here using a laptop that is closed and connected to everything including power by USB; but the point isn't entirely invalid. It's true that USB now is considerably better than what USB replaced 20 odd ye
              • by AmiMoJo ( 196126 )

                You had to install drivers because older versions of Windows did not have support for standard USB devices built in.

                For example USB supports video capture devices like webcams in a standard way, but Windows didn't get support until XP SP2 and it wasn't complete until Windows 7.

                It's interesting you mention HDMI because that has some compatibility issues too. Reports of TVs and monitors blacking out momentarily were widespread in the early days as they lost sync with the source and had to renegotiate. DP too.

                • There are always alternatives. What an obtuse comment. USB sucks balls but it was a business decision to use USB. FireWire was vastly superior.
                  • by torkus ( 1133985 )

                    The more things change, the more they stay the same. The next incarnation of FireWire is Thunderbolt and it's largely suffering the same fate, generally due to high licensing costs. That's abated, but USB v17 will take over as usual even though it's not as good. But...what can you do?

                • by torkus ( 1133985 )

                  You're confusing USB plug n' play with universal drivers and/or on-demand download of drivers. Two very different things.

                  USB and Plug n Play wasn't really about drivers at all - it was about not having to reboot a system every time you added or removed a device (yes, I'm that old).

                  Plug n play only worked like magic if you had the device drivers already installed. You used to plug in a USB device, get a weird 'missing driver' type error...then install the software which would update the device to the new driver

                  • by AmiMoJo ( 196126 )

                    I'm not. I'm saying that because USB defined a standard for things like mass storage you didn't need a separate driver for every random CD-ROM, flash drive, zip drive, tape drive, floppy drive and card reader out there. There is one driver for all and it comes with the OS.
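
To illustrate, a small pyusb sketch (assumes libusb is available; the class numbers come from the standard USB class-code table) that maps attached devices to the generic class driver that covers them:

```python
import usb.core  # pyusb

# One OS-supplied class driver covers every compliant device in each class.
CLASS_NAMES = {
    0x01: "audio",
    0x03: "HID (keyboard/mouse/gamepad)",
    0x08: "mass storage",
    0x0E: "video (UVC webcam/capture)",
}

for dev in usb.core.find(find_all=True):
    for cfg in dev:                      # configurations
        for intf in cfg:                 # interfaces
            name = CLASS_NAMES.get(intf.bInterfaceClass)
            if name:
                print(f"{dev.idVendor:04x}:{dev.idProduct:04x} -> {name}")
```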

            • And yet every time I plug a new USB flash drive into the USB port the computer says loading driver for....
          • by thegarbz ( 1787294 ) on Friday May 01, 2020 @03:58AM (#60010110)

            after 26 years (USB 1 is from 1994), doesn't work reliably much of the time. Lots of cables are still incompatible with some devices, even many USB-C cables, which are supposed to work plugged in either way, don't.

            Do you buy your cables from Alibaba or something and sort the list by price from low to high? I'm not sure what you're talking about, I've never come across a USB-C cable that didn't do exactly what I wanted it to, be that pump 60W of power into a laptop, or transfer at Gen2 speeds, or whatnot.

            The USB spec is perfectly fine, you just need to buy products that actually follow it.

            It's not all that much better than it was in the days of RS-232.

            Your rose coloured glasses are so dense they are outright opaque. RS-232 was an absolute clusterfuck of completely incompatible cables with no indication and no negotiation of capabilities. By comparison if you plug in a USB cable that e.g. doesn't support 60W charging, it will still at least work, just slower.

            I wish the people who define USB standards (I mean you, Toby) worked harder to make things Just Work.

            They do. But what else do you wish? 100% compatibility with 100% of devices means that every little shitty device would then need to support every little shitty part of the standard. Are you prepared to pay extra for that USB socket on a Raspberry Pi to support 8K alternate mode, attempt to pump 60W into your battery pack, and support 40Gbps transfer speeds? Would you pay extra for that removable 2.5" HDD which spins at 4800RPM to require Thunderbolt 2 including daisychain support?

            Hell, an interview with the original designer of the USB port said quite clearly that the number one goal of USB was to keep implementation cost down; the original design had a reversible connector, but by adopting USB-A instead the connector cost halved.

            • I may be wearing somewhat rose-colored glasses; fair enough. And please don't get the idea I don't think USB is an improvement - it is in many ways. I'm just frustrated that it didn't fully do the job it promised to do.

              Also I'm looking at this more from a developer's viewpoint than a user's - with an RS-232 breakout box and (at most) an oscilloscope you could always figure out how to get an RS-232 device to talk to your application, after spending 20 minutes with those tools.

              With USB if the vendor didn't
            • Re " what else do you wish? 100% compatibility with 100% of devices means that ever little shitty device would then need to support every little shitty part of the standard", all I wish is that vendors implemented the fall back provisions of the standard properly, so devices declare accurately what parts of the standard they do and don't support, and that vendors market their wares clearly so buyers can tell what they're getting.

              Probably I jumped on USB-C too early and am still butthurt from that. At that
        • by Kjella ( 173770 )

          My guess is he's one of those who wants more ports on the go. By putting everything and the kitchen sink into one spec the trend is that you get one magic port. At work it's one USB-C cable and I got keyboard, mouse, headset, ethernet, power and dual displays all connected like a dock which is super slick. Fortunately we didn't go with a completely minimalist PC so it's not the only port but a lot of modern laptops need a breakout box to do more than one thing at a time.

        • Absolutely.

          HDMI and DVI carried the exact same video signal for a while, except DVI had the bandwidth to far outstrip HDMI's capabilities. HDMI did other things, things that are stupid and pointless (like network), and some things that made sense (like audio). But DVI did its job perfectly and better than what HDMI could do for years and years. And it was much cheaper, of course.

          Parallel ports were somewhat annoying, but not too much. SCSI was more capable but far more annoying. Serial was and is great

        • And your comment sounds like an ignorant moron who does not learn from history. Many have claimed to have the universal connector and all have failed.

          Given the history of USB one should be skeptical. USB is notorious for not living up to the bandwidth specs. Transfer a file to a USB flash drive sometime and you'll see what I mean.
      • It used to be that "Universal Standards" came from technically adept people working together. They would create a standard, and manufacturers would implement it.

        Then manufacturers reasoned that if they could stack these "technical" committees with shills, they could get their own -- sometimes proprietary, sometimes patent-encumbered -- standards approved. That was bad enough.

        Then we have the creation of nonexistent "standards bodies" to collect fees from everyone and bless the new "standard". USB. WiFi. Bluetooth.

      • If there's one thing many of us learned over the years, it's that if something can do anything, it isn't good at any of them.

        Not true. https://html5zombo.com/ [html5zombo.com]

    • I tend to agree. This is another thing we really don't need.

      • I tend to agree. This is another thing we really don't need.

        Dude...Might as well say "640K is more memory than anyone will ever need on a computer" while you're at it. If you're saying this, I am guessing you haven't used Thunderbolt 3. While not perfect, it's much nicer than DVI and much much more compact than HDMI and traditional DisplayPort cables. Even if you don't plan on buying an 8k monitor, the extra bandwidth allows you to daisy chain many small monitors. There is no such thing as too much bandwidth or too much speed. Give people more power and they'll

        • by AmiMoJo ( 196126 )

          I'm just hoping this means we start to see more 8k monitors. The only one you can get at the moment is that very expensive Dell. 4k isn't quite enough and everyone seems to have abandoned 5k.

          • And you don't need USB to have an 8k monitor. There's something called HDMI. And there was never a 5k spec. 4k and 8k and now 16k. A computer monitor can display any resolution the manufacturer and graphics card can support, but for TV content the specifications for video transmissions were clearly defined, and there was never a 5k spec.
        • Ok, well that's a fair argument, if a bit hyperbolic. To be more accurate though, what I really meant was that this is something we really don't need yet. Seeing as how the dust still hasn't even settled on USB-3 support everywhere, you know.

    • Sounds like Linux distros.

      • Re: (Score:2, Funny)

        by Anonymous Coward

        Still better than the idiotic sequence that Windows went by:

        1.x, 2.x, 3.x, NT 3.x, 95, 95 A, 95 B, 95 B USB, 95 C, NT 4.x, 98, 98 Second Edition, 2000, 2000 SPx, Millennium Edition, XP Home, Small Business Server, Server for Embedded Systems, Home Server, XP Professional, Server 2003, XP SPx, Server 2003 SPx, XP Professional x64 Edition, Vista Home Basic, Vista Home Premium, Vista Business, Vista Enterprise, Vista Ultimate, Server 2008, Vista Home Basic SPx, Vista Home Premium SPx, Vista Business SPx, Vista

    • by phantomfive ( 622387 ) on Thursday April 30, 2020 @08:51PM (#60009538) Journal
      It's not really a problem since it's backwards compatible. If you need really big bandwidth, then you'll use USB 4. If you don't, then you can still plug your USB 3 devices and cables into the USB 4 slot. We can also expect that there will be a USB 5 with even more bandwidth. That's the way the computer world is, it gets faster. If USB 4 is well-designed, then the upgrade to USB 5 will be seamless.
      • It's a problem if you're trying to make sense of all the stupid naming conventions of USB 3.

      • by fintux ( 798480 )

        So you buy a motherboard that says "6 x USB 3.2 ports". What does that mean? Gen 1? Gen 2? Gen 2x2? Is the speed 5 Gbps? 10 Gbps? 20 Gbps? Can be anything since "USB 3.2" is meaningless, but it's used in marketing. How about a motherboard with "USB 3.0"? That also doesn't mean anything anymore - it is now obsolete. It used to mean 5 Gbps, but later the correct term became "USB 3.1 Gen 1". But that also is now obsolete, and the correct term is "USB 3.2 Gen 1x1". How about a motherboard with "USB 3.1 Gen 2"? S
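
For reference, the renames described in this thread boil down to a handful of actual link speeds; a quick sketch of the mapping (spec names only):

```python
# Same physical link speeds, successive official names.
usb3_names = {
    "5 Gbps (1 lane)":   ["USB 3.0", "USB 3.1 Gen 1", "USB 3.2 Gen 1x1"],
    "10 Gbps (1 lane)":  ["USB 3.1 Gen 2", "USB 3.2 Gen 2x1"],
    "10 Gbps (2 lanes)": ["USB 3.2 Gen 1x2"],
    "20 Gbps (2 lanes)": ["USB 3.2 Gen 2x2"],
}
for speed, names in usb3_names.items():
    print(f"{speed:>18}: " + "  =  ".join(names))
```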

    • by johnjones ( 14274 ) on Thursday April 30, 2020 @09:05PM (#60009554) Homepage Journal

      the problems were simple: in 3.2 there were "optional" parts of the spec; in USB 4 these are not optional any more

      for USB 4 certification you have to support Power Delivery (PD) and the USB Type-C connector (you can still use old devices with a dongle)

      basically it is REQUIRED to have USB Type-C and power delivery to be USB 4

      • Full DP 2 alt mode is optional in USB 4.
        Everything they trot out from today onward will be optional in USB 4.
        Actual power delivery capabilities are optional in USB 4. USB PD may be required, but all that means is you have to talk the talk and negotiate. It doesn't mean you have to actually provide enough power for a decent laptop to charge and run at the same time.

        • Actual power delivery capabilities are optional in USB 4

          People aren't complaining about the actual capabilities of USB PD. They complain about the alternative clusterfuck of standards. If a device advertises that it can do 3A at 20V I expect it to output 60W when I plug something into its socket. Not just when I plug the correct iQ3 compatible device in, or the QC4 compatible device, or the Apple I'm special compatible device, or the Samsung Fast Charge (which thankfully they've dropped) device.
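
For context, these are the fixed-voltage levels USB PD negotiates and the power each works out to (a sketch based on the PD standard power rules; a source only has to offer the levels up to its advertised wattage):

```python
# USB Power Delivery fixed-voltage levels and the resulting power.
profiles = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 3.0), (20, 5.0)]
for volts, amps in profiles:
    print(f"{volts:>2} V @ {amps:.1f} A -> {volts * amps:5.1f} W")
```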

          • by tlhIngan ( 30335 )

            People aren't complaining about the actual capabilities of USB PD. They complain about the alternative clusterfuck of standards. If a device advertises that it can do 3A at 20V I expect it to output 60W when I plug something into its socket. Not just when I plug the correct iQ3 compatible device in, or the QC4 compatible device, or the Apple I'm special compatible device, or the Samsung Fast Charge (which thankfully they've dropped) device.

            That's because USB is only guaranteed to carry 5V @ 100mA. That's th

      • basically it REQUIRED to have USB Type-C and power delivery to be USB 4

        Which is irrelevant because if the USB standards body does what they've done in the past

        • The new "guaranteed to have USB type-C and PD" will be called USB 4.0 type C.
        • Data-only with a type-C connector will be called USB 4.0 type B.
        • The old USB-A connector will be called USB 4.0 type A.

        Resulting in more confusion and ambiguity when a computer lists "USB 4.0 connector" in its specs. They really need to remove anybody with a marketing

        • by AmiMoJo ( 196126 )

          All connectors except type C are deprecated in USB 4.

          The only different types of cable will be some that don't support higher levels of PD and the cable itself signals that to the port using a resistor (so very low cost).

          The guaranteed PD part only means that all the previous USB charging stuff is deprecated. All current USB-C charging is PD anyway, except for the base 750mA a device can request. Of course most devices won't support 100W.

          It really looks like they have finally sorted everything out. Best con

      • for USB 4 certification you have to support Power Delivery (PD) and USB Type-C connector (you can work old devices with a dongle)

        This is a double edged sword. Expect motherboards to cost $10-20 more.

        Mind you I'm happy to pay this, but people often forget the reason the USB standard is a mess is that it tries to be all things to all people. It's generally not a good idea to go shopping at Ikea on a motorbike, sometimes you need to use your pickup truck.

        • by AmiMoJo ( 196126 ) on Friday May 01, 2020 @06:59AM (#60010322) Homepage Journal

          Motherboards won't cost more unless you want very high power levels available. Support for PD doesn't mean that the device has to be capable of supplying 100W, it just means that it has to understand PD requests and tell whatever is connected "sorry I only support 5V/750mA" if that's the max.

          I expect many motherboards will have one or two charging ports that can support 12V at 1.5A or 3.0A (18W or 36W respectively) but not 5A or 20V for cost reasons.

    • USB 3 is a universal standard. The fact that these idiots keep renaming things while they keep upping the speed capabilities doesn't make it any less universal. ;-)

    • Technically USB 1 was supposed to be the universal standard. Hence why it is called USB (Universal Serial Bus).

      That said, the fact that I can plug a 20 year old USB 1 device into my computer with USB 3 and it still works shows how universal it is.
      We really cannot expect a device transmitting and expecting to get data at 40 Gbps to work on a 20 year old computer where even the RAM bus was only 1.5 GB/s.

    • by eepok ( 545733 )

      It's as universal as universal can be given the constantly changing landscape of technology. Science gonna science.

  • TB3 support is optional. There is zero requirement for anyone to actually support this feature. I think you'll see TB3 support in the first round of chips that go into workstations and business class laptops, but unlikely to see backwards compatibility on consumer class laptops, especially after a year or two. TB3 is neat but it'll have had only a ~3 year lifespan in the consumer space and won't likely be supported much after USB4 is released.

    • USB4 can't pass PCIe; TB3 can.

      • by Hadlock ( 143607 )

        So on the USB4 wiki page, where it explicitly says "PCIe Tunneling is based on the PCI Express (PCIe) specification" that is incorrect? I can't find any spec sheets that point to USB4 not having pcie tunneling. Thunderbolt 3 has pcie tunneling.
         
        https://en.wikipedia.org/wiki/USB4

        • by Z00L00K ( 682162 )

          Each transition between bus types will, however, add latency and some protocol overhead.

          So even if USB4 is usable for connecting a display it's still going to fall short for cases where you have multiple monitors. But it would probably be good enough for office use for most people.

  • Given that most USB support is on the chipset, this will end up being mostly a marketing ploy for bullet points on a laptop or similar mobile device that uses integrated graphics on the same chipset. "Hook your iShiny Pro laptop to an external 8K monitor and enjoy Ultra Resolution!" Of course, the performance will suck for anything other than a desktop UI with office apps, because throwing that many bits around that fast requires prodigious amounts of bandwidth and memory and the inevitable concurrent batte

  • USB4 in a Nutshell (Score:5, Informative)

    by nateman1352 ( 971364 ) on Friday May 01, 2020 @05:27AM (#60010234)

    Reading through the comments, there are a bunch of misunderstandings about USB4. Given the poor quality of the press coverage on it, I am not surprised. So let me clear things up.

    #1 - USB4 is almost identical to Thunderbolt 3
    USB4 is actually an "alternate mode" that gets negotiated via USB PD. The protocol and electrical signaling for USB4 is completely different and totally incompatible with USB3; USB4's protocol and signaling is an exact copy of Thunderbolt 3. USB4/Thunderbolt 3 follows the OSI model: it has a generic data transport physical layer and flow control protocol, and on top of that you can tunnel many different protocols. Just like how you can tunnel TCP/IP, IPX, AppleTalk, NetBIOS, etc. over Ethernet, you can tunnel PCIe, DisplayPort, and Ethernet data packets over the USB4/Thunderbolt 3 physical layer. USB4 adds one very important thing that the original Thunderbolt 3 lacked... in addition to PCIe/DisplayPort/Ethernet, USB4 can also tunnel USB data packets. Other than that, the only difference between USB4 and Thunderbolt 3 is that Thunderbolt 3 has a different Alternate Mode ID number. Backward compatibility with USB3 devices is achieved by not activating the USB4 alt mode. USB1/2 has dedicated pins on the USB-C connector, which gives backward compatibility all the way back to 1994.
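
A toy sketch of that layering idea (not the real USB4/Thunderbolt framing; the protocol IDs and header format below are invented purely for illustration): the transport just moves tagged frames, and DisplayPort, PCIe, or USB payloads are multiplexed on top, much like EtherType does for Ethernet.

```python
import struct

# Invented 6-byte header: 2-byte protocol tag + 4-byte payload length.
PROTO_IDS = {"usb3": 1, "displayport": 2, "pcie": 3}
PROTO_NAMES = {v: k for k, v in PROTO_IDS.items()}

def encapsulate(proto: str, payload: bytes) -> bytes:
    return struct.pack(">HI", PROTO_IDS[proto], len(payload)) + payload

def decapsulate(frame: bytes):
    proto_id, length = struct.unpack(">HI", frame[:6])
    return PROTO_NAMES[proto_id], frame[6:6 + length]

frame = encapsulate("displayport", b"\x00" * 16)
print(decapsulate(frame))  # ('displayport', 16 bytes of payload)
```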

    #2 - USB-C is a fundamental requirement
    Since USB4 is an alternate mode, and alternate modes do not exist for the older connectors (USB-A, Micro-USB, etc.) it is technically impossible to implement USB4 on top of USB-A.

    #3 - USB4 hubs are going to be crazy complex (aka spendy for the first couple years)
    The USB4 specification requires USB4 hubs to be backwards compatible with both USB3 AND Thunderbolt 3. Moreover, when running in USB4/Thunderbolt 3 mode, all hubs are required to support PCIe and USB traffic. That means that every hub needs to implement not only a USB hub, but a PCIe switch as well. The Thunderbolt 3 compatibility requirement also means that every hub needs to have an integrated XHCI controller: since Thunderbolt 3 can't tunnel USB packets natively, USB traffic has to be converted to PCIe traffic to go over the Thunderbolt bus.

    USB4 devices are not required to support Thunderbolt 3 backward compatibility, however. This is going to result in some pretty weird behavior. If you plug a USB4-only device directly into a Thunderbolt 3 USB-C port on a laptop... it won't work. But... if you plug a USB4 hub into that Thunderbolt 3 port, and then plug that USB4-only device into the hub... it will work!!! This will be mitigated somewhat by the fact that most USB4 devices will probably support backwards compatibility with USB 2/3, but still be prepared for lots of confusion from less technical users.

    #4 - PCIe Tunneling is a standard feature, and it's secure
    Since USB4 is the same as Thunderbolt 3, PCIe tunneling comes standard. And no, this is NOT a security issue. Microsoft implemented support for IOMMU assisted IO Virtualization in Windows 10 1803/RS4, so the "Thunderstrike" security issue has been fixed. Apple did a similar fix on macOS ~8 years ago. No idea what Google's plans are for implementing IOMMU support in Android/Chrome OS, so there is a chance that PCIe tunneling will be disabled in firmware on stuff running on Google OSes... hopefully not as that would just be one more compatibility headache for everyone.
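
On Linux you can check both halves of that story from sysfs; a hedged sketch (these are the usual kernel paths, but they may be absent depending on hardware and kernel version):

```python
from pathlib import Path

# DMA remapping: if the kernel exposes IOMMU groups, the IOMMU is active,
# which is what protects against rogue PCIe devices on a tunneled link.
groups_dir = Path("/sys/kernel/iommu_groups")
groups = list(groups_dir.glob("*")) if groups_dir.exists() else []
print(f"IOMMU groups: {len(groups)} ({'active' if groups else 'not found'})")

# Thunderbolt/USB4 domains report their device-authorization security level.
for sec in Path("/sys/bus/thunderbolt/devices").glob("domain*/security"):
    print(f"{sec.parent.name}: security level = {sec.read_text().strip()}")
```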

    • by xOneca ( 1271886 )
      Very informative, thank you!
    • No idea what Google's plans are for implementing IOMMU support in Android/Chrome OS, so there is a chance that PCIe tunneling will be disabled in firmware on stuff running on Google OSes... hopefully not as that would just be one more compatibility headache for everyone.

      Since Google OSes are Linux-based, I'm sure Google will do whatever Linux does. Or maybe contribute a solution to Linux if someone else hasn't done it.

    • First off, thank you for this very informative post.

      You mention "The USB4 specification requires USB4 hubs to be backwards compatible with both USB3 AND Thunderbolt 3."

      Are not all ports of a laptop essentially connected to an 'on-board hub'? Therefore even if a laptop has just one USB4 port (and maybe a few USB3 on another hub) that port essentially MUST double as a Thunderbolt 3 port (because that's what USB4 is really), right?

      What I'm getting at is that all current Thunderbolt 3 docking stations (or eGPU

      • First off, thank you for this very informative post.

        You are very welcome!

        You mention "The USB4 specification requires USB4 hubs to be backwards compatible with both USB3 AND Thunderbolt 3."

        Are not all ports of a laptop essentially connected to an 'on-board hub'? Therefore even if a laptop has just one USB4 port (and maybe a few USB3 on another hub) that port essentially MUST double as a Thunderbolt 3 port (because that's what USB4 is really), right?

        The spec explicitly differentiates between hubs and ports on computers. The ports on a computer are "root ports" and they are NOT required to be Thunderbolt 3 compatible. Remember, Thunderbolt 3 has a different Alternate Mode ID number, so even though the protocol is identical a USB4 root port might not be able to negotiate with a Thunderbolt 3 device just because it doesn't advertise the Thunderbolt alt mode (even though it could totally talk to it fine.) So the weird behavior that I

  • I hope this works without some proprietary "Bat Soup" in the CPU. Thunderbolt 1, 2 and 3 never worked on generic PCs or older PCIe slots because Intel said you must have "Bat Soup" inside. Thunderbolt 3 for a 10 gig adapter would be sweet on older hardware. A PCIe 2.0 x16 slot has plenty of bandwidth for USB4....

    I also wonder if we will get USB4 on AMD chipsets and ARM processors. I do not think Intel likes the sound of that.

  • ...and iPhone will still have lightning because Apple can't give up that sweet proprietary license money.
    • I also expect that people have a lot of lightning accessories, so switching over may be a bit of an issue.
      I can also see issues where people plug accessories meant for Android devices into the iPhone, which iOS doesn't support at this time.

      • I would argue that the fact that "people have a lot of lightning accessories" is actually an incentive for Apple to replace it. Time to sell new accessories to all those people!

    • If Apple changed their cables again, the haters would just be whining about that as well, same as when they whined when Apple changed from the Dock connector to Lightning. Even if their favored Fanfroid manufacturer changed their own cables many more times than that over the same span of time.

  • by jellomizer ( 103300 ) on Friday May 01, 2020 @09:07AM (#60010704)

    I hope we will stop with this resolution war.

    Going from standard def to 1080p was a big jump and improved quality.
    If you are either close to your screen (like with a PC/laptop) or have a large TV over 40", going to 4k does have improvements (but less so).
    Going to 8k and above is just burning bandwidth that could be used towards better features.
    Off the top of my head: 3d rendering, color management, improved audio, different languages. Or just save a load of power.

    • 8K is way better than 4K for monitors, if you are supposed to read text on them. Check your mobile phone to see why higher DPI is good for your eyes. I think even 12K on 40" monitors will make sense compared to 8K, regarding text rendering quality. 8K on 40" translates to the same DPI as FullHD on a 10" tablet, which is good, but not great.
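
The arithmetic behind that comparison (diagonal pixel count divided by diagonal size in inches):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

print(f'8K on a 40" monitor:     {ppi(7680, 4320, 40):5.0f} ppi')
print(f'Full HD on a 10" tablet: {ppi(1920, 1080, 10):5.0f} ppi')
print(f'4K on a 40" monitor:     {ppi(3840, 2160, 40):5.0f} ppi')
```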
    • The human eye's sensing elements are somewhat fewer than a 4K monitor has pixels; anything over that is utterly pointless for a scene that is intended to be viewed all at once.
