
New USB Specification Promises 100W of Power

Blacklaw writes "The group behind the USB 3.0 specification has announced a tweak which could lead to impressive new devices, including large-format displays, printers, and even laptops that are powered entirely from a USB port."
  • Finally (Score:5, Funny)

    by Nerdfest ( 867930 ) on Wednesday August 10, 2011 @08:56AM (#37042908)
    Awesome. I'll finally be able to implement those high powered "negative reinforcement" keyboards I keep dreaming about.
    • by dintech ( 998802 )

      Funny you should say that: I once got a small electric shock from a broken USB port. I suppose that since the voltage is only 5V it could never be serious, but the electricians here can tell us about that.

      • Re: (Score:2, Funny)

        by Anonymous Coward

        Funny you should say that: I once got a small electric shock from a broken USB port. I suppose that since the voltage is only 5V it could never be serious, but the electricians here can tell us about that.

        That problem will be solved by the new specification as well, since the voltage will go higher.

      • Re:Finally (Score:4, Informative)

        by Anonymous Coward on Wednesday August 10, 2011 @09:28AM (#37043238)

        If you got a shock from your USB port it most likely means you have a broken/disconnected ground lead on your power supply.

        • Mod parent up.

          I have had this problem as well and traced it to missing ground connections on my screen (poorly designed 2 pin switching power supply) so when the laptop charger is not in and my external screen is connected, any groundplane on the laptop gives me a shock. If you look between "ground" on your port and ground on your mains supply with a CRO, you will probably see a fairly big signal...

      • Re:Finally (Score:4, Informative)

        by tibit ( 1762298 ) on Wednesday August 10, 2011 @09:54AM (#37043530)

        It had nothing to do with 5V, nor with the port being broken. Most likely it was an issue with the electrical wiring: a lack of proper PE (Protective Earth, a.k.a. "ground"). With no PE connection, you were shunting the power supply's leakage current to ground through your body. Most PC power supplies have filtering capacitors between the case and the Live and Neutral conductors; absent a PE connection, those capacitors form a voltage divider that puts the case at 50% of the live voltage, which is the source of the leakage current.
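The capacitive-divider leakage described here can be sanity-checked in a few lines. A minimal sketch, assuming a typical 2.2 nF Y-capacitor pair and 230 V / 50 Hz mains (both values are illustrative assumptions, not from the comment):

```python
import math

# Assumed values for illustration: typical Y-capacitors and EU mains.
f_mains = 50.0   # Hz
v_mains = 230.0  # V RMS, line to neutral
c_y = 2.2e-9     # F, one Y-capacitor (live-to-case and neutral-to-case)

# With no protective earth, the two equal Y-caps form a divider,
# floating the case at roughly half the line voltage.
v_case = v_mains / 2

# Touching the case shunts current through one capacitor's reactance.
x_c = 1 / (2 * math.pi * f_mains * c_y)
i_leak_ma = (v_case / x_c) * 1000

print(f"Case floats at ~{v_case:.0f} V RMS")
print(f"Leakage current ~{i_leak_ma:.3f} mA")
```

A fraction of a milliamp explains the tingle: perceptible, but far below the roughly 30 mA level at which residual-current breakers trip.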

      • Re:Finally (Score:4, Insightful)

        by PopeRatzo ( 965947 ) * on Wednesday August 10, 2011 @09:59AM (#37043572) Journal

        Funny you should say that: I once got a small electric shock from a broken USB port

        That's nothing. You should see what it feels like when you dip a plugged-in USB cable into conductive gel and stick it about 4 inches up your ass.

        I mean. Not that I would do such a thing.

        Not 4 inches at least...

    • you could cover the keyboard in fire ants, crushed glass, or hot coals

      that would result in negative reinforcement

    • Re:Finally (Score:4, Funny)

      by hedleyroos ( 817147 ) on Wednesday August 10, 2011 @09:20AM (#37043170)

      I for one won't be happy until I can weld from my netbook.

  • And DisplayPort is better and puts the load on the video chips/GPU. Maybe use USB for power and not data.

    • by RenHoek ( 101570 )

      USB 3.0 should be less CPU intensive, because IIRC they switched from a polling protocol to an interrupt based protocol.

    • Who cares? If they get to the point that they can show HD video over USB 3.0 without sending all the CPU cores to 100%, then that's a win. I use a USB 2.0 / VGA adapter to increase my work notebook from 2 screens to 3. The USB one is usually just showing a datasheet PDF, a schematic, or some other static display. Fantastic increase in capability for $50. The USB adapters have their place, just like mobo-integrated graphics and $300 discrete cards have their place. The exciting thing is the possibili…
      • USB 2: 480 Mb/sec theoretical, real world half that

        USB 3: 5 Gb/sec theoretical, real world about 3.2 Gb/sec

        DVI: 4 Gb/sec single link; need dual-link for more than 1920x1200 resolution

        DisplayPort: 1.6 to 5.4 Gb/sec per lane, four lanes, for 17.3 Gb/sec max, real world is 80% of that (enough for four 1080p60 displays), plus a 1 Mb/sec auxiliary channel.

        Thunderbolt: 20 Gb/sec bi-directional, can carry the four lanes of DisplayPort data with room to spare.

        So you have less bandwidth than a single-link DVI and fa…
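The comparison above can be checked with simple arithmetic: an uncompressed 24-bit 1080p stream at 60 Hz needs about 3 Gb/s before blanking overhead, which already consumes most of the 3.2 Gb/s real-world USB 3 figure quoted. A rough sketch:

```python
def stream_gbps(width, height, bpp, fps):
    """Raw pixel bandwidth in Gb/s, ignoring blanking intervals."""
    return width * height * bpp * fps / 1e9

raw = stream_gbps(1920, 1080, 24, 60)
print(f"1080p60 raw: {raw:.2f} Gb/s")  # ~2.99 Gb/s

usb3_real_world = 3.2  # Gb/s, figure from the comment above
print("fits in USB 3?", raw < usb3_real_world)
```

One such stream barely fits, which is why display-over-USB generally relies on compression while DVI, DisplayPort, and Thunderbolt do not have to.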

    • This is stupid. There are explicit reasons not to want to put the load on the video chips/GPU; part of that is, of course, things that don't involve graphics. Really, are you that shortsighted? What the hell does a printer need a GPU for? Printers need a CPU to process the data, mostly.

  • by 6031769 ( 829845 ) on Wednesday August 10, 2011 @08:59AM (#37042940) Homepage Journal

    Netbook battery life drops to an average of 12 minutes.

    • by somersault ( 912633 ) on Wednesday August 10, 2011 @09:02AM (#37042968) Homepage Journal

      But the beauty of this is that you can power the netbook from its USB port, too!

      • And charge the netbook from itself!

    • So this will give Apple a good reason not to include any USB 3.0 ports...
  • Taking bets on how long before a USB powered vacuum cleaner comes out of Asia!
  • > laptops that are powered entirely from a USB port

    Finally I get to plug 2 laptops together with their USB ports. Free perpetual energy. Problem, science?
  • So where is the power coming from? An AC-to-USB power-only box?

    Why not just keep what they have now? It works in more places. Also, some kind of standard car power jack input would be nice for laptops.

  • by erroneus ( 253617 ) on Wednesday August 10, 2011 @09:02AM (#37042976) Homepage

    So now, I can hook my computer to my car to jump-start it!

    You know, long ago Apple made a display that was powered through the display cable. It worked, but it wasn't popular and they eventually stopped making it. So are they talking about bringing that back?

    I can see enough power for some devices, but 100W?

    Whatever USB standards come out, they should work equally well on a battery-powered laptop and on a desktop. People will get confused and frustrated when they buy a fancy new USB 3.0 display only to find they can't take it on the road because it doesn't work with their laptop and the tiny travel power adapter they use.

    • So now, I can hook my computer to my car to jump-start it!

      You can't jump a car without some serious amps; but virtually anything that can put out 12-14 volts for a while without keeling over dead will let you trickle-charge your car's own starter battery and then start normally. Doing the math on how long a trickle charge takes to shove enough amp-hours into an automotive lead-acid battery for it to start your engine (particularly on the bitter, freezing late evening in the sleet when this is inevitably occurring) will tell you that this isn't a method for the im…
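The trickle-charge math alluded to above, sketched with assumed round numbers (a starter battery needing 10 Ah restored and a source sustaining 1 A; both figures are illustrative):

```python
ah_needed = 10.0   # Ah to restore before the engine will crank (assumption)
trickle_a = 1.0    # A the small source can sustain (assumption)
eta_charge = 0.8   # rough charge efficiency for lead-acid

hours = ah_needed / (trickle_a * eta_charge)
print(f"~{hours:.1f} hours of trickle charging")  # ~12.5 hours
```

Half a day of charging in the sleet makes the point: workable as a last resort, useless as a routine jump-start.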

    • You're not seeing the big picture. Think of this scenario:

      You have a monitor, mouse, keyboard, and printer at work. All the peripherals are plugged into the monitor's USB hub. When you bring your laptop into work, you plug a single USB3 cable from the monitor into the laptop; this delivers 90W of power (for charging) and also hooks in all of the peripherals.

      THAT'S what they're shooting for.

  • by RenHoek ( 101570 ) on Wednesday August 10, 2011 @09:04AM (#37042994) Homepage

    I was hoping Power over Ethernet (PoE) was going to be successful, since it would mean a LOT fewer cables, but this seems like a good alternative. I just hope this becomes a standard, because PoE was nowhere to be found.

    • Re:PoE replacement (Score:4, Insightful)

      by vlm ( 69642 ) on Wednesday August 10, 2011 @09:11AM (#37043062)

      I just hope it becomes a standard because PoE was nowhere to be found.

      The markup for PoE switches was/is spectacular, because the marketing guys told them to price that feature at "just below the cost of hiring a union electrician to run a dedicated AC line next to the wall plug". Which, it turns out, is a heck of a lot of money.

      The marketing people forgot about extension cords. So, most of the real world uses extension cords instead. Whoops. PoE was a cool idea, but too sabotaged to ever make it.

      • by Amouth ( 879122 )

        Yep. Still tempted to build a charge box powered off PoE so I can charge my laptop over it in meetings.

        Never understood why no laptop manufacturer has done this; it seems like an obvious one to me.

        • Ahh, because we use wireless in meetings?
        • by ledow ( 319597 )

          Because PoE gives you about 25W (up to 50W if you don't care about specs, standards and safety) at, usually, 47V. Converting that down to 19V probably takes quite a bit of efficiency so you'll be lucky to get 10W.

          Which *ISN'T* enough to power most modern laptops even just to run, let alone run while charging the battery. So, yeah, you could probably charge a laptop from a PoE port, but it requires expensive switches, expensive efficient converters, and specialised circuitry for the niche case you specify (i.e. …

          • by vlm ( 69642 )

            Because PoE gives you about 25W (up to 50W if you don't care about specs, standards and safety) at, usually, 47V. Converting that down to 19V probably takes quite a bit of efficiency so you'll be lucky to get 10W.

            It would take a miracle to get less than 10W, because that's what you'd get out of a linear regulator, the least efficient form of commercially available regulator.

            Most AC switchers start by rectifying the incoming AC... the question is whether you can get one to start off only 47 volts DC. A working switcher would deliver nearly 25W out with 25W in. You can't dump 15 watts in those tiny little things; they'd literally melt.
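The 10W and 15W figures in this exchange follow directly from how a linear regulator works: it passes the same current in and out, so its efficiency is just the voltage ratio. A sketch using the thread's own numbers (the 85% switcher efficiency is an assumed typical value, not from any post):

```python
p_in = 25.0   # W available from the PoE port (figure from the thread)
v_in = 47.0   # V PoE supply
v_out = 19.0  # V laptop rail

# Linear regulator: same current in and out, excess voltage burned as heat.
i = p_in / v_in
p_out = i * v_out
p_heat = p_in - p_out
print(f"linear: {p_out:.1f} W out, {p_heat:.1f} W as heat")

# A switching (buck) converter, by contrast, is limited only by its efficiency.
eta = 0.85  # assumed typical efficiency for a decent buck converter
print(f"switcher at 85%: {p_in * eta:.1f} W out")
```

This reconciles the posts above and below: roughly 10 W out with 15 W of waste heat for a linear part, and 20 W or so from a decent switcher.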

          • Because PoE gives you about 25W (up to 50W if you don't care about specs, standards and safety) at, usually, 47V.

            And that is the latest "POE plus" stuff, the older POE stuff is quite a bit lower.

            at, usually, 47V. Converting that down to 19V probably takes quite a bit of efficiency so you'll be lucky to get 10W.

            Only if your converter is awful; 20W should be perfectly achievable if the converter doesn't suck.

    • Re:PoE replacement (Score:4, Informative)

      by fuzzyfuzzyfungus ( 1223518 ) on Wednesday August 10, 2011 @09:12AM (#37043078) Journal
      PoE, for whatever reason, is absolutely dead in the consumer space; but it is alive and kicking in corporate gear. Not quite 100%, of course, because a PoE switch necessarily costs more than a non-PoE one, and wasting PoE ports on desktops and docking stations doesn't make any sense; but some gigantic portion of the corporate world's APs, IP cameras, access control devices, and similar low-power-and-networked junk are PoE powered...
      • For me, EoP (Ethernet over powerline) works much better in my home environment. It would be a huge pain/cost to wire up most UK houses for Ethernet, and likewise WiFi is not feasible in many situations (either the walls are too thick for a signal to go more than a couple of rooms, meaning you're back to Ethernet to extend the network, or your signal is drenched in the 500 other signals competing for the same bit of spectrum). With EoP I just plug in a little box in each room I need it and wire/wireless off that. Cheap, incredibly easy…
      • The place I work bought PoE "injectors." They are rack mounted equipment that just feeds power to the device. No new switch needed. Just run the Ethernet patch out of the switch, into the PoE device then from there to the patch panel.

        It looked something like this: []

        But it wasn't "smart" so I can't imagine they paid anywhere near that price.

      • by smash ( 1351 )
        POE works in a corp environment if everyone has an IP phone with a piggyback port for their PC. Cisco gear does this; the price of cheap 2960 POE switches has come down sufficiently now that it is cheaper and WAY less hassle than running power boards and power bricks into the phones.
        • by jon3k ( 691256 )
          That's exactly what we do: PoE 2960s. Wall plate -> IP phone -> computer. Also used to power wireless APs.
    • My previous company provided managed/hosted communications services and we used PoE quite a bit. When deploying 100 or so phones it ends up being cheaper to put in a PoE switch than it is to buy power bricks at $5/pop. In the consumer space you only have, what, maybe 3 devices on a switch? At home I have an Ooma, LinkStation NAS, and my computer. PoE only provides enough power (~25W) for the Ooma (maybe), so as much as I'd like to get rid of three power cords it just isn't possible or cost-effective.


      • I would take it just for the first point there. Get rid of a bunch of power bricks and give me one standard where the device and brick can negotiate how much power to send. Even if I have to have 4 bricks for USB A/B/mini/micro it's far better than the current mess of not knowing which brick goes with which device.
    • by pz ( 113803 )

      PoE doesn't help much when you're connecting wirelessly to the net. That dramatically reduces the number of devices -- in terms of what consumers own and use -- for which PoE would be potentially useful.

    • by tibit ( 1762298 )

      PoE is nowhere to be found? WHAT? Even I'm designing instrumentation (transducers) that uses PoE, every IP phone in my workplace runs on PoE. A decent HP 2626-PWR switch with 24 PoE ports sells on eBay for $300 BIN. It's hardly expensive IMHO.

    • by jon3k ( 691256 )
      Well, there's the original 802.3af, and then they added PoE+ (802.3at), which allows some more power. But I think the problem is the in-wall cabling and its rating. Right now we can run Ethernet just about anywhere because it's just low-voltage DC. I always wondered how much more we could crank up the voltage on ol' cat5e before we started igniting walls and ceilings all over the world.

      Any experts want to chime in?
  • What else has driven technology so hard? Pun intended.
  • What about using the heat from the PC to heat rooms?

    Are we approaching a world where we can replace our electric outlets and our heating ducts with our PCs?

    When can I replace the sump pump and hot water tank in my basement with my PC?

    • What about using the heat from the PC to heat rooms?

      When can I replace the sump pump...

      Does your "sump pump" boil the water till it evaporates? Mine just pumps the water to the weeping tile outside.

  • by Chrisq ( 894406 ) on Wednesday August 10, 2011 @09:15AM (#37043120)
    Does this mean that USB3 cannot be implemented on tablets, netbooks and other low power portables?
    • USB negotiates power requirements. It could refuse to deliver such a high power output. Additionally, a device could accept power over USB from a powered hub or whatnot.

    • Moot issue. USB3 is going to be about as ubiquitous as Firewire800. USB2 hits all necessary expectations for its pricepoint, and Firewire400, eSATA, or good ole ethernet cover everything USB2 lacks. They'll get a slight bump from being new, then it's all yawns.
  • Good news in my book. Now they need to provide a USB mode for transferring data over cables hundreds of meters long, as the last remaining obstacle (Ethernet) toward unification.
  • by The Raven ( 30575 ) on Wednesday August 10, 2011 @09:21AM (#37043180) Homepage

    The current spec allows for about 4.5W (900mA at 5V). One of the last sentences in the article mentions 0.9 Watts.

    Now, I could totally understand this kind of mistake in the past. But don't these people understand the wonder that is Google? Before I made this post, I wanted to make sure that I wasn't the dufus, and typed 900mA * 5V into Google. It's not that hard to fact check, is it?
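The fact-check described above is a one-liner:

```python
i_a = 0.900  # 900 mA, the USB 3.0 per-port limit cited above
v = 5.0      # V
p = i_a * v
print(f"{p:.1f} W")  # 4.5 W; the article's 0.9 W figure drops the voltage entirely
```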

  • My laptop has a 65W power adapter.

    If the USB ports are rated for 100W, I would need a 365W power adapter(3 USB ports), and a battery capable of discharging at the higher rates.

    So, would only people who can easily carry 10+ kg with them all the time have laptops?

    • Power over USB is negotiated. If the supplier does not have the power, it does not grant the device's request. Simply put, this means you won't be able to use all your devices with your laptop without adding an external power supply to the device... There are other physics problems with their idea. (At 50V, 2A still seems like a lot of current to put through your small USB connector. At 100V your USB cable is now a safety hazard...)
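The negotiation described above might look something like this in sketch form; the function name and the 10 W budget are purely illustrative assumptions, not from the USB specification:

```python
def negotiate(requested_w: float, host_budget_w: float) -> float:
    """Hypothetical sketch: the host grants at most its available power budget."""
    return min(requested_w, host_budget_w)

host_budget = 10.0  # W a battery-powered laptop might offer (assumption)
granted = negotiate(100.0, host_budget)
print(f"display asked for 100 W, granted {granted:.0f} W")
# A compliant device must then run degraded or refuse to operate.
```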

  • 900mA at 5V, using the standard formula power (W) = current (I) × voltage (V): 900mA = 0.9A, and 0.9A × 5V = 4.5W.
  • Once again, USB starts out way behind the game. Last time it still overtook Firewire and buried it. But I have a feeling that this time, with Apple in a completely different market position, it's not going to be so simple to catch up with Thunderbolt.
    • Alas, I have yet to see a Thunderbolt flash drive or external HD. (Granted I'm not a Mac person)

      On the other hand, USB3 can handle *existing* USB2 and USB1 devices.

      If Thunderbolt could handle *existing* USB2 and USB1 devices, USB3 would be in trouble but USB patents would probably kill Thunderbolt. That's progress in the USA today.

    • And just like FireWire, Thunderbolt is much faster than its USB counterpart. Doesn't mean it's going to win.

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday August 10, 2011 @09:37AM (#37043340) Journal
    Even with USB2, there was the persistent problem that certain applications(notably 2.5 inch external drives) were right on the edge of what the spec allowed. Some machines played fast and loose, and everything worked fine, some played to spec, and the device wouldn't spin up, or the bus would freak out, or whatever. Despite USB's formalized, standardized, power-request mechanism(100ma on connect, negotiate in units of 100ma for up to 400 more...), the, er... 'inventive'... nature of the peripheral ecosystem always created some uncertainty: Some devices just requested 500ma at all times, to avoid possible brownouts, leading to more spec-compliant busses freaking out about lack of power even when actual draw was well within safe limits, some devices (fans, LED goosenecks, humping dogs) just grabbed the +5 and ground rails and hoped for the best, without any negotiations. Some hubs report themselves as self-powered(and thus good for 500ma per-port) even when they were bus powered(and thus only good for 400ma across however many ports they had). Some others were self-powered; but with wall-warts that could only deliver 500ma to a number of ports smaller than the number available(7-port hubs with 1amp adapters, I'm looking at you...)

    This new standard seems like it would simply be a polite codification of this confusion. Particularly at low voltage, 100watts is nontrivial current(and nontrivial power generally, most non-DTR laptop bricks are less than that...) Many PCB layouts would burn a trace trying to deliver that, and you can bet that your garden-variety 10-USB-ports boring desktop isn't going to ship with 1000watts of PSU headroom...

    This will mean that, in effect, devices will be able to demand up to 100 watts in a 'compliant' way; but the capabilities of USB ports on the market will vary enormously by device. A laptop with an 85 watt power brick is hardly going to be good for 100watts out of a port. Worse, it might be good for 50 when lightly loaded and fully charged; but only 5 when charging its battery and flogging its CPU... Having a device that only intermittently functions is near worthless, even if it is all entirely standard... A desktop might ship with the ability to push a single port to 100; but then it will either have to beef up its traces significantly, or have the always-confusing-to-dumb-users-and-people-fumbling-behind-desks '1 special blessed "high power" port, and 9 identical-feeling-but-low-power ones' configuration. Fan-fucking-tastic...

    While a bit more power on the bus certainly would expand the number of viable, bus-powered use cases, I'm just not sure that such a high 'standard' number can ever be usefully 'standard'. Hooray, it is now officially standard for specialized devices to shove 100watts across a USB bus. Doesn't change the fact that it won't work in 90+% of ports, and will probably burn a fair percentage of cheaper cables. Unless they come up with some sensible set of "tiers", so that people actually know what works with what, this seems like it is going to end in a mess of nominally-USB powered docking stations with wall warts and mini-B connectors, at best.

    While its comparative obscurity and the general lack of bus-powered devices made it less of an issue, Firewire flirted with this problem in its early days: both available power and available voltage on a given 6-pin port were widely variable. A desktop could, if it so chose, be pumping out 24 volts and reasonably credible wattage. One of the (almost exclusively Apple) laptops with a 6-pin port might be limited to a handful of watts at whatever voltage its battery was set to provide. In practice, much firewire gear just skipped bus power entirely (despite the fact that charging over firewire would have been a very popular consumer camcorder feature, if today's flip-likes are anything to go by); the mixture of widely variable power availability and the 4-pin (or just 'IEEE1394') connectors entirely without bus power pretty much doomed the widespread availability of bus-powered peripherals. USB's pitiful 2.5 watts was rather limiting; but at least you could reasonably assume that it would be there...
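The burned-trace worry in the comment above is just Ohm's law: at the existing 5 V bus, 100 W implies 20 A, far beyond any USB connector or cable, so a higher-voltage mode is unavoidable. A quick sketch (the 12 V and 20 V tiers are illustrative assumptions, not from the announcement):

```python
def current_amps(power_w: float, voltage_v: float) -> float:
    """Current needed to deliver a given power at a given voltage."""
    return power_w / voltage_v

for v in (5, 12, 20):
    print(f"100 W at {v:>2} V -> {current_amps(100, v):5.1f} A")
```

Only at a few tens of volts does the current fall to something a small connector and thin cable can plausibly carry.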
    • This was well reasoned, well informed and well written.

      This of course leads to a question:

      What the hell made you think this comment had any place on slashdot?
  • I'm looking forward to the USB-powered space heaters that office secretaries will put under their desks. They were forbidden from doing that before because it takes too much power from the wall plugs, but this comes from the COMPUTER so it must be okay!
    • Limiting space heaters generally has nothing to do with plug capacity and everything to do with brainless secretaries (and execs too, let's be honest) leaving them on for the weekend without a second thought. It's a fire hazard because of the nature of the device and how it's used, not because it draws too much power.
  • Finally I see a purpose for the 1.2-kilowatt PC power supplies.

    One hundred watts per port? That's insane! I could run a nicely-equipped Atom ITX MB, HD, DVD drive, and ION graphics adapter off such a port!

    I see USB hubs getting much more expensive if this standard gains traction...

    • by tibit ( 1762298 )

      While technically you could implement it as 100W per port, any affordable host will probably limit the overall USB power consumption to 150W or so, across all outputs. That lets you power a monitor, a hefty external hard drive, and a few small devices like mice, keyboards, flash drives. Power management is called just that for a reason, you know...

      This article wins at making people jump to silly conclusions. I'm bookmarking it and will reference it every time someone asks why engineering is hard. It's "hard…

  • I wonder if they are hoping to get these new super-power USB ports classified as Electric Vehicle charging stations, thus eligible for several thousand dollars in federal subsidies and grants []? Imagine charging your Chevy Volt from your laptop USB port!

  • I don't know how this would ever work on a laptop, or what it will mean for power supplies (they'd probably have to double in size to realistically use even a small percentage of your USB ports), but the ability to not plug printers and other high-power peripherals into the wall would be nice.

  • by hey ( 83763 )

    Hopefully they'd design several new "standard" plugs too.

  • I'm really looking forward to the required power supply that will be the approximate size and weight of a cinder block.
  • by AdamHaun ( 43173 ) on Wednesday August 10, 2011 @11:51AM (#37044860) Journal

    As several people have pointed out, 100W seems like too much. I bet this is just a specification tweak to provide headroom for devices that need more than 4.5W (like 8W or 10W or 15W). In other words, the spec is no longer an artificial limit on how much power you can provide.
