YouTube Goes 4K — and VP9 — At CES

sfcrazy writes "YouTube will demonstrate 4K videos at the upcoming CES. That's not even the best news; the best part of this story is that Google will do it using its own open-sourced VP9 technology. Google acquired the technology from On2 Technologies and open sourced it, then started offering the codec to vendors on a royalty-free basis to boost adoption. Google has also learned the hardware-partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports, LG (maker of the latest Nexus), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event. Google today announced that most leading hardware vendors will start supporting the royalty-free VP9 codec. These hardware vendors include major names like ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba."
  • by ackthpt ( 218170 ) on Friday January 03, 2014 @07:54PM (#45861927) Homepage Journal

    I don't quite have to have it, yet.

    • by Anonymous Coward on Friday January 03, 2014 @08:03PM (#45861977)

      4K for $3K, isn't that a great deal!? Less than a dollar per pixel!

      • 4K for $3K, isn't that a great deal!? Less than a dollar per pixel!

        4K is the horizontal resolution, not the number of pixels. Actually, it is 3840 pixels × 2160 for most "4K" TVs, or about 8.3 MegaPixels. Some models are much less than $3K. Here is one [amazon.com] for $500.

        • by tlhIngan ( 30335 ) <slashdot.worf@net> on Saturday January 04, 2014 @01:51AM (#45863657)

4K is the horizontal resolution, not the number of pixels. Actually, it is 3840 pixels × 2160 for most "4K" TVs, or about 8.3 MegaPixels. Some models are much less than $3K. Here is one for $500.

          4K Ultra HD is quad-1080P, i.e., 3840x2160.
          "4K" can also refer ot 4K Cinema, which is 4096x2160, where the 4K literally means 4Ki.

          Though, sometimes it's also confused with the plain old 4000x2160 format, or 4K.

          Of course, home electronics use 3840x2160 because it's just doubling 1080p in each dimension, making it easy to scale.

And until you have HDMI 2.0, it's really 3840x2160 @ 30fps over HDMI 1.4 (the same bandwidth as 1080p 3D @ 60fps, and twice what plain 1080p @ 60 needs).

          HDMI 2.0 is to support full 4K (Cinema) @ 60fps.
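
          A quick Python sketch of the raw pixel rates behind that comparison (blanking intervals and link-encoding overhead are ignored, so this is only a back-of-the-envelope check, not an HDMI bandwidth calculation):

            # Raw pixel throughput (pixels per second); ignores blanking and TMDS overhead.
            def pixel_rate(width, height, fps, views=1):
                return width * height * fps * views

            print(pixel_rate(3840, 2160, 30))            # UHD @ 30fps      -> 248,832,000
            print(pixel_rate(1920, 1080, 60, views=2))   # 1080p 3D @ 60fps -> 248,832,000 (same)
            print(pixel_rate(1920, 1080, 60))            # 1080p @ 60fps    -> 124,416,000 (half)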

      • by evilviper ( 135110 ) on Saturday January 04, 2014 @05:10AM (#45864055) Journal

        4K for $3K, isn't that a great deal!? Less than a dollar per pixel!

        I suppose you think an HDTV has just 1080 pixels as well?

        3840x2160 = 8294400 pixels

        8294400 / 3000 = 2,764.8 pixels per dollar

        I just wonder if they have the same dead-pixels policy as my first 800x600 LCD monitor way back when. Three out of 8 million isn't a bad ratio.

        • Well if we're gonna get technical, you just calculated pixels per dollar whereas he was calculating dollars per pixel. Switch your numbers around:
          3000 dollars / 8294400 pixels = 0.00036 dollars per pixel
          So technically he is still right. It is less than a dollar per pixel.
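
          Both figures in one small Python sketch, using the same $3,000 price and 3840x2160 panel as the example above:

            pixels = 3840 * 2160          # 8,294,400 pixels
            price = 3000.0                # dollars, as in the example above

            print(pixels / price)         # ~2764.8 pixels per dollar
            print(price / pixels)         # ~0.00036 dollars per pixel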

      • by fuzzyfuzzyfungus ( 1223518 ) on Friday January 03, 2014 @08:22PM (#45862119) Journal
        The Seiki SE50UY04 [cnet.com] shows up at less than a thousand pretty frequently.

The one major downside is that the cheapies almost certainly have neither DisplayPort nor HDMI 2.0. HDMI 1.4 will drive a 4k panel, but maxes out at something like 30Hz. Given that pre-canned 4k video is practically nonexistent (though it would be the material most likely to have been shot at under 30FPS originally, with plenty of detail in the original film if somebody feels like doing a good transfer), the only real use case is hooking it up to a computer, where the refresh rate will promptly unimpress you.

        It won't flicker or anything, this isn't the CRT days; but 30FPS is Not Good.
        • by aliquis ( 678370 )

If the film is filmed in 4K, wouldn't one really want to have that rather than 4K UHD?

          Seems a little weird to shrink the picture just a little bit.

          • Unless the DCI changes course, fast, the point will be largely moot. DCI 4K is higher resolution than 4k UHD; but 4k UHD hardware is available right now (with the cheap seats down to under $1000, and the 'walk into a Sony store and look rich and clueless' option still likely to set you back only ~$5k). DCI 4K gear is... niche. Not only is it almost entirely high end projectors designed for commercial movie theaters (and priced accordingly), the DCI writes their specs with all the paranoia, loathing, and pat
            • by Kjella ( 173770 )

The DCI spec won't change because it's a cinema standard to protect cinema content, but there's no reason for consumer-level gear to follow it. Whether you have a 16:9 (HDTV), 17:9 (DCI) or 21:9 (ultra wide) ratio screen, it should play consumer movies just fine. The only question is whether the delivery system will have a 17:9 version of the movie; it depends on how they've reframed it. If you could take the 3840x2160 stream and add a 256x2160 slice left/right you could have "dual format" discs that'd giv

        • but would be the material that might have been shot at under 30FPS originally

          Recent Peter Jackson films excepted.

        • by Jupix ( 916634 ) on Saturday January 04, 2014 @06:39AM (#45864233)

I've actually got a Sony X9005A as a desktop display for my PC and no, the 29Hz refresh rate does not make it "unimpressive". If you're looking to be impressed, the resolution will vastly overpower the refresh rate. When you have a window-like view into your games, photos, etc., you just instinctively ignore the slow refresh.

          The worst thing is probably the input lag introduced by the low refresh rate. The thing has one of the lowest input lag scores on the market, but the slow refresh still makes cursor input really laggy. It's not the kind of lag you see but the kind you feel. It's gone if you switch to 1080p, but you won't if you have a 4K panel, will you.

FWIW the Sony supports HDMI 2.0 and thus 4K@60fps, but good luck finding a GPU that outputs it. I'm stuck waiting for the eventual NV GTX 800 series, which probably will; NVIDIA haven't even confirmed it.

On the topic of YouTube, I thought they'd supported 4K since 2010 [blogspot.fi]. In fact, 4K vids on YouTube were some of the first material I tested my panel on. They stream fine over 24 Mbps ADSL2, but the bitrate is not great (the vids are noisy).

      • by Salgat ( 1098063 )
        That's not a monitor and doesn't even support 60fps at 4K resolutions. Try again.
    • Yeah... (Score:5, Funny)

      by rsilvergun ( 571051 ) on Friday January 03, 2014 @08:14PM (#45862057)
      but the cat videos look _amazing_ in 4k.
    • Supposedly Polaroid is to offer a 4k display under $1000. Still rich for me but for some...
    • Re: (Score:3, Interesting)

      by ArtForz ( 1239798 )

      Dell UP2414Q. $1299 for a 24" 3840x2160 60Hz IPS (also 10bpp and wide gamut, but... meh.).
      Finally a monitor for us high-DPI weirdos clinging to their preciousss IBM T221.

    • Try $500 [amazon.com] plus shipping for a 39". Of course it's only capable of 30 Hz at UltraHD resolution because it only has HDMI, but it's available. I can't imagine why they didn't include a DisplayPort connector, since DisplayPort is royalty-free, but they didn't. I'm hoping they'll release another model this year that includes DisplayPort.

Reviews say Seiki customer service is nothing great, and they have more than their fair share of DOA units, but it's a start.

      • Probably because DisplayPort isn't suitable for televisions. What it really needs is HDMI 2.0, which can support rates much higher than 30 Hz at 4k.

There are Seiki and other Chinese manufacturers with 4K TVs for $500-$1000. Not sure if the difference from those $3K monitors is big for programming work, which is what I care about.
    • by Kjella ( 173770 )

Unfortunately the current crop of monitors is aimed mainly at the photography professional, since being able to see 8MP of a still instead of 2MP is immediately useful to anyone with a modern camera. The downside - for those of us who want affordable monitors, at least - is that they then also put great emphasis on color accuracy and coverage; that $3k buys you a Dell UltraSharp 32 with 99% AdobeRGB coverage and factory calibration to a delta-E of 2. In short, they promise you plug it in and the colors are pretty much

    • It is just entering the consumer arena. Hence it is expensive. However since it is entering the consumer arena, we see companies like Youtube starting to support it.

      • by EdIII ( 1114411 )

        Doesn't matter about entering the consumer arena with display technology. I think this is very limited by bandwidth technology and realistic costs of deployment in the US. It's going to be amazeballs, but in South Korea, Japan, and several EU countries.

        YouTube said it was demonstrating streaming of 4k. I would assume that to be true since YouTube is a website...

Doesn't that put the required bandwidth for streaming at 20-30 Mbps in ideal networking conditions with moderate compression? That's from a single

Speak for yourself. (100 Mbit down, 65 Mbit up). I'll download the video and stream it to you while I watch it.

          Yay FIOS.
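
          For scale, a minimal Python sketch of what a 4K stream in the 20-30 Mbps range mentioned above would mean against a monthly data cap. The 25 Mbps stream and the 300 GB cap are illustrative assumptions, not YouTube or ISP figures:

            # Rough data-usage math for a hypothetical 25 Mbps 4K stream (illustrative only).
            BITRATE_MBPS = 25    # assumed mid-point of the 20-30 Mbps estimate above
            CAP_GB = 300         # assumed monthly data cap; varies by ISP

            gb_per_hour = BITRATE_MBPS * 1e6 * 3600 / 8 / 1e9   # bits/s -> GB per hour
            hours_to_cap = CAP_GB / gb_per_hour

            print(f"{gb_per_hour:.1f} GB per hour")            # ~11.2 GB/h
            print(f"{hours_to_cap:.0f} hours to hit the cap")  # ~27 hours per month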

  • 4K video (Score:5, Informative)

    by BringsApples ( 3418089 ) on Friday January 03, 2014 @07:58PM (#45861951)
    I had to look it up, so here ya go...

    4K resolution is a generic term for display devices or content having horizontal resolution on the order of 4,000 pixels. Several 4K resolutions exist in the fields of digital television and digital cinematography

    • Woops, forgot to provide the link to it [wikipedia.org] on wikipedia.
    • by ackthpt ( 218170 )

      I had to look it up, so here ya go...

      4K resolution is a generic term for display devices or content having horizontal resolution on the order of 4,000 pixels. Several 4K resolutions exist in the fields of digital television and digital cinematography

      And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity.

Well, I'd like the vertical resolution just for photo editing; it would be nice to see the full resolution without artifacts from scaling algorithms.

      • And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity

        ...in stills.

        It's still going to come down to bandwidth. VP9 might be a revolution in codecs, but it won't deliver 4k to me except for very special events until we've all got FIOS.

        • Yeah, but it will deliver 1080p and 720p video to you with lower bandwidth requirements. Less buffering and fewer artifacts (because of lowered data requirements and a corresponding lower rate of dropped packets).

        • by jedidiah ( 1196 )

          >> And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity
          >
          > ...in stills.

          Nope. Video shows you all of Hollywood's ugly skin conditions in all of their frightening glory. Although you don't even need 4K for that. Just regular 1080p will give you that effect.

          Sometimes, you really don't want to see everything.

          • by EdIII ( 1114411 )

            Sometimes, you really don't want to see everything.

            And yet... sometimes, in those special moments, you really really do want to see everything :)

            I'm right, right?

      • Re: (Score:3, Insightful)

        by Kjella ( 173770 )

        And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity.

You're like an echo from 15 years ago, when 1920x1080 was to replace NTSC 640x480; both HD porn and HD sports look great despite the naysaying. Movies and TV too: if the costumes, props, backdrops or special effects no longer looked real, they simply had to improve until they did. Why should UHD be any different? It might be that many people meet it with a yawn, like Super Audio CD vs CD, where for the vast majority a regular CD was more than good enough already, but the "too much detail" should be thoroughly debunk

      • by EdIII ( 1114411 )

see the full resolution without artifacts from scaling algorithms

        Laserdisc baby :)

I really hate that waterfall effect, amongst all the others. It's almost a deal breaker for me.

        That's why the top of the line Laserdisc player and some titles just look awesome, even by today's standards. At least, IMHO.

    • 3840x2160 for "video" (four full-HD screens per display)
      4096x2160 for "movies"

      At $3K for a 4k screen, that's indeed cheap per pixel, but it's time for the chicken-and-egg of prices/adoption vs available content.

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        That's a complete waste of money. You'd need a gigantic screen and to be sitting so close that you can't see all of it in order to need 4k. 4k is for movie theaters.

I'm surprised that the idiots on slashdot aren't aware that the pixel density the eye can perceive is the limiting factor here. Even with the current crop of HDTVs being 1920x1080, you need a rather massive TV and to be sitting quite close in order for the pixels to be a problem.

        4K really only makes sense for monitors and large screens like a

        • I've spent the last 7 years with a 37" 1080p screen as a primary monitor. I sit about 4 feet from it, can't see pixels, but almost (I see them at 3').
Games look absolutely awesome.
          Since it's showing signs of aging, if I can get a cheap 60" 4k screen in 2-3 years, I'll sit 5-6 feet from it, except when I have to display three datasheets side-by-side.
          Games will look awesome.
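
          A minimal Python sketch of the visual-acuity rule of thumb behind this argument: assuming 20/20 vision resolves roughly one arcminute, this estimates the distance beyond which individual pixels blend together for a few 16:9 screens. The screen sizes are the ones mentioned in this thread; real perception varies by person and by content, so treat the numbers as rough:

            import math

            ARCMIN = math.radians(1 / 60)  # ~one arcminute, a common 20/20 acuity rule of thumb

            def pixel_blend_distance_ft(diagonal_in, h_pixels):
                """Distance (feet) beyond which one pixel subtends less than one arcminute."""
                width_in = diagonal_in * 16 / math.hypot(16, 9)   # width of a 16:9 panel
                pitch_in = width_in / h_pixels                    # pixel pitch
                return pitch_in / math.tan(ARCMIN) / 12

            print(pixel_blend_distance_ft(37, 1920))   # 37" 1080p -> ~4.8 ft
            print(pixel_blend_distance_ft(50, 1920))   # 50" 1080p -> ~6.5 ft
            print(pixel_blend_distance_ft(60, 3840))   # 60" UHD   -> ~3.9 ft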

  • by fuzzyfuzzyfungus ( 1223518 ) on Friday January 03, 2014 @08:12PM (#45862041) Journal
    Is there any word on how this '4K' actually looks at bitrates Youtube can push enough ads to pay for, and your ISP will permit?

    I have the greatest respect for people who actually handle the challenges of paying the computational costs of video compression and decompression (and scaling if necessary) as efficiently as possible; but once their work is done, a nominal resolution (even an actual X pixels by Y pixels value, not some marketing bullshit) is nearly meaningless unless you are in the (rare for video, probably less rare for audio) situation of having such a high bitrate that the quality of your reproduction is being constrained by your resolution.

Barring an increase in bitrate, will it even be possible to distinguish between X Mb/s nominally-1080 video scaled up to 4k and X Mb/s nominally-4k video?
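
    One way to make that question concrete is to look at bits per pixel at a fixed bitrate; a minimal Python sketch, where the 8 Mb/s figure is an illustrative assumption rather than a published YouTube number:

      # Bits available per pixel per frame at a fixed bitrate (illustrative numbers only).
      def bits_per_pixel(bitrate_bps, width, height, fps):
          return bitrate_bps / (width * height * fps)

      BITRATE = 8e6  # assumed 8 Mb/s stream, held constant for both resolutions

      print(bits_per_pixel(BITRATE, 1920, 1080, 30))  # ~0.129 bpp at 1080p
      print(bits_per_pixel(BITRATE, 3840, 2160, 30))  # ~0.032 bpp at 4K (4x less per pixel)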
    • Re: (Score:2, Funny)

      by alen ( 225700 )

you can see the individual brain cells of the Russian drivers splatter on the camera as they get hit by trucks on the highway

    • by ackthpt ( 218170 )

      I have a feeling AT&T, Comcast, et al, are working feverishly to figure if they can make money with existing bandwith caps on this.

      • That's what killing network neutrality is about.
        "The only way to get these 4k images are if you download from my pay-per-view"

  • by QA ( 146189 ) on Friday January 03, 2014 @08:14PM (#45862055)

That’s not even the best news; the best part of this story is that Google will do it using its own open-sourced VP9 technology. Google acquired the technology from On2 Technologies and open sourced it. Google started offering the codec to vendors on a royalty-free basis to boost adoption.

    Google has also learned the hardware partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports LG (the latest Nexus maker), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event.

    VP9 keeps FSF happy, users happy, content providers happy, carriers/ISPs happy and hardware vendors happy.

    Google today announced that most leading hardware vendors will start supporting the royalty-free VP9 codecs. These hardware vendors include major names like ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba.

VP9 is beneficial for everyone as it makes the codec available to vendors free of cost – thus boosting its adoption compared to the non-free H.264/H.265. At the same time, being an open standard and open source, it also ensures that users won’t require proprietary (and insecure) technologies like Flash to view content. The third benefit of VP9 is that it can deliver high resolutions at low bit rates, thus using less bandwidth to watch content. It means that those on slower connections will not have to wait for buffering or settle for low-resolution videos, and those on faster connections will benefit as they won’t have to waste their expensive bandwidth on videos.

    • by Overzeetop ( 214511 ) on Friday January 03, 2014 @10:24PM (#45862793) Journal

      And not a single Apple device will play VP9. Every Apple device will require transcoding, or using whatever format they find optimizes their [battery life|thermal envelope|PROFIT], which will nudge every well heeled, non-technical user to gravitate away from VP9.

      • by Jah-Wren Ryel ( 80510 ) on Friday January 03, 2014 @11:52PM (#45863127)

        And not a single Apple device will play VP9. Every Apple device will require transcoding, or using whatever format they find optimizes their [battery life|thermal envelope|PROFIT], which will nudge every well heeled, non-technical user to gravitate away from VP9.

        Jobs is gone. Android marketshare is up. Apple may not be as wedded to h265 as they were to h264. Things change.

        • True. Jony has destroyed much of the UI Jobs toiled over; maybe he will sell out and join this open standard. My money is on the no side of this though. I see Apple devolving into all the bad things with none of the elegance now that Jobs is feeding the worms.

      • by ne0n ( 884282 )
        That's OK, Macs don't come in 4k anyway. When they do I suspect they'll be interested in supporting Youtube.
        • by dbraden ( 214956 )

          The new Mac Pro supports 4K, it can drive three 4K displays simultaneously. They'll probably release their own 4K monitor(s) later this year.

      • And not a single Apple device will play VP9.

And your evidence to support this assertion? So what if Apple is not among the chipmakers supporting VP9 out of the gate? Apple isn't in the business of making chips; it buys them and puts them in consumer electronics.

      • And not a single Apple device will play VP9.

        That's utterly untrue. You might mean that Apple devices won't include VP9 support out-of-the-box (unlike Android), but that's quite different, and won't necessarily hamper adoption. You might as well say: "And not a single Apple device will have Google Maps."

        All that's needed is for a popular iOS multimedia app to include VP9, or perhaps even for someone to simply implement a VP9 decoder in javascript:

        http://libwebpjs.hohenlimburg.org/vp8/webm-javascript-decode [hohenlimburg.org]

    • by Kjella ( 173770 )

Don't get your hopes up too high, because I've heard the same all the way back to the VP3-based Theora and every version since. It's the escape hatch/emergency brake to keep H.264/HEVC from abusing their near monopoly, but in reality nobody seems to want to abandon them, not even Google. Kind of like how VC-1 is in the Blu-ray standard and in every player, but 99% of recent discs use H.264. Even Firefox finally bit the bullet in October this year and said they'd use Cisco's H.264 blob on platforms that don't ha

      • The difference with VP9 is that there isn't already an entrenched standard that's better. When work on Theora started, MPEG-1 was still pretty common for web video, but by the time it was released everyone had moved to MPEG-4. Theora was definitely a big step up from MPEG-1 (and MPEG-2), but not as good as MPEG-4. When VP8 was open sourced, it was better than MPEG-4 (ASP), but most of the rest of the world had moved on to H.264. Now VP9 and H.265 are appearing at the same time. No one is considering sw
        • by fatphil ( 181876 )
Fraunhofer says that VP9 has an 8.4% worse bitrate (at the same PSNR) than H.264/MPEG-AVC, and has encoding rates that are 100x slower. See page 3 here:
          http://iphome.hhi.de/marpe/download/Performance_HEVC_VP9_X264_PCS_2013_preprint.pdf

I see no incentive to move in the direction of VP9. It's Google very persuasively shoving their proprietary format on everyone, that's all. We criticised Microsoft for doing that in the past; we shouldn't pretend that Google is anything apart from an enormous multinational that wants
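
          For reference, the PSNR the cited comparison holds constant is a simple objective quality metric; a minimal NumPy sketch of how it is computed for 8-bit frames (the random "frames" here are just placeholder data for the usage example):

            import numpy as np

            def psnr(reference, decoded, max_value=255.0):
                """Peak signal-to-noise ratio in dB between two 8-bit frames."""
                mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
                if mse == 0:
                    return float("inf")  # identical frames
                return 10 * np.log10(max_value ** 2 / mse)

            # Toy usage; real codec comparisons average PSNR over whole test sequences.
            ref = np.random.randint(0, 256, (2160, 3840), dtype=np.uint8)
            dec = np.clip(ref.astype(int) + np.random.randint(-2, 3, ref.shape), 0, 255).astype(np.uint8)
            print(psnr(ref, dec))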
While YouTube's preference is VP9, [YouTube's Francisco] Varela left open the possibility that the site might use HEVC in the future. "We are not announcing that we will not support HEVC," said Varela, adding that YouTube supports 17 different codecs currently.

      According to YouTube, the first partner TVs and other devices that incorporate VP9 will start hitting the market in 2015. In 2014, YouTube will start transcoding HD video into VP9.

      YouTube's Ultra HD Strategy Could Spark Battle Over 4K Video-Delivery Tech [variety.com]

      I am not convinced that the transcode to YouTube will be enough to derail HEVC.

      On May 9, 2013, NHK and Mitsubishi Electric announced that they had jointly developed the first HEVC encoder for 8K Ultra HD TV, which is also called Super Hi-Vision (SHV). The HEVC encoder supports the Main 10 profile at Level 6.1 allowing it to encode 10-bit video with a resolution of 7680x4320 at 60 fps.

      On October 16, 2013, the OpenHEVC decoder was added to FFmpeg.

      On October 29, 2013, Elemental Technologies announced support for real-time 4K HEVC video processing. Elemental provided live video streaming of the 2013 Osaka Marathon on October 27, 2013, in a workflow designed by K-Opticom, a telecommunications operator in Japan. Live coverage of the race in 4K HEVC was available to viewers at the International Exhibition Center in Osaka. This transmission of 4K HEVC video in real-time was an industry-first.

      On November 14, 2013, DivX developers released information on HEVC decoding performance using an Intel i7 CPU at 3.5 GHz which had 4 cores and 8 threads. The DivX 10.1 Beta decoder was capable of 210.9 fps at 720p, 101.5 fps at 1080p, and 29.6 fps at 4K.

      High Efficiency Video Coding [wikipedia.org]

      An inbuilt HEVC decoder is not entirely new of course, as LG's LA970 series of UHDTVs released last year also offered the same feature. However, the company's latest 4K Ultra HD TVs due to be unveiled at CES 2014 will use a ViXS XCode 6400 SoC (system on chip) that can decode HEVC-based content at 3840x2160 resolution with support for 60p frame rate and 10-bit colour depth, a world's first.

      LG's 2014 4K TV Models Gets HDMI 2.0 & 10-Bit HEVC/H.265 Decoder [hdtvtest.co.uk] [Jan 3]
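
      As a side note, the DivX decoder figures quoted above work out to a roughly constant pixel throughput, which is a handy sanity check on decode benchmarks; a small Python sketch using only those published numbers:

        # Decoded pixels per second implied by the DivX 10.1 Beta figures quoted above.
        benchmarks = [
            (1280, 720, 210.9),   # 720p
            (1920, 1080, 101.5),  # 1080p
            (3840, 2160, 29.6),   # 4K
        ]

        for width, height, fps in benchmarks:
            print(f"{width}x{height}: {width * height * fps / 1e6:.0f} Mpixels/s")
        # All three land in the same ~190-250 Mpixel/s range, so decode cost on that
        # CPU scales roughly with pixel count.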

    • by iluvcapra ( 782887 ) on Saturday January 04, 2014 @01:05AM (#45863463)

      Google has also learned the hardware partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports LG (the latest Nexus maker), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event.

I work in film post-production in Hollywood and I'm not sure we've had any consultations on VP9; MPEG always gets SMPTE and the ASC involved in screenings and quality shootouts. Of course Google might just be trying to buffalo filmmakers, which would be nothing new, I suppose. "Content providers," as a term, rarely describes the people working the camera or doing the color (let alone syncing the sound). If you're a professional, the licensing of the codec is completely irrelevant; it's a poor economy if the quality is even remotely compromised.

Panasonic and Sony were demonstrating Google TV STBs a few years ago and we all know how that turned out. It's basically no-cost for these shops to turn out this gear for whatever marketing event Google cares to throw. What you want to hear is Sony Consumer Electronics saying they wouldn't support the next MPEG standard, or Sony Pictures Entertainment announcing they'd standardize their delivery format on VP9. SPE is one of my employers, and the codecs that, say, Crackle.com uses are decided by a group of people completely independent of the consumer electronics folks; Crackle will support whatever codec is optimal on the target STB/mobile/desktop platform.

      Why would a provider want to go single-track with a codec which is "Open" in the way Android is, which is to say, you can download the source code, but the reference implementation that's distributed to millions of clients is controlled by a single vendor?

  • by alen ( 225700 ) on Friday January 03, 2014 @08:25PM (#45862137)

    well worth the $3000 TV and the $100 a month internet bill to go with it

    • the $100 a month internet bill to go with it

      You obviously don't live in the U.S. where decent speed costs a lot more than that.

    • Yeah I hear a lot of 4K hype but it's all just noise to me because NOBODDY GON' BUY DAT SHIT when it costs $1000+

      Plus the TV industry is far away from upgrading their equipment to produce the content.

      Also, Internet pricing and availability sucks in America, yes.

      • by alen ( 225700 )

        not really, most movies from the last few years were shot with 4K cameras. just like most movies starting in the late 90's were shot on HD cameras in anticipation of blu ray and HD adoption.

        the studios need to get the 4K masters and put them on a new format for 4K

        • not really, most movies from the last few years were shot with 4K cameras.

But just about all television is shot 1080p, and the TV stuff is where the money is in terms of DVDs and streaming. That's yer Walking Dead, BSG, Mad Men, Orange is the New Black, all in straight HD.

          the studios need to get the 4K masters and put them on a new format for 4K

          Most films are shot in 4K now, which is to say their "originals" are in 4K, not their "masters." Many, many films have a 2K workflow and distribution pipeline, eve

  • by rudy_wayne ( 414635 ) on Friday January 03, 2014 @08:52PM (#45862343)

When Google bought YouTube they converted all the videos to H.264 and made that the standard. Now all of a sudden H.264 is evil.

    • "Now all of a sudden h.264 is evil."

      hyperbole much?

H.264 can't do 4K like VP9, which is royalty-free and open source, where H.264 is not. And it will be widely supported by all the manufacturers that matter.

      I fail to see how this is a bad thing.

      What exactly is the problem here?

  • With ISPs throttling them, 4K videos will probably play about ten frames in between each thirty seconds of buffering.
  • In other news... (Score:5, Informative)

    by westlake ( 615356 ) on Friday January 03, 2014 @09:12PM (#45862457)

    Related Posts

    Is LG Ditching Google TV? Working On WebOS TV?
    Goodbye Patent Evil H.264; YouTube Switches To WebM
    Opera Welcomes Google's Move To Drop H.264 Support
    Microsoft Backs H.264, I Back Betamax

    YouTube goes 4K at CES, brings royalty free VP9 to fore front [muktware.com]

    There are some very big players moving in HEVC.

    Netflix has tossed their hat in the 4K ring with the announcement of 4K streaming starting next month.

The jump from streaming 1920x1080 to 3840x2160 is not something that can be done by just flipping a switch. First of all, viewers need a 4K TV, which practically no one has yet. PCMag's Chloe Albanesius has informed us that Netflix's 4K content will require "somewhere between 12 and 15 Mbps" to stream properly. That's a pretty serious connection which, again, not many have.

    By using H.265 HEVC (High Efficiency Video Coding) moving forward instead of the currently popular AVC H.264, Netflix thinks they will be able to stream the same quality they currently transmit at half the bitrate. Not only does this mean there's room for higher quality 4K streams, but the current HD content will be transmitted more efficiently.

It's unclear whether we'll see 4K streaming available in standalone set-top boxes any time soon, or whether it will require new hardware in order to handle the increased resolution in the future, but for now it looks like the TV itself is the home for 4K streaming.

    Netflix is bringing 4K streaming to TVs with H.265 and House of Cards [geek.com] [Dec 19]

  • http://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP [wikipedia.org]
YouTube has split up all video/audio over 720p into separate streams, which makes downloading much more difficult.
    Some downloaders use ffmpeg to mux the streams back together, but other than that, you're SOL for downloading anything better than 720p MP4.
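
    For completeness, the mux step those downloaders perform is just a stream copy; a minimal Python sketch calling ffmpeg, where the file names are hypothetical and ffmpeg is assumed to be on your PATH:

      import subprocess

      # Combine a separately-downloaded DASH video stream and audio stream without
      # re-encoding: "-c copy" only remuxes the existing streams into one container.
      subprocess.run(
          ["ffmpeg", "-i", "video_1080p.mp4", "-i", "audio.m4a",
           "-c", "copy", "combined.mp4"],
          check=True,
      )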

  • by Trogre ( 513942 )

    Good to see further progress in YouTube video quality. Any word on when they're going to break the 30 frame per second limit and allow HFR content?

  • by wonkey_monkey ( 2592601 ) on Friday January 03, 2014 @09:17PM (#45862495) Homepage

    So now we'll get idiots uploading cellphone footage of clips from Family Guy (dubbed into Spanish) scaled up to 4K instead of 1080p...

    • Also slap "3D audio" in the description as it was a mono recording duplicated to stereo (with heavily clipping audio).
  • Comment removed based on user account deletion
    • I'd be happy if YouTube denied posting anything less than 720p.

      Consoles prior to the Dreamcast output 240p, except for a few PS1 and Nintendo 64 games that output 480i. The Dreamcast, PS2, GameCube, original Xbox, and original Wii output 480i (or 480p if you were lucky). How would requiring 720p improve reviews of games for those platforms?

  • Google can't even serve Youtube using codecs currently supported by its own browser. There is a mode to switch to HTML5 video, but some of the videos require Flash anyway. Besides Google does not indemnify the potential users of VP9 against potential patent infringement, so it's only slightly better than just grabbing an h265 codec and linking against it without first obtaining a license. It's a patent mine field of epic proportions, and if you're going to pay for patents, you might as well get the real dea

    • by Lennie ( 16154 ) on Saturday January 04, 2014 @08:15AM (#45864415)

      Yes, when you enable the HTML5-player some videos are still using Flash.

      If you look closely, you might have noticed that videos with 'annotations' all load in Flash, those without annotations load in HTML5.

While I have seen videos on YouTube that had annotations in the HTML5 player (they clearly do some A/B testing at times), I would call that not yet ready for general consumption.

      So it is work in progress, but they aren't moving fast.

  • When will I actually be able to watch a movie without stutter every 20 seconds because their buffering is shot to hell and back? And don't think that you can simply pause and let it load, because it simply friggin' doesn't! Instead, if you dare to pause the movie for more than a few minutes it will most likely not play at all anymore.

Quite frankly, the recent year has seen the worst decline in quality YouTube has ever had. Now, I don't care too much about their "tie YouTube to your G+ account" spiel. I nev

    • Make sure that you are using the latest Adobe Flash Player (version 11.9).

Also make sure that you have enough bandwidth available (YouTube uses 2 Mb/s for 720p).

    • by dbraden ( 214956 )

I, too, have been having a terrible experience with YouTube, with it oftentimes just freezing at the halfway or two-thirds mark. It also seems that when you select a specific resolution, they take it as merely a suggestion. As you noticed, you can no longer let it buffer; they try to do some type of adaptive resolution switching, which more often than not results in the stream dropping down to 240p just because I chose to skip ahead. Very annoying.
