
Samsung Develops First LCD Panel Using DisplayPort

SK writes "Samsung has developed the world's first LCD panel using the next-generation video interface — DisplayPort. Sanctioned by VESA (the Video Electronics Standards Association), DisplayPort will serve as a replacement for DVI, LVDS and eventually VGA. By using a transmission speed more than double that of today's interfaces, Samsung's new LCD only requires a single DisplayPort interface, instead of the two DVI (Digital Visual Interface) ports now used. The speed enables 2560x1600 resolution without any color smear."
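A quick back-of-the-envelope check of the bandwidth claim, using round numbers (60 Hz refresh, ~10% blanking as with reduced-blanking timings, and the commonly cited DVI and DisplayPort link rates). Exact figures vary by spec revision, so treat this as a sketch rather than an authoritative comparison:

    # Rough bandwidth estimate for 2560x1600 at 60 Hz, compared against commonly
    # cited DVI and DisplayPort payload rates. All numbers are approximations.
    h, v, refresh = 2560, 1600, 60          # active pixels and refresh rate
    blanking = 1.1                          # ~10% extra for blanking intervals (assumed)
    bits_per_pixel_8 = 3 * 8                # RGB, 8 bits per component
    bits_per_pixel_10 = 3 * 10              # RGB, 10 bits per component

    def gbps(bits_per_pixel):
        return h * v * refresh * blanking * bits_per_pixel / 1e9

    single_link_dvi = 165e6 * 24 / 1e9      # 165 MHz pixel clock, 24 bpp ~= 3.96 Gbps
    dual_link_dvi = 2 * single_link_dvi     # ~= 7.92 Gbps
    displayport_4_lane = 4 * 2.7 * 8 / 10   # 4 lanes x 2.7 Gbps, 8b/10b coding ~= 8.64 Gbps

    print(f"2560x1600@60, 8-bit:  {gbps(bits_per_pixel_8):.1f} Gbps needed")
    print(f"2560x1600@60, 10-bit: {gbps(bits_per_pixel_10):.1f} Gbps needed")
    print(f"dual-link DVI ~{dual_link_dvi:.1f} Gbps, 4-lane DisplayPort ~{displayport_4_lane:.2f} Gbps")

On those rough numbers, 8-bit 2560x1600 fits within one dual-link DVI connection, while the 10-bit mode roughly fills a single four-lane DisplayPort link, which matches the summary's point about needing only one connector.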
  • Hope it gets off (Score:3, Insightful)

    by mariushm ( 1022195 ) on Saturday July 28, 2007 @10:20AM (#20022873)
    I honestly hope this gets off to a good start and that it will be supported by the industry. As far as I know, it has less (or no) DRM included and is much better at handling large resolutions.
    • Re: (Score:2, Informative)

      by z0M6 ( 1103593 )
DisplayPort 1.1 supports HDCP. Kind of sucks, but DisplayPort > HDMI
    • by HeroreV ( 869368 )
      The big deal with DisplayPort is that it's license free. It's got lots of other great properties, but that's what will really give this a big push.

      Of course, that won't mean as much if the optional DRM (DPCP, DisplayPort Content Protection) becomes a de facto standard, since it does have licensing fees.
  • by GodWasAnAlien ( 206300 ) on Saturday July 28, 2007 @10:22AM (#20022883)
So is there more DRM in this? Is it optional or mandatory?

    HDMI and DVI are at least compatible with a cable.

    Is DisplayPort?
    • HDMI and DVI are at least compatible with a cable. Is DisplayPort?

      I would guess 'no' since it's a different interface entirely. DVI and HDMI were essentially the same interface, just with different connector types.
    • Re:DRM is HDCP (Score:5, Interesting)

      by Anonymous Coward on Saturday July 28, 2007 @10:31AM (#20022945)
      "Exactly six months after the tech world was introduced to DisplayPort, the Video Electronics Standards Association (VESA) has proposed DisplayPort Version 1.1, which would bring high bandwidth digital content protection (HDCP) support to the standard. Previously, DisplayPort 1.0's copy protection support was described as "optional," but if the VESA DisplayPort Task Group has its way, it will become mandatory."

      HDCP is mandatory.

So why not just use HDMI?

We do not need different standards for TV and computer if they do the same thing.
      • Re: (Score:3, Informative)

        by Graftweed ( 742763 )
        The bad news is that DisplayPort supports DRM. Both HDCP and DPCP (DisplayPort Content Protection). Like you said, it isn't mandatory yet, but future revisions of the standard will almost surely make it so, which is why I'm not in a hurry to upgrade.

So why not just use HDMI?

        Here DisplayPort has a huge advantage: it doesn't require licensing fees. This means that every manufacturer in China and Taiwan could implement this overnight.

        However... implementing HDCP/DPCP does require a license fee, so if it becomes mandatory there wil

        • by Firehed ( 942385 )
Mandatory to implement, or mandatory to use? There's no reason to have your desktop encrypted, though I can see the validity in the claim of having HD content protected, no matter how much I disagree with it. As it so happens, you can just use AnyDVD HD and disable the ICT on HD movies that require HDCP for full-res playback and make your older 1080p display work its wonders.
          • Mandatory to implement, or mandatory to use?

            It's never mandatory to use, even if DRM gets implemented, unless you want to display protected content. So yes, I believe you could still display your desktop just fine. The software playing a movie, for example, would just refuse to do so if the whole path wasn't protected. Someone feel free to correct me if I'm wrong, I've never messed around with HDCP and hopefully never will.

            My point, however, was that by being _mandatory_ to implement license fee requiring DRM on an otherwise license fee free spec

            • by jZnat ( 793348 ) *
              Thankfully, by the time DRM'd content starts forcing use of HDCP, we'll have cracked AACS et al. so thoroughly that it won't matter at all. The only thing HDCP is good for is increasing the price of hardware and making early adopters buy new hardware all over again.
      • Another approach I would like to see:
        - DisplayPort
        - DisplayPort/Secure

        The idea being that anyone could implement the basic version without the support for encryption. The differing names would also avoid confusion caused by version numbers. Heck I work in the software industry and version numbers don't always describe the difference, so I doubt the layman would understand it any more. By having two differing versions it would also allow the market to decide which one they really want, as opp
It's easily cracked [freedom-to-tinker.com]. For some mysterious reason *COUGH*Intel*COUGH*, DisplayPort's original copy protection (the far better AES-128) had the kibosh put on it. That's fine - 40 exposed keys crack the whole system, as my link says.
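For anyone wondering why 40 exposed keys are enough: the linked analysis shows the HDCP shared key is a linear function of the devices' public key-selection vectors, so enough leaked device keys let you solve for the licensing authority's master secret. A toy sketch of that structure, shrunk from the real scheme's 40 keys and mod-2^56 addition down to 4-element vectors over a small prime so the arithmetic stays readable (all numbers purely illustrative):

    # Toy illustration of the linear structure behind HDCP-style key agreement.
    # Real HDCP uses 40-element vectors and addition mod 2**56; here we shrink to
    # n = 4 and a small prime modulus purely to keep the demo readable.
    import random

    n, p = 4, 10007                      # toy dimension and modulus (real: 40, 2**56)
    random.seed(1)

    # Licensing authority's secret: a symmetric n x n matrix A.
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            A[i][j] = A[j][i] = random.randrange(p)

    def keygen(ksv):
        """Device private keys = A . ksv (mod p); ksv is the public part."""
        return [sum(A[i][j] * ksv[j] for j in range(n)) % p for i in range(n)]

    def shared_key(priv, other_ksv):
        """Each side computes other_ksv . priv; symmetry of A makes both agree."""
        return sum(priv[i] * other_ksv[i] for i in range(n)) % p

    # Two legitimate devices derive the same key:
    ksv_a, ksv_b = [1, 0, 1, 1], [0, 1, 1, 0]
    assert shared_key(keygen(ksv_a), ksv_b) == shared_key(keygen(ksv_b), ksv_a)

    # Attacker: collect private keys from n devices with independent KSVs,
    # then recover A by solving linear equations mod p.
    ksvs = [[1, 0, 0, 0], [1, 1, 0, 0], [1, 1, 1, 0], [1, 1, 1, 1]]
    K = [keygen(v) for v in ksvs]        # leaked private key vectors

    def solve_mod(M, rhs):
        """Solve M x = rhs (mod p) via Gaussian elimination."""
        M = [row[:] + [r] for row, r in zip(M, rhs)]
        for col in range(n):
            piv = next(r for r in range(col, n) if M[r][col] % p)
            M[col], M[piv] = M[piv], M[col]
            inv = pow(M[col][col], -1, p)
            M[col] = [x * inv % p for x in M[col]]
            for r in range(n):
                if r != col and M[r][col]:
                    f = M[r][col]
                    M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
        return [row[n] for row in M]

    # Recover A one row at a time: A[i] . ksv_k = K[k][i] for every leaked device k.
    recovered = [solve_mod(ksvs, [K[k][i] for k in range(n)]) for i in range(n)]
    assert recovered == A                # the whole system is now broken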
    • Re: (Score:3, Interesting)

      by nurb432 ( 527695 )
      DRM is only optional in the beginning, to get you to switch.
  • ...instead of the two DVI (Digital Visual Interface) ports now used. The speed enables 2560x1600...

    You need a 'dual link' DVI - which is actually a single cable. I've got an old 7900gtx running my 30" Dell at that resolution - and while the card is a bit long in the tooth for current games, it uses a single cable and works just fine for work and CS:Source at native resolution.
    • by electromaggot ( 597134 ) on Saturday July 28, 2007 @10:41AM (#20023029)
      Good point. I've seen multiple posts on the internet where confused people think "dual link" DVI means it requires both of the DVI ports on your graphics card. If you look at the plug-ends of that single "dual link" cable, you realize it actually has a lot more pins packed in there than standard DVI cables! So the name, while maybe accurately descriptive, is perhaps a misnomer to consumers.
    • Dual link is definitely best, it's all in one connector. I've never tried to use two single link connectors, but that seems to be asking for trouble.

      Anyway, the /. story mentions color smear. I don't understand what that means. Dual link DVI does fine, it doesn't color smear either that I've seen. The article itself mentions that it's for 10bit color at the 2560x1600 resolution, otherwise requiring three links to do the job, but is it ready yet? Is their new LCD good enough yet to display better than 8
      • by aliquis ( 678370 )
        Considering everyone seems to buy 6-bit displays I guess no.

But then, your average consumer is retarded.
      • Anyway, the /. story mentions color smear. I don't understand what that means. Dual link DVI does fine, it doesn't color smear either that I've seen.

        They may have meant in comparison to VGA, which should have gone the way of the Dodo along with the CRT, but seems to be alive and well regardless.
      • by aliquis ( 678370 )
        Regarding smear I would assume that they compare it to VGA and not DVI. I have no idea if that resolution at whatever framerate would lead to smear using VGA but I guess it's possible.
    • by RalphBNumbers ( 655475 ) on Saturday July 28, 2007 @11:03AM (#20023181)
      That was my first reaction as well. And if you're only using 8 bits per color, then yes one dual-link cable will do.
However the display port panel in question uses 10 bits per color, which would require another cable even with dual-link DVI. As I understand DVI's handling of high bit depth displays, cable #1 would carry the most significant bits for its half of the screen on link #1, and the least significant bits on link #2, while cable #2 does the same for its half of the screen.
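A tiny sketch of the bit-splitting described above, following the commenter's reading of DVI's high-color mode rather than the spec itself: the 8 most significant bits of each 10-bit component go on one link, the remaining 2 bits on the other.

    # Illustrative only: split a 10-bit color component into the 8 MSBs
    # (sent on link #1) and the 2 LSBs (sent on link #2), then recombine.
    def split_10bit(component):
        msb = component >> 2          # high 8 bits for link #1
        lsb = component & 0b11        # low 2 bits for link #2
        return msb, lsb

    def recombine(msb, lsb):
        return (msb << 2) | lsb

    value = 0b1011010110              # a 10-bit sample (726)
    assert recombine(*split_10bit(value)) == value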
      • I'm not sure you would like to do that. What would happen if one cable suddenly gets removed?

It would be much easier for each cable to transmit one half of the image at full 10-bit depth and have the electronics inside the display interleave the lines and form the complete image.
If one cable is removed, the display would double the line received from the first cable, effectively lowering the resolution, but you would still have an image. Then, maybe you would get the option to lower the color depth to 8bits/c
        • by Anonymous Coward

          What would happen if one cable suddenly gets removed?

          Plug it back in?

Try telling that to your mother after she goes through the house with the vacuum, moves the computer, and messes up one of the cables, when all she knows how to do with a computer is turn it on, click on my nickname inside Yahoo Messenger and then click on "Call".

            The world is not made out only of geeks that know how to fix their computers.

            But I agree, this is not the point of the article and there shouldn't be a need for two cables in the first place.
            • Plugging in a cord that you just unplugged isn't exactly a geek-level skill. I'm sure anyone who's ever used electrical outlets could figure it out.
    • by Afecks ( 899057 ) on Saturday July 28, 2007 @11:41AM (#20023439)
      An old 7900GTX?? Do you have any old flying cars or solid gold toilets you want to get rid of?
  • TN panels=garbage that is dominating the marketplace.

    Zillion:1 fake contrast ratio, viewing angles 160/160 (yeah right) and other marketing junk to hide the truth: TN is low end.
  • Nice screenshots! (Score:3, Insightful)

    by Anonymous Coward on Saturday July 28, 2007 @10:38AM (#20023007)
    Those screenshots really show off the benefits of this new technology.

  • Not needed (Score:5, Insightful)

    by crow ( 16139 ) on Saturday July 28, 2007 @10:39AM (#20023013) Homepage Journal
    Since we have dual link DVI, and this only doubles the DVI data rate, how does this help?

    Shouldn't they be putting forth a standard that will last a bit longer? Go for 10x speed, not just 2x.

    This sounds like a rush to put out a new product, not for the sake of market need, but for the sake of patent royalties.
    • by Kjella ( 173770 )
They're way late to the party: there's DVI (on all current computers not still using old analog) and HDMI (smaller connector, standard on HDTVs, and it can carry sound). DisplayPort sounded like a poor and late idea when I first heard of it, and it doesn't look better now.
    • Re: (Score:3, Interesting)

      by imadork ( 226897 )
I hadn't heard of this standard until now, but since it's using a multi-lane high-speed serial protocol, there's probably nothing holding them back from expanding the current 4-lane architecture into an 8-lane or 16-lane architecture (other than redesigning the cable and connector, of course). Just like PCI Express, for instance.
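For a sense of how lane scaling works out, a rough payload calculation using the commonly quoted DisplayPort per-lane rate (2.7 Gbps with 8b/10b coding, so 80% of the raw bits are payload); the 8- and 16-lane cases are the parent's hypothetical, not anything in the spec:

    # Aggregate payload bandwidth = lanes * raw rate * coding efficiency.
    # 2.7 Gbps/lane and 8b/10b coding are the commonly cited DisplayPort figures;
    # the 8- and 16-lane variants are hypothetical, per the parent comment.
    raw_rate_gbps = 2.7
    coding_efficiency = 8 / 10            # 8b/10b: 8 payload bits per 10 line bits

    for lanes in (1, 2, 4, 8, 16):
        payload = lanes * raw_rate_gbps * coding_efficiency
        print(f"{lanes:2d} lanes: ~{payload:.2f} Gbps payload")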
      • But that's exactly what they did with DVI. The connector uses LVDS serial transmission lines [wikipedia.org] much like other modern interfaces. The link design was made scalable (two links) for two reasons:

        1. In 1999 literally NOBODY needed dual-link DVI.
        2. In 1999, the cost (in terms of silicon) for the second controller was too high to justify.

        They have slowly introduced the second link as the need grew, and now most add-in boards have at least one dual-link port.

        Unfortunately, the current connector has no room for gro
    • Re: (Score:2, Insightful)

      by Nozsd ( 1080965 )
      This sounds like a rush to put out a new product, not for the sake of market need, but for the sake of patent royalties.

Actually, it is license free, so if it means what I think it means, there are no royalties to use this interface. I do believe it is rushed though. Just look at the connector [wikipedia.org]; it looks like it doesn't even need any help to fall out of its socket. The wonderfully original name also says something. Apparently it can transmit audio data as well, so why doesn't the name at least give so
    • by Sentax ( 1125511 )
      This sounds like a rush to put out a new product, not for the sake of market need, but for the sake of patent royalties.

      There shouldn't be royalties attached to any new display standards.

This seems like a rush on Samsung's end to be the first to use it. It is innovation in my eyes. If you sit back and wait for someone else to do it, then what kind of company are you? Some spin-off crap display manufacturer?
      • there's always royalties... hence the reason that companies compete over "standards" they are backing.
no company actually cares about the growth of technology anymore
        • by Sentax ( 1125511 )
          there's always royalties

          DisplayPort is royalty-free. Look it up.

          hence the reason that companies compete over "standards" they are backing.

Have you looked at the back of a high-quality monitor before? They have every imaginable display standard. If it doesn't, then you can find one that does. How is this competing over standards? I think you're thinking about the current HD-DVD and Blu-ray format war and how some companies are backing one or the other format.

no company actually cares about t
          • DisplayPort is royalty-free. Look it up

            So it is royalty free (my mistake), but does that mean that no one is making a profit from it? Someone somewhere will incorporate/patent this technology and make money on it, as it will then no longer be royalty free.

            Have you looked at the back of a high-quality monitor before

I happen to own many, and sure they do, but these ports are there for very different reasons... none of them can completely replace the other, and high end monitors need them all so as to
            • by Sentax ( 1125511 )
If you want to see companies backing standards look at... oh what is that new technology called where we store data on a disk and watch it on our televisions... oh HD-DVD and Blu-ray.

              I did mention this in my previous reply, please read again.

So, you own your own business, good for you. As the thriving young businessman I am guessing you are (judging by the immaturity of your posts/arguments)....

              Calling my posts/argument immature is worse than what I typed. And to clarify what I typed, which I assume wha
  • Hooray, more ultra-high-resolution equipment for displaying low-res content to people who can't see the difference.

Anyway, most of the people who will buy this stuff are middle-aged and old people who get suckered by Circuit City salesmen and can't even see the resolution of a 20-year-old 27" TV hooked up to a VHS tape.
    • by h4rm0ny ( 722443 ) on Saturday July 28, 2007 @10:46AM (#20023057) Journal

      I think we should judge for ourselves. Can't someone post a screenshot?
    • Re: (Score:3, Interesting)

      by Jeff DeMaagd ( 2015 )
      Does Circuit City sell a 30" computer monitor? I'm not sure they sell anything larger than a 22", which is a lower ppi monitor anyway. Something like this is probably for young whipper snappers that have more money than sense.
Or better eyes. I dunno, I can't imagine running something at 1600x...I'd have to squint. Even 1280x makes my eyes work.
        • 1600 is huge on a 30" monitor.
        • Try using a high resolution theme for your windowing system and cranking up the font sizes. You'll run into weird graphical glitches because your apps suck, but you'll also get to see how much better higher resolution can look with fonts the same size on screen.

      • Re: (Score:3, Insightful)

        by TubeSteak ( 669689 )

        Something like this is probably for young whipper snappers that have more money than sense.
        No offense, but you're talking out your ass.

        There are gobs of commercial/industrial applications for hi-res monitors.
        I couldn't even begin to list all the fields where this would get snapped up...

        Please abandon the "just because I don't have a serious use for [X], then neither will anyone else" mode of thinking.
    • by DreadSpoon ( 653424 ) on Saturday July 28, 2007 @11:51AM (#20023507) Journal
      You can quite clearly see the difference. When the screens get up to 1000 DPI, then maybe we'll have a reason to stop increasing resolution. Until then, the pixels are still way too large. Look at how much effort goes into font rendering (and it still pretty much sucks). If we had 1000 DPI screens, or even 300 DPI screens for that matter, we wouldn't need sub-pixel anti-aliasing, font hinting, etc. And things would look super crystal clear.

      I used to say the same thing about HDTV. "TV looks fine now. How much better could it be?" Then I actually saw some HDTV programs. Then I said the same thing about HD-DVD/Bluray. "DVDs are sharp, like HDTV! How much better could it be?" Then I saw some HD-DVD movies on a 1080p TV.

      It's going to be a long time before we stop having a need to increase resolution.

These days we also have a color problem. 24-bit (8 bits per component) color seems like a lot, but it doesn't compare to even 10-bit-per-component color. I can't imagine what a monitor with 12-bit-per-component color would look like, but I'm willing to bet it'll look better than what we've got now.
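Some quick arithmetic behind both points, assuming a 30-inch 2560x1600 panel like the ones discussed in this thread (the diagonal is an assumption for illustration):

    # Pixel density of a 30" 2560x1600 panel, and the number of levels per
    # color component at different bit depths. Illustrative arithmetic only.
    import math

    h_px, v_px, diagonal_in = 2560, 1600, 30.0
    diag_px = math.hypot(h_px, v_px)
    ppi = diag_px / diagonal_in
    print(f"~{ppi:.0f} pixels per inch")        # ~101 ppi, far from 300-1000 dpi

    for bits in (6, 8, 10, 12):
        levels = 2 ** bits
        print(f"{bits:2d} bits/component: {levels} levels, {levels**3:,} total colors")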
      • by Tacvek ( 948259 )
There are problems though. High-DPI monitors are often used as though they were high resolution monitors of standard DPI. That is trivial to program, and many people like it just fine. On the other hand, it is harder to make other programs look decent. The problem is many programs work in pixel-based units and assume that a pixel is roughly the size of a pixel on a 96 DPI display.

        As for the 10 bit per component: I think they will find that very very few programs utilize this. It will be a pain to move beyond
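A minimal sketch of the scaling that pixel-based code skips when it hard-codes the 96 DPI assumption described above (the helper name is hypothetical, not any toolkit's real API):

    # Hypothetical helper: convert a size authored in "96 DPI pixels" (the usual
    # assumption in pixel-based UIs) to physical device pixels on a high-DPI screen.
    REFERENCE_DPI = 96.0

    def to_device_pixels(logical_px: float, device_dpi: float) -> int:
        return round(logical_px * device_dpi / REFERENCE_DPI)

    # A 16px icon authored for 96 DPI, drawn 1:1 on a 300 DPI panel, covers only
    # about a third of the physical size the designer intended:
    print(to_device_pixels(16, 300))   # 50 device pixels to keep the same physical size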
Windows (Aero Glass), Mac OS X (Aqua/Quartz), and Linux (X w/ Compiz/Beryl) are all moving toward resolution independence ( http://en.wikipedia.org/wiki/Resolution_independence [wikipedia.org] ). They do not seem to be there quite yet, but hopefully within a few years, it will be a standard feature. Particularly, your example of a 1000x1000 icon is silly: the icon would be vector graphics (ex. SVG) like many of the icons on my Linux system already are. Some people claim vector graphics are difficult to make pixel perfect
      • We also these days have a color problem. 24-bit (8-bit per component) color seems like a lot, but it doesn't compare to even 10-bit per component color. I can't imagine what a monitor with 12-bit per component color would look let, but I'm willing to bet it'll look better than what we've got now.

        I fully agree with you on resolution, but not color.

I can easily discern individual pixels with my eyes. I cannot display anything as thin as a hair on my screen, and even antialiasing it only makes it look like a semi-blurred, slightly thick hair. Color is different: discerning two near colors in an 8-bit palette is almost impossible.

        Huge resolutions are needed, because without tiny pixels it is just not possible to display tiny details. Assume the actors are reading black-on-white text on a piece of pap

        • Current technology does not do a good job of properly showing all the possible colors. Though it's not really a problem of throwing more bits at it, it's a problem of the way it works. Your monitor is only capable of displaying 3 colors - a specific shade of each red, blue, and green (well, 4 if you want to count black). To make another color, your monitor mixes these three colors together to fool your eye into thinking it sees the color you want to display. But it's not the same thing. Your monitor ca
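One way to see the mixing point above: in a linear model, a color is reproducible only if it can be written as a non-negative mixture of the display's three primaries; anything needing a negative weight is outside the gamut. The primary coordinates below are invented toy values, not measurements of any real panel:

    # Toy gamut check: a display can only show colors that are non-negative
    # mixtures of its three primaries. Coordinates are invented for illustration.

    # Columns are the (x, y, z) coordinates of the red, green, and blue primaries.
    primaries = [
        [0.64, 0.30, 0.15],   # x row
        [0.33, 0.60, 0.06],   # y row
        [0.03, 0.10, 0.79],   # z row
    ]

    def mix_weights(target):
        """Solve primaries @ w = target with plain Cramer's rule (3x3)."""
        def det3(m):
            return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                    - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                    + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
        d = det3(primaries)
        weights = []
        for col in range(3):
            m = [row[:] for row in primaries]
            for r in range(3):
                m[r][col] = target[r]
            weights.append(det3(m) / d)
        return weights

    inside = mix_weights([0.40, 0.40, 0.20])    # a muted color: all weights >= 0
    outside = mix_weights([0.10, 0.80, 0.10])   # a very saturated green
    print(inside, outside)                      # a negative weight means "out of gamut"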
      • because a screen cannot go bigger than your wall.
  • The single biggest problem with current video technologies is that it is not possible to have very long cables (50' +).

    Ideally, I would like to be able to put the computer in another room and just run a long video cable, and then use the USB hub in the monitor to hook up everything else. This would be great for office environments too.

USB has the same cable length problem, unfortunately.
    • by nrgy ( 835451 )
People have typically done this with video editing suites for client-side sessions: you stick the box back in your server room and just run long cables to the equipment in the suite.

The client doesn't want to hear your big 4TB RAID array clanging away and the wonderful hum of a computer while he asks you to move that logo over to the left a bit.
I was looking into the same thing myself. On eBay for $65 shipped, you can get a USB 2.0-to-Cat5e converter so you can make 150-foot USB runs if necessary. If you really wanted, stick a USB hub on the business end of things, put on a USB keyboard, mouse, monitor, maybe a sound card, and you've got a silent front end. Probably only good for videos and small things, as I doubt any of it is hardware accelerated. As for me, I put the noisy beast in the basement, and drilled a few small holes through the floor. Cheap a
If you look at the Wikipedia article [wikipedia.org], you will see that it has a copper cable length similar to that of HDMI. However, the 1.1 spec adds in a provision to use optical fiber, which should go as long as you need to. I'm very curious what type of fiber adapters you could use on copper plugs to put a device far away.
  • DRM? (Score:1, Redundant)

    by nighty5 ( 615965 )
    Is there any DRM garbage in this cable that works with Bluray etc? I'd like to stay clear of that shit.
  • Hype... sort of. (Score:5, Informative)

    by iamdrscience ( 541136 ) on Saturday July 28, 2007 @11:16AM (#20023265) Homepage
    The technical advantages of DisplayPort are minimal. Dual-link DVI can already do most of the things that DisplayPort does, and it has the advantage of already having decent market penetration. At first glance, I thought DisplayPort was doomed to become another in a long line of digital video standards that never caught on (LDI, OpenLDI, PanelLink, etc.). On closer examination, I think it might have a shot though.

    The importance of DisplayPort is two-fold. First, unlike DVI, it's an open standard, thus requiring no license. Second, although DisplayPort's capabilities don't have much over DVI, the way it implements capabilities does. Namely, it requires less electronics and simpler/smaller cabling, potentially making it significantly cheaper to produce DisplayPort products.
    • Re: (Score:2, Insightful)

      by bomanbot ( 980297 )
AFAIK, another advantage is that the actual DisplayPort connector is a good bit smaller than a DVI connector, which makes it easier to build DisplayPort into small portable devices.

      It also helps on graphics cards, where two DVI connectors take up a lot of space and do not leave much room for other connectors. Maybe with DisplayPort it would be possible to get graphics cards with more connectors for Multiscreen Environments.
    • by StandardCell ( 589682 ) on Saturday July 28, 2007 @01:25PM (#20024149)
      The short history is that VESA became a political organization unable to get anything passed through to replace analog VGA (e.g. NAVI). The Digital Display Working Group, led by Silicon Image, defined the DVI standard and never looked back, eventually defining HDCP encryption and adding onto DVI by defining HDMI. The only meaningful thing prior to DisplayPort and after analog VGA that VESA contributed to was the mounting hardware for monitors. You'll also notice that Samsung was not part of the original HDMI working group.

The problem was that consumer electronics and computer manufacturers didn't want to pay Silicon Image a skim for its patents on TMDS, which is used in DVI, HDMI and the now-dead UDI. Samsung, having been left out in the cold, led the charge to DisplayPort alongside HP and a few others. They defined the open standard using the PCI Express PHY and a new link layer with lots of resolutions, audio support, and anything you could imagine. They were ready to put it out to market with its own proprietary encryption scheme called DPCP when Intel led the Hollywood charge against it. They basically said DisplayPort had to use HDCP, which was about the only concession VESA made to them. Ironically, HDCP is far weaker than the AES-128 used in the original DPCP, but they wanted it anyway and got it. Bear in mind that VESA is essentially the DisplayPort working group today. This is also the primary reason why Samsung is the first one out the gate with it.

      So, this is the product that we have today. Intel has pretty much left Silicon Image to twist in the wind. However, DisplayPort has one other use, and that's to protect the video links on a system board. Today, virtually all LCD panels use LVDS signaling, which is power hungry and requires big wide wiring harnesses between the board output and the panel input. DisplayPort was also designed for a chip-to-chip and board-to-board link so that people couldn't bypass copy protection by taking their TV's LVDS output to the LCD and building a converter board to unencrypted digital format. DisplayPort solves all of these problems plus allows for modes such as 120Hz and 240Hz panel refresh rates to combat motion blur and judder (which would require quad-link LVDS just for 120Hz at current 85MHz LVDS raw transmission rates). As a side note, Silicon Image touts iTMDS for a similar purpose, but it will never gain mass acceptance for the reasons already stated.

      It's my guess that, in the next 4-5 years, LVDS will be supplanted by DisplayPort in all the "big 5" LCD manufacturers (LG/Philips, Sony/Samsung, CMO, AUO, and Sharp). AMD/ATI, nVidia and Intel mobos/GPUs will likely adopt this on a bigger scale starting next year. The one thing that's for sure is that all of the manufacturers not aligned to Silicon Image (read: everyone) are hell-bent on pushing through DisplayPort, no matter how painful or how long it takes. And all of us will get dragged along with it.
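As a rough sanity check of the quad-link LVDS figure above, assuming a 1920x1080 panel, ~10% blanking overhead, and the 85 Mpixel/s per-link rate quoted in the comment (round numbers, illustration only):

    # How many LVDS links does 1080p need at various refresh rates if each link
    # moves ~85 Mpixels/s? Rough numbers for illustration only.
    import math

    h, v, blanking = 1920, 1080, 1.1          # ~10% blanking overhead assumed
    link_rate = 85e6                          # pixels per second per LVDS link (per parent)

    for refresh in (60, 120, 240):
        pixel_rate = h * v * refresh * blanking
        links = math.ceil(pixel_rate / link_rate)
        print(f"{refresh:3d} Hz: ~{pixel_rate/1e6:.0f} Mpix/s -> {links} link(s)")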
      • Re: (Score:2, Informative)

        by jollyreaper ( 513215 )

        So, this is the product that we have today. Intel has pretty much left Silicon Image to twist in the wind. However, DisplayPort has one other use, and that's to protect the video links on a system board. Today, virtually all LCD panels use LVDS signaling, which is power hungry and requires big wide wiring harnesses between the board output and the panel input. DisplayPort was also designed for a chip-to-chip and board-to-board link so that people couldn't bypass copy protection by taking their TV's LVDS output to the LCD and building a converter board to unencrypted digital format. DisplayPort solves all of these problems plus allows for modes such as 120Hz and 240Hz panel refresh rates to combat motion blur and judder (which would require quad-link LVDS just for 120Hz at current 85MHz LVDS raw transmission rates). As a side note, Silicon Image touts iTMDS for a similar purpose, but it will never gain mass acceptance for the reasons already stated.

        FUCK. I hate these DRM freaks. It's just as stupid as banning camcorders from movie theaters. Newsflash, you asshats: this isn't like drugs where you have to grow and smuggle a million tons of coke to sell a million tons of coke. So long as one, ONE person manages to rip a good copy of a movie, it can be duplicated an infinite number of times. Unless they come up with some sort of spiffy watermarking technique that can flawlessly identify ripped movies, a watermarking that cannot be stripped out or repaire

      • It's my guess that, in the next 4-5 years, LVDS will be supplanted by DisplayPort in all the "big 5" LCD manufacturers (LG/Philips, Sony/Samsung, CMO, AUO, and Sharp). AMD/ATI, nVidia and Intel mobos/GPUs will likely adopt this on a bigger scale starting next year. The one thing that's for sure is that all of the manufacturers not aligned to Silicon Image (read: everyone) are hell-bent on pushing through DisplayPort, no matter how painful or how long it takes.

        I'm not so sure - at present I'm using DVI to connect a $100 graphics card to a $300 flat panel; one would think licensing costs would be a negligible fraction of the final selling price.

        Furthermore, one would think a device supporting only DisplayPort would command a lower price than one supporting both DisplayPort and DVI, because many people have DVI hardware already, and that price premium would be greater than the licensing fee for DVI.

        To me the idea of DisplayPort displacing DVI is similar to the sit

  • by Anonymous Coward
Ad Terras Per Aspera (featured on Slashdot three or four times in the past few years) has already discussed DisplayPort here [adterrasperaspera.com] and here [adterrasperaspera.com].
  • by Panascooter ( 948131 ) on Saturday July 28, 2007 @11:42AM (#20023449)
I have already spent $20 for a MiniDVI -> DVI (actually DVI-D) adapter, another $20 for a MiniDVI -> VGA (due to the incompatibility of DVI-D and VGA), and another $20 for a MiniDVI -> S-Video and Composite video adapter for my MacBook. Does this mean I have to spend yet another $20 for yet another display option? Good thing I didn't start with an iBook, or I'd have repeated the whole process already.
This new connection doesn't seem to bring much to the table. I remember the Apple Display Connector, which passed DVI, USB and power in a single cable; sure, it had limitations and was proprietary, but it really helped reduce the clutter. Why can't these new display standards bring more functionality rather than just DRM?
    • by Thaelon ( 250687 )
      DRM isn't a functionality, it's an impediment.
Personally I think it would be nice to have everything in one cable running to the monitor, and then everything just plugs into that. Even if the cable is an inch thick, I'd still be happier. Instead, I have a spider web of cables going from my desk to my tower. Also, all devices should come with cables that detach at both ends so I can buy a cable that is the right length. I just bought a pair of headphones [sonystyle.ca], and the cable is 3.5 metres long. They are home theatre headphones, so the length is
Why can't these new display standards bring more functionality like DRM?


      There, fixed it for you.
  • by b1ng0 ( 7449 )
Watch for Apple rebranding these and replacing their current lineup of LCDs. Apple is a huge Samsung investor and undoubtedly has some say in the direction of product lines. And we all know Apple is usually the first to switch to new standards.
A lot of the industry is supporting DisplayPort because there are supposed to be no license fees. It supports HDCP, but it is not mandatory; just like with HDMI, you don't have to use HDCP to transmit an image. Also, DisplayPort can transmit over long distances (15m at 1080p). And yes, HDMI/DVI is not compatible with DisplayPort, since DisplayPort embeds the clock signal in the color streams. http://en.wikipedia.org/wiki/Display_port [wikipedia.org]

    Given that, does anyone know where I can find DisplayPort transmitt
It will fail because of the DRM nonsense. Not just for the usual reasons: this crap that tests the distance to the screen and other bullshit will make the connector cables stupidly expensive, will probably mean you have to pay a license to produce one (I can't possibly see how that could be bad for adoption), and then you will get a bunch of incompatible devices and users screaming for something else. In short, it will suck, it will be hated, and it will die. VGA has stayed around so long for a few s
  • I once read an article that heavily criticized the complexity of HDMI which made it extremely challenging to create cables of any length. The issue seemed to be that instead of simply implementing a serial stream of data, they have several parallel streams traveling over different twisted pairs, which all need to keep in sync in order to get a clean picture. Easy with lower resolutions or very short cable runs, but once you pump up the resolution you must also pump up the bandwidth. The greater the rez, the
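To put the sync problem in rough numbers: each TMDS data pair carries 10 line bits per pixel clock, so the time per bit shrinks as the pixel clock rises, and the skew and jitter budgets shrink with it, which is part of why long runs get harder at higher resolutions. The 148.5 MHz figure for 1080p60 is a standard one; the 2560x1600 reduced-blanking clock is an approximation used only for illustration:

    # TMDS sends 10 line bits per pixel clock on each data pair, so the bit time
    # shrinks quickly as the pixel clock rises. Illustrative arithmetic only.
    for label, pixel_clock_mhz in (("1080p60", 148.5), ("2560x1600@60", 268.0)):
        bit_rate = pixel_clock_mhz * 1e6 * 10          # bits/s per TMDS pair
        bit_time_ps = 1e12 / bit_rate
        print(f"{label:14s}: {bit_rate/1e9:.2f} Gbps/pair, one bit = {bit_time_ps:.0f} ps")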
DisplayPort.
It reminds me of a story my boss told us about how, when he worked at Oracle, they spent gobs of $ on a team to name their internal DB server app, and after months and hundreds of thousands of dollars came up with "WebDB" in Arial 12-point font.
