Samsung Develops First LCD Panel Using DisplayPort
SK writes "Samsung has developed the world's first LCD panel using the next-generation video interface — DisplayPort. Sanctioned by VESA (the Video Electronics Standards Association), DisplayPort will serve as a replacement for DVI, LVDS and eventually VGA. By using a transmission speed more than double that of today's interfaces, Samsung's new LCD only requires a single DisplayPort interface, instead of the two DVI (Digital Visual Interface) ports now used. The speed enables 2560x1600 resolution without any color smear."
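For what it's worth, the arithmetic behind that claim holds up. A rough Python check, using the commonly quoted figures rather than anything from the announcement: single-link DVI tops out at a 165 MHz pixel clock, and first-generation DisplayPort carries roughly 8.64 Gbit/s of video payload over four lanes (2.7 Gbit/s per lane raw, less 8b/10b coding). The 5% blanking overhead is an assumption.

    # Why 2560x1600 needs dual-link DVI but fits one DisplayPort link.
    # 165 MHz (single-link DVI) and ~8.64 Gbit/s (DP payload) are the
    # commonly quoted limits; the 5% blanking overhead is a guess.
    def pixel_rate(width, height, refresh_hz, blanking=1.05):
        """Approximate pixel clock in Hz, with ~5% blanking overhead."""
        return width * height * refresh_hz * blanking

    rate = pixel_rate(2560, 1600, 60)                    # ~258 MHz
    print(f"2560x1600@60 pixel clock: ~{rate / 1e6:.0f} MHz")
    print("Fits single-link DVI (165 MHz)?", rate <= 165e6)            # False
    print("Fits one DisplayPort link at 24 bpp?", rate * 24 <= 8.64e9)  # True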
Hope it gets off (Score:3, Insightful)
Re: (Score:2, Informative)
Re: (Score:2)
Of course, that won't mean as much if the optional DRM (DPCP, DisplayPort Content Protection) becomes a de facto standard, since it does have licensing fees.
DRM vs HDCP in DVI/HDMI (Score:4, Interesting)
HDMI and DVI are at least compatible with a cable.
Is DisplayPort?
Re: (Score:2)
I would guess 'no' since it's a different interface entirely. DVI and HDMI were essentially the same interface, just with different connector types.
Re:DRM is HDCP (Score:5, Interesting)
HDCP is mandatory.
So why not just use HDMI?
We do not need different standards for TV and computer if they do the same thing.
Re: (Score:3, Informative)
So why not just use HDMI?
Here DisplayPort has a huge advantage: it doesn't require licensing fees. This means that every manufacturer in China and Taiwan could implement this overnight.
However... implementing HDCP/DPCP does require a license fee, so if it becomes mandatory there wil
Re: (Score:2)
Re: (Score:2)
Mandatory to implement, or mandatory to use?
It's never mandatory to use, even if DRM gets implemented, unless you want to display protected content. So yes, I believe you could still display your desktop just fine. The software playing a movie, for example, would just refuse to do so if the whole path wasn't protected. Someone feel free to correct me if I'm wrong, I've never messed around with HDCP and hopefully never will.
My point, however, was that by making license-fee-requiring DRM _mandatory_ to implement on an otherwise license-fee-free spec
Re: (Score:2)
Re: (Score:2)
- DisplayPort
- DisplayPort/Secure
The idea being that anyone could implement the basic version without the support for encryption. The differing names would also avoid the confusion caused by version numbers. Heck, I work in the software industry and version numbers don't always describe the difference, so I doubt the layman would understand them any better. Having two differing versions would also allow the market to decide which one it really wants, as opp
HDCP is good for one reason (Score:3, Informative)
Re:DRM is HDCP (Score:5, Informative)
Released December 2002.
Single-cable digital audio/video connection with a maximum bitrate of 4.9 Gbit/s. Supports up to 165 Mpixel/s video (1080p60 Hz or UXGA) and 8-channel/192 kHz/24-bit audio.
HDMI 1.3
Released 22 June 2006.[7][8]
Increases single-link bandwidth to 340 MHz (10.2 Gbit/s)
=> 2560x1600 and beyond. Personally I feel 1920x1200 is enough; I don't need that huge a workspace, and it's highly unlikely that above-1080p resolutions will become common in the next decade or two.
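For what it's worth, the quoted figures are self-consistent. TMDS puts 10 bits on the wire for every 8-bit colour channel, three channels per link, so link bitrate is 30 times the pixel clock. A quick check (active pixels only; blanking ignored, so the pixel rates are lower bounds):

    # TMDS: 30 wire bits per pixel, so bitrate = 30 x pixel clock.
    for name, mhz in [("HDMI 1.0", 165), ("HDMI 1.3", 340)]:
        print(f"{name}: {mhz} MHz x 30 = {mhz * 30 / 1000:.2f} Gbit/s")

    # Active pixel rates, blanking ignored:
    print("1080p60:     ", 1920 * 1080 * 60 / 1e6, "Mpixel/s")   # ~124, fits 165 MHz
    print("2560x1600@60:", 2560 * 1600 * 60 / 1e6, "Mpixel/s")   # ~246, needs 1.3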
Re: (Score:3)
Also, I believe 640kB ought to be enough for everyone, and that the world needs, at most, perhaps five computers.
Re: (Score:2)
a) You have 1080p+ source material
b)
Re: (Score:1)
Re: (Score:2)
Re: (Score:3, Insightful)
plus, the most important one (Score:1)
Re: (Score:2)
I do not agree that a full HD picture is "perfect". I can easily see the difference between a book printed at 1000 dpi and a document printed on a cheap printer at 300 dpi. My monitor ha
Re: (Score:1)
Cinemas recommend about 30 degrees FOV as ideal because past that people get disoriented and nauseated
Horse shit; guess you have never experienced the joy of a truly immersive home entertainment system (50+ deg FOV). That 30 deg FOV is probably the minimum requirement, or people get shitty 'cause the screen is too small.
And yes, HDTV does look pathetic once you scale it up to 4m by 2m, but then your brain stops noticing and you really do start to enjoy it, and postprocessing tricks go a long way to help.
Honestly, does IMAX make you want to puke? You realise they do 180 deg FOV? Take your FUD elsewhere.
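The geometry everyone is arguing about is one formula: the horizontal FOV of a flat screen of width w seen from distance d is 2 * atan(w / (2 * d)). The setups below are illustrative guesses, not anyone's actual room; dome screens like IMAX wrap around the viewer, so the flat-screen formula doesn't apply to them.

    # Horizontal field of view of a flat screen, in degrees.
    import math

    def fov_deg(width_m, distance_m):
        return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

    print(f"4 m screen from 3 m:   {fov_deg(4.0, 3.0):.0f} deg")   # ~67: 'immersive'
    print(f"1 m screen from 2.5 m: {fov_deg(1.0, 2.5):.0f} deg")   # ~23: typical lounge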
Re: (Score:2)
Er... yeah actually... When the camera goes on a roller coaster ride, my stomach tightens in genuine anticipation. That's half the point of IMAX for me.
When IMAX first came out, they used to give a speech about how to 'just close your eyes if you feel nauseated or disoriented and the feeling will pass'.
Re: (Score:2)
2560x1600 and beyond. Personally I feel 1920x1200 is enough; I don't need that huge a workspace, and it's highly unlikely that above-1080p resolutions will become common in the next decade or two.
What about the 30" displays from Apple and Dell? Are you saying they will be so unsuccessful over the next decade that we shouldn't even standardize on a single-link protocol that can support them? That sounds like a recipe for a format war to me. It also sounds like you can't comprehend the professional and academic markets. 8MP cameras are common, and you don't see a need for a display with more than 2.3MP? Talk about short-sighted.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
I've been running 1536p for the last decade (2048x1536 CRTs, 19"-21", $250-$350 each). Why should I have to downgrade for the next?
Re: (Score:3, Interesting)
Re: (Score:2, Informative)
dual link DVI, not two ports/cables.... (Score:5, Informative)
You need 'dual-link' DVI, which is actually a single cable. I've got an old 7900GTX running my 30" Dell at that resolution, and while the card is a bit long in the tooth for current games, it uses a single cable and works just fine for work and CS:Source at native resolution.
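The distinction is links, not cables: dual-link DVI runs a second TMDS link through the same connector and cable, carrying alternate pixels. A rough check of why this panel crosses the single-link line (165 MHz cap; the 5% blanking overhead is a guess):

    # One cable, one or two TMDS links depending on the pixel clock.
    pixel_clock_mhz = 2560 * 1600 * 60 * 1.05 / 1e6    # ~258 MHz
    links = 1 if pixel_clock_mhz <= 165 else 2
    print(f"~{pixel_clock_mhz:.0f} MHz -> {links} TMDS links, still one DVI cable")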
Re:dual link DVI, not two ports/cables.... (Score:4, Informative)
Re: (Score:2)
Anyway, the
Re: (Score:2)
But then, your average consumer is retarded.
Re: (Score:2)
They may have meant in comparison to VGA, which should have gone the way of the Dodo along with the CRT, but seems to be alive and well regardless.
Re: (Score:2)
Re:dual link DVI, not two ports/cables.... (Score:4, Informative)
However, the DisplayPort panel in question uses 10 bits per color, which would require another cable even with dual-link DVI. As I understand DVI's handling of high-bit-depth displays, cable #1 would carry the most significant bits for its half of the screen on link #1 and the least significant bits on link #2, while cable #2 does the same for its half of the screen.
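A sketch of the splitting described above; this only illustrates that reading, it is not taken from the DVI spec. A hypothetical 10-bit channel value goes out as an 8-bit part on link #1 (usable on its own as a plain 8-bit pixel) and a 2-bit remainder on link #2:

    # Illustration only; not verified against the DVI spec.
    def split_10bit(value):
        msb = value >> 2        # top 8 bits -> link #1
        lsb = value & 0b11      # bottom 2 bits -> link #2
        return msb, lsb

    msb, lsb = split_10bit(0b1011011101)
    assert (msb << 2) | lsb == 0b1011011101   # recombines losslessly
    print(f"link #1: {msb:08b}  link #2: {lsb:02b}")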
Re: (Score:1)
It would be much easier for each cable to transmit one half of the image at full 10-bit depth and have the electronics inside the display interleave the lines to form the complete image.
If one cable were removed, the display could double each line received from the first cable, effectively lowering the resolution, but you would still have an image. Then, maybe, you would get the option to lower the color depth to 8bits/c
Why are we discussing this? (Score:1, Insightful)
Plug it back in?
Re: (Score:1)
The world is not made up only of geeks who know how to fix their computers.
But I agree, this is not the point of the article and there shouldn't be a need for two cables in the first place.
Re: (Score:1)
Re:dual link DVI, not two ports/cables.... (Score:5, Funny)
At least it is not TN (Score:1)
Zillion:1 fake contrast ratio, 160/160 viewing angles (yeah, right) and other marketing junk to hide the truth: TN is low-end.
Nice screenshots! (Score:3, Insightful)
Re: (Score:3, Funny)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Not needed (Score:5, Insightful)
Shouldn't they be putting forth a standard that will last a bit longer? Go for 10x speed, not just 2x.
This sounds like a rush to put out a new product, not for the sake of market need, but for the sake of patent royalties.
Re: (Score:2)
Re: (Score:3, Interesting)
Re: (Score:2)
1. In 1999 literally NOBODY needed dual-link DVI.
2. In 1999, the cost (in terms of silicon) for the second controller was too high to justify.
They have slowly introduced the second link as the need grew, and now most add-in boards have at least one dual-link port.
Unfortunately, the current connector has no room for gro
Re: (Score:2, Insightful)
Actually, it is license-free, so if it means what I think it means, there are no royalties to use this interface. I do believe it is rushed, though. Just look at the connector [wikipedia.org]; it looks like it doesn't even need any help to fall out of its socket. The wonderfully original name also says something. Apparently it can transmit audio data as well, so why doesn't the name at least give so
Re: (Score:1)
There shouldn't be royalties attached to any new display standards.
This seems like a rush on Samsung's end to be the first to use it. It is innovation in my eyes. If you sit back and wait for someone else to do it, then what kind of company are you? Some spin-off crap display manufacturer?
Re: (Score:1)
no companies actually cares about the growth of technology anymore
Re: (Score:1)
DisplayPort is royalty-free. Look it up.
hence the reason that companies compete over "standards" they are backing.
Have you looked at the back of a high-quality monitor before? They have every imaginable display standard, and if one doesn't, you can find one that does. How is this competing over standards? I think you're thinking about the current HD DVD and Blu-ray format war and how some companies are backing one format or the other.
no companies actually cares about t
Re: (Score:1)
So it is royalty free (my mistake), but does that mean that no one is making a profit from it? Someone somewhere will incorporate/patent this technology and make money on it, as it will then no longer be royalty free.
Have you looked at the back of a high-quality monitor before
I happen to own many, and sure they have, but these ports are there for very different reasons... none of them can completely replace the others, and high-end monitors need them all so as to
Re: (Score:1)
I did mention this in my previous reply, please read again.
So, you own your own business, good for you. As the thriving young businessman I am guessing you are (judging by the immaturity of your posts/arguments)....
Calling my posts/arguments immature is worse than what I typed. And to clarify what I typed, which I assume wha
Now if there were only more high-res eyes (Score:2, Interesting)
Anyway, most of the people who will buy this stuff are middle-aged and old people who get suckered by Circuit City salesmen and can't even see the resolution of a 20-year-old 27" TV hooked up to a VHS tape.
Re:Now if there were only more high-res eyes (Score:4, Funny)
I think we should judge for ourselves. Can't someone post a screenshot?
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Try using a high resolution theme for your windowing system and cranking up the font sizes. You'll run into weird graphical glitches because your apps suck, but you'll also get to see how much better higher resolution can look with fonts the same size on screen.
Re: (Score:3, Insightful)
There are gobs of commercial/industrial applications for hi-res monitors.
I couldn't even begin to list all the fields where this would get snapped up...
Please abandon the "just because I don't have a serious use for [X], then neither will anyone else" mode of thinking.
Re:Now if there were only more high-res eyes (Score:5, Insightful)
I used to say the same thing about HDTV. "TV looks fine now. How much better could it be?" Then I actually saw some HDTV programs. Then I said the same thing about HD-DVD/Bluray. "DVDs are sharp, like HDTV! How much better could it be?" Then I saw some HD-DVD movies on a 1080p TV.
It's going to be a long time before we stop having a need to increase resolution.
We also have a color problem these days. 24-bit (8 bits per component) color seems like a lot, but it doesn't compare to even 10-bit-per-component color. I can't imagine what a monitor with 12-bit-per-component color would look like, but I'm willing to bet it'll look better than what we've got now.
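You can see why on any smooth dark gradient. A quick illustration: quantise a dark quarter-range ramp at each depth and count how many distinct steps survive.

    # Distinct quantised levels across a dark 0..25% brightness ramp.
    def levels(bits, lo=0.0, hi=0.25, samples=4096):
        scale = (1 << bits) - 1
        return len({round((lo + (hi - lo) * i / samples) * scale)
                    for i in range(samples)})

    print("8-bit steps: ", levels(8))     # ~64: visible banding
    print("10-bit steps:", levels(10))    # ~256: far smoother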
Re: (Score:2)
As for the 10 bits per component: I think they will find that very, very few programs utilize this. It will be a pain to move beyond
Re: (Score:2)
Re: (Score:2)
We also have a color problem these days. 24-bit (8 bits per component) color seems like a lot, but it doesn't compare to even 10-bit-per-component color. I can't imagine what a monitor with 12-bit-per-component color would look like, but I'm willing to bet it'll look better than what we've got now.
I fully agree with you on resolution, but not color.
I can easily discern individual pixels with my eyes. I cannot display anything as thin as a hair on my screen, and even with antialiasing it only looks like a semi-blurred, slightly thick hair. Color is different: discerning two nearby colors in an 8-bit palette is almost impossible.
Huge resolutions are needed, because without tiny pixels it is just not possible to display tiny details. Assume the actors are reading black-on-white text on a piece of pap
Re: (Score:2)
Re: (Score:2)
certainly there is an upper bound (Score:1)
Cable length - what is the max? (Score:2, Interesting)
Ideally, I would like to be able to put the computer in another room and just run a long video cable, and then use the USB hub in the monitor to hook up everything else. This would be great for office environments too.
USB has the same cable-length problem, unfortunately.
Re: (Score:1)
The client doesn't want to hear your big 4 TB RAID array clanging away and the wonderful hum of a computer while he asks you to move that logo over to the left a bit.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
DRM? (Score:1, Redundant)
Re: (Score:2)
Re: (Score:1)
Hype... sort of. (Score:5, Informative)
The importance of DisplayPort is twofold. First, unlike DVI, it's an open standard, thus requiring no license. Second, although DisplayPort's capabilities don't offer much over DVI, the way it implements them does: it requires less electronics and simpler/smaller cabling, potentially making DisplayPort products significantly cheaper to produce.
Re: (Score:2, Insightful)
It also helps on graphics cards, where two DVI connectors take up a lot of space and do not leave much room for other connectors. Maybe with DisplayPort it would be possible to get graphics cards with more connectors for multi-screen environments.
It's about removing Silicon Image's lockhold (Score:5, Informative)
The problem was that consumer electronics and computer manufacturers didn't want to pay Silicon Image's skim for its patents on TMDS, which is used in DVI, HDMI and the now-dead UDI. Samsung, having been left out in the cold, led the charge to DisplayPort alongside HP and a few others. They defined the open standard using the PCI Express PHY and a new link layer with lots of resolutions, audio support, and anything else you could imagine. They were ready to put it out on the market with its own proprietary encryption scheme, called DPCP, when Intel led the Hollywood charge against it. They basically said DisplayPort had to use HDCP, which was about the only concession VESA made to them. Ironically, HDCP is far weaker than the AES-128 used in the original DPCP, but they wanted it anyway and got it. Bear in mind that VESA is essentially the DisplayPort working group today. This is also the primary reason why Samsung is the first one out of the gate with it.
So, this is the product that we have today. Intel has pretty much left Silicon Image to twist in the wind. However, DisplayPort has one other use, and that's to protect the video links on a system board. Today, virtually all LCD panels use LVDS signaling, which is power hungry and requires big, wide wiring harnesses between the board output and the panel input. DisplayPort was also designed as a chip-to-chip and board-to-board link so that people couldn't bypass copy protection by tapping the TV's LVDS output to the LCD and building a converter board to an unencrypted digital format. DisplayPort solves all of these problems and also allows for modes such as 120Hz and 240Hz panel refresh rates to combat motion blur and judder (which would require quad-link LVDS just for 120Hz at current 85MHz LVDS raw transmission rates). As a side note, Silicon Image touts iTMDS for a similar purpose, but it will never gain mass acceptance for the reasons already stated.
It's my guess that, in the next 4-5 years, LVDS will be supplanted by DisplayPort in all the "big 5" LCD manufacturers (LG/Philips, Sony/Samsung, CMO, AUO, and Sharp). AMD/ATI, nVidia and Intel mobos/GPUs will likely adopt this on a bigger scale starting next year. The one thing that's for sure is that all of the manufacturers not aligned to Silicon Image (read: everyone) are hell-bent on pushing through DisplayPort, no matter how painful or how long it takes. And all of us will get dragged along with it.
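The quad-link arithmetic above checks out roughly, assuming a conventional LVDS panel link moves one pixel per clock at up to the ~85 MHz mentioned above, and that panel links come in single/dual/quad configurations:

    # Links needed = pixel rate / ~85 Mpixel/s per link; blanking ignored.
    import math

    def lvds_links(width, height, hz, per_link=85e6):
        return math.ceil(width * height * hz / per_link)

    print("1080p60 :", lvds_links(1920, 1080, 60), "links")    # 2 -> dual today
    print("1080p120:", lvds_links(1920, 1080, 120), "links")   # 3 -> quad in practice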
Re: (Score:2, Informative)
So, this is the product that we have today. Intel has pretty much left Silicon Image to twist in the wind. However, DisplayPort has one other use, and that's to protect the video links on a system board. Today, virtually all LCD panels use LVDS signaling, which is power hungry and requires big, wide wiring harnesses between the board output and the panel input. DisplayPort was also designed as a chip-to-chip and board-to-board link so that people couldn't bypass copy protection by tapping the TV's LVDS output to the LCD and building a converter board to an unencrypted digital format. DisplayPort solves all of these problems and also allows for modes such as 120Hz and 240Hz panel refresh rates to combat motion blur and judder (which would require quad-link LVDS just for 120Hz at current 85MHz LVDS raw transmission rates). As a side note, Silicon Image touts iTMDS for a similar purpose, but it will never gain mass acceptance for the reasons already stated.
FUCK. I hate these DRM freaks. It's just as stupid as banning camcorders from movie theaters. Newsflash, you asshats: this isn't like drugs where you have to grow and smuggle a million tons of coke to sell a million tons of coke. So long as one, ONE person manages to rip a good copy of a movie, it can be duplicated an infinite number of times. Unless they come up with some sort of spiffy watermarking technique that can flawlessly identify ripped movies, a watermarking that cannot be stripped out or repaire
Re: (Score:2)
Re: (Score:2)
It's my guess that, in the next 4-5 years, LVDS will be supplanted by DisplayPort in all the "big 5" LCD manufacturers (LG/Philips, Sony/Samsung, CMO, AUO, and Sharp). AMD/ATI, nVidia and Intel mobos/GPUs will likely adopt this on a bigger scale starting next year. The one thing that's for sure is that all of the manufacturers not aligned to Silicon Image (read: everyone) are hell-bent on pushing through DisplayPort, no matter how painful or how long it takes.
I'm not so sure - at present I'm using DVI to connect a $100 graphics card to a $300 flat panel; one would think licensing costs would be a negligible fraction of the final selling price.
Furthermore, one would think a device supporting only DisplayPort would command a lower price than one supporting both DisplayPort and DVI, because many people have DVI hardware already, and that price premium would be greater than the licensing fee for DVI.
To me the idea of DisplayPort displacing DVI is similar to the sit
DisplayPort is rather old news (Score:1, Informative)
Re: (Score:1)
Mini DVI - DisplayPort (Score:3, Insightful)
The White Album (Score:2)
Not impressed (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
There, fixed it for you.
Apple (Score:2)
No License Fees (Score:1)
Given that, does anyone know where I can find DisplayPort transmitt
It will fail (Score:2)
Is it superior to HDMI Re: signalling issues? (Score:2)
I love the name (Score:1)
It reminds me of a story my boss told us about how, when he worked at Oracle, they spent gobs of $ on a team to name their internal DB server app, and after months and hundreds of thousands of dollars they came up with "WebDB" in 12-point Arial.
Re: (Score:2)
Re: (Score:2)