
New DisplayPort 1.4 Standard Can Drive 8K Monitors Over A USB Type-C Cable (arstechnica.com) 156

AmiMoJo writes: VESA has finalized and released the DisplayPort 1.4 spec, which can drive 60Hz 8K displays and supports HDR color modes at 5K and 8K. The physical interface used to carry DisplayPort data -- High Bit Rate 3 (HBR3), which provides 8.1Gbps of bandwidth per lane -- is still the same as it was in DisplayPort 1.3. The new standard drives higher-resolution displays with better color support using Display Stream Compression (DSC), a "visually lossless" form of compression that VESA says "enables up to [a] 3:1 compression ratio." This data compression, among other things, allows DisplayPort 1.4 to drive 60Hz 8K displays and 120Hz 4K displays with HDR "deep color" over both DisplayPort and USB Type-C cables. USB Type-C cables can provide a USB 3.0 data connection, too.
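For a rough sense of why DSC matters here, consider an illustrative back-of-the-envelope calculation (a sketch only: it assumes a four-lane link with 8b/10b encoding, takes the 8.1Gbps-per-lane and "up to 3:1" figures above at face value, and ignores blanking and protocol overhead):

    # Illustrative DisplayPort HBR3 link-budget sketch (assumptions as noted above)
    LANES = 4
    HBR3_GBPS_PER_LANE = 8.1            # raw line rate per lane (HBR3)
    LINK_EFFICIENCY = 8 / 10            # 8b/10b channel coding
    effective_gbps = LANES * HBR3_GBPS_PER_LANE * LINK_EFFICIENCY   # ~25.9 Gbps

    def pixel_gbps(width, height, hz, bits_per_pixel):
        """Uncompressed pixel bandwidth in Gbps, ignoring blanking intervals."""
        return width * height * hz * bits_per_pixel / 1e9

    modes = {
        "4K 120Hz, 10-bit HDR": (3840, 2160, 120, 30),
        "8K 60Hz, 10-bit HDR":  (7680, 4320, 60, 30),
    }
    for name, args in modes.items():
        raw = pixel_gbps(*args)
        with_dsc = raw / 3              # "up to 3:1" Display Stream Compression
        print(f"{name}: {raw:.1f} Gbps raw, {with_dsc:.1f} Gbps with 3:1 DSC "
              f"(link carries about {effective_gbps:.1f} Gbps)")

Both modes need more than the roughly 26Gbps the link can actually carry when uncompressed, but fit comfortably once DSC is applied; that is the role the compression plays in the new spec.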
  • by Anonymous Coward on Wednesday March 02, 2016 @08:29PM (#51625851)

    Call it what it is. Don't break terminology for marketing.

    • by rh2600 ( 530311 ) on Wednesday March 02, 2016 @08:35PM (#51625875) Homepage
      Agreed... 'visually' lossless for images is a bit like saying 320kbps MP3s are 'audibly' lossless for music... Something is either lossless or not, it's a binary...
      • Re: (Score:3, Informative)

        by cheater512 ( 783349 )

        Erm, 320kbps MP3s ARE audibly lossless.

        They aren't actually lossless but no one can ever tell the difference in a proper double blind study.

        • by epyT-R ( 613989 )

          Not when you have to convert them to something else.. Also, it depends on the source and the encoder used.

        • They aren't actually lossless but no one can ever tell the difference in a proper double blind study.

          It depends on the song. Even at 320kbps LAME, I can tell the difference between the compressed and uncompressed versions of a particular remix of Moroder's "The Chase" because the square waves go all to shit.

      • by lgw ( 121541 ) on Thursday March 03, 2016 @01:23AM (#51626959) Journal

        Something is either lossless or not, it's a binary

        Assuming that everything is black or white is a poor start to a discussion about visual fidelity.

    • It means instruments can tell the difference but humans can't. Welcome to 1989.

      I'll still need two cables for 8K at 120Hz, eh? That's OK, I've also been waiting for this kind of resolution for thirty years, so I'll be an early adopter this round. Maybe I'll upgrade to a 1.5-based spec when they get around to it.

    • Don't assume everything is black or white.

      There are varying degrees of "lossy" - though I agree, "visually lossless" doesn't give a very scientific description of where this lies on the scale. But there is a scale.
      • There are varying degrees of "lossy"

        Yes, there are. But there is only ONE degree of lossless. Just like there are an infinite number of positive numbers, but only one zero.

      • no, it's a binary attribute. it's either "lossless" or "lossy". the fact that someone may not notice does not change that fact.

      • Losing information is losing information. It doesn't matter what marketing BS you smear around it, a fact is a fact. Whether it makes a difference to the end-user is a subject on its own. 3 and 2.999999996 for many things are perceptibly identical, but in fact are not. That rounding error in another context may well spell disaster.
        • That rounding error in another context may well spell disaster

          True, in another context. But to be fair, in this context the rounding error is basically meaningless. It would be nice if they were more up front about this not being lossless, though.

        • Losing information is losing information. It doesn't matter what marketing BS you smear around it, a fact is a fact. Whether it makes a difference to the end-user is a subject on its own. 3 and 2.999999996 for many things are perceptibly identical, but in fact are not. That rounding error in another context may well spell disaster.

          Whine whine whine. This is about displaying a picture to look at. If you can't tell that there is a difference by looking at the picture, it fucking doesn't matter if you could measure a difference. You aren't looking at the measurement all the fucking time - and if you did, you couldn't tell whether it was sent over this link.

    • Re: (Score:3, Insightful)

      by TeknoHog ( 164938 )
      This. I thought the whole idea of digital display connections was to make things bit-exact. Let's just go back to VGA and the fun of adjusting displays to the signal. Actually, let's go all the way back to analogue computers while we're at it.
      • No, I'm pretty sure the whole idea of digital display connections was to remove the unnecessary digital to analog and back to digital conversion.

      • analogue computers while we're at it.

        The audiophiles would love that... they could be videophiles too! I'm off to patent "sending video display signals via a coathanger (on a computer)" right now.

      • by AmiMoJo ( 196126 )

        There is a whitepaper with little detail: http://www.vesa.org/wp-content... [vesa.org]

        Reading that, it sounds pretty far from lossless. They say that it mostly worked at 8bpp, but wasn't perfect. So not visually lossless at all. They tested with still images too; no word on how it copes with movement. Could be some nasty artefacts as things change from frame to frame, much like how constant-bitrate video gets noisy and then suddenly clears up as movement settles down.

        Considering the effort require I'm surprised they b

    • by chuckugly ( 2030942 ) on Wednesday March 02, 2016 @10:29PM (#51626413)
      Very true, however if you have to choose between 30fps and shallow color (which is brutally throwing away a fixed amount of data) or an algorithm that can much more intelligently decide which parts of a 60fps HDR stream matter least, the 'lossy' version is very likely to look better and exhibit better fidelity with a 60fps HDR uncompressed original, even if nothing is 'lost' in the standard color 30fps version after the downconversion.
    • by epyT-R ( 613989 )

      Yup.. there's already too much 'lossy' in hd for the term to mean much anymore.. I don't want to compound it with lossy in my display connection.

    • by Art3x ( 973401 ) on Thursday March 03, 2016 @02:41AM (#51627091)

      I thought the same thing until I read an article by David Newman, an engineer for Cineform. He personally defined visually lossless as "when the compression error falls well below the inherent noise floor of the imaging device" (Visually Lossless and how to back it up [blogspot.com]).

      He says, more or less, that if you set a camera on a tripod and shoot a still life of, say, a bowl of fruit, there still will be a difference from one frame to the next in the video, even in a totally uncompressed signal. This can mainly be blamed on noise in the image sensor. All sensors have a noise floor. So first you measure what that noise is. Then you measure how much degradation a certain compression introduces. If the difference between the uncompressed and compressed signal is less than or equal to the difference between uncompressed frames, then you might call the codec visually lossless.

      Actually he takes it one step further. He averaged 72 frames of the stationary object to mostly remove the noise even from the image sensor. He then saw whether the compressed image differed from this "golden frame" by more than any given uncompressed frame differed from it.

      Yes, yes, yes, there's no telling what standard VESA used, but at this point I think visually lossless can have some meaning. Usually, in fact, video that's called visually lossless is very, very good and can only be discerned at much closer-than-average viewing distances, and often with various image enhancements to bring out the noise. In normal viewing conditions, most video professionals, and certainly even more consumers, cannot tell the difference between the original and any of the codecs that tout themselves (scientifically or otherwise) as visually lossless.
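      As a purely illustrative sketch of the criterion described above (this is not VESA's or Cineform's actual test procedure; the frame data and the codec round-trip function are placeholders):

          import numpy as np

          def rms_error(a, b):
              """Root-mean-square difference between two frames."""
              return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

          def visually_lossless(frames, codec):
              """frames: repeated captures of a static scene; codec: an encode/decode round-trip."""
              golden = np.mean(frames, axis=0)                               # average away sensor noise
              noise_floor = max(rms_error(f, golden) for f in frames)        # worst per-frame sensor noise
              codec_error = max(rms_error(codec(f), golden) for f in frames) # worst compression error
              return codec_error <= noise_floor, noise_floor, codec_error

      A codec would "pass" when its error against the averaged golden frame is no larger than the error that sensor noise alone introduces from frame to frame.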

    • by thsths ( 31372 )

      Indeed. "Nearly" is marketing speak for not.

  • by FlyHelicopters ( 1540845 ) on Wednesday March 02, 2016 @08:34PM (#51625871)

    I have to say that I'm more excited about 4k at 120hz than 8k at 60hz, but it is all an improvement.

    As it stands now, 4k displays are wonderful for work, I am typing this on my office computer which has a pair of Acer 32" 4k displays on it.

    Acer 32" 4K IPS display
    http://amzn.to/1poiivZ [amzn.to]

    They are beautiful monitors. Not perfect color and I wouldn't suggest them if 100% color accuracy is your goal, but for general business use, they are just about the perfect combination of size and resolution. My home machine runs a trio of Dell 30" 1600p monitors, and while they are nice for gaming, I can tell the difference between a 30" 1600p monitor and a 32" 4k monitor when it comes to text in Windows. Almost all "jaggies" are gone at 4k, the text is the closest I've ever seen a monitor to get to "paper" look. The 30" 1600p monitors still show "jaggies" in Windows text.

    Now for gaming, they aren't quite there yet. Between the slower response time of IPS and the inability to get decent GPU performance, 4k is a rough experience. I tried several games and I found that while they are beautiful, the limit is the GPU power.

    I did try only a single GPU (a GTX 980 Ti); I imagine a dual-GPU SLI configuration would be better, but I didn't have a second 980 Ti to try that out with. 8K gaming will be a while off, if for no other reason than lack of GPU power.

    ---

    TL;DR - 120hz should be the new standard, it will reduce eyestrain and open up options for gaming and movies that don't exist at 60hz, while the HDR improvements will also be wonderful. I'm not convinced that 8k will show up any time soon or even be needed, but time will tell on that one.

    • by Anonymous Coward on Wednesday March 02, 2016 @09:08PM (#51626045)

      According to the specs for that 32" monitor you linked the PPI (pixels per inch) is 3840/29.1 = 131. The correct text scale setting for your 4K monitor is only 125% (the ideal value for your 4K monitor is 131/96 = 136%). Your Dell 30" 2560x1600 monitor should be running at 100% text scale (aka. 96 PPI), at least that was what I used on my old one.

      Going from 100% to 125% is nowhere near enough to remove the "jaggies" in the Windows text rendering. The most likely reason you like the text better on your 4K monitor is better (newer) IPS display technology in your 4K monitor.

      The ideal size for a 4K monitor for office work is 24", as that puts the scale at exactly 200%. The ideal replacement for your old 30" is a 27" 5K monitor, as that also puts the scale at exactly 200%.

      Why 200%? Because that's the magical scale that makes it a "retina" monitor, where the OS can upscale old applications in a linear fashion without aliasing artifacts like blurring.

      As for HDR improvements, as long as Microsoft's Desktop Window Manager (DWM) only supports 24 bpp you will never see the HDR feature unless you launch a fullscreen game.

      • by Anonymous Coward

        I have a 40" 4k. Much better for office work as you have some real estate to work with.

        • by jcdr ( 178250 )

          Fully agree. My first 4K screen was too small (28" or 32", I don't remember) and my vision suffered with it. I tested the 40" 4K Philips BDM4065UC and was very impressed by it. Can't go back to anything else. My girlfriend has the same monitor now and finds the ones she uses at work pretty ridiculous in comparison.

      • My 30" panels have 4 million pixels on them. My 32" panels have 8 million pixels on them. The 32" panels have sharper text because they have double the number of pixels to work with.

        The pixel density and sharpness is a nice improvement over the 30" panels, it is a clear and obvious improvement. It is worth noting the 30" panels and 32" panels are actually about the same size, since the 32" panels are 16:9 and the 30" panels are 16:10.

        The 30" panels are also better screens, in terms of color balance and s

      • by troon ( 724114 )

        Bit of pedantry: the 29.1" width includes the bezel. If it's a real 32" diagonal and a real 16:9 aspect as per the specs, the screen width is 27.9" so 138ppi. #pythagoras
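        For anyone who wants to check the arithmetic, a tiny sketch (assuming the quoted diagonal and aspect ratio are exact):

            import math

            def horizontal_ppi(h_pixels, v_pixels, diagonal_inches):
                """PPI derived from the diagonal and aspect ratio, not the cabinet width."""
                width_inches = diagonal_inches * h_pixels / math.hypot(h_pixels, v_pixels)
                return h_pixels / width_inches

            print(horizontal_ppi(3840, 2160, 32))   # ~137.7 PPI for a true 32" 16:9 4K panel
            print(horizontal_ppi(2560, 1600, 30))   # ~100.6 PPI for a 30" 16:10 1600p panel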

      • by AmiMoJo ( 196126 )

        It depends on your eyesight... 4k @ 27" is pretty good for most people, with either 200% scaling if you have good vision or 150% if you prefer readability. Okay, at 150% you don't get perfect 2x scaling, but 97% of the time it looks perfectly fine.

    • by Archfeld ( 6757 )

      I totally agree. I watch my hockey in 1080 HD at 240hz and it is unbelievably clear with almost no background blur at all. So much better, that watching 60hz broadcasts makes me think I am losing my mind.

      Go Sharks !!!

      • I totally agree. I watch my hockey in 1080 HD at 240hz and it is unbelievably clear with almost no background blur at all. So much better, that watching 60hz broadcasts makes me think I am losing my mind.

        Go Sharks !!!

        You are aware that the original video signal is still stuck at 1080i@60Hz, and that the display interpolates images in between? And that the fact that this looks "better" is proof that the whole process is not "visually lossless"?

        • by Archfeld ( 6757 )

          Honestly no, video interpolation is far from my realm of interest, or understanding. What I am aware of is the background crowd no longer blurs when the camera swings quickly, and the puck remains focused nearly all the time. Tennis matches seem to benefit greatly from the lack of background blur as well. I rarely watch anything but sports so I can't make a statement about anything else. Thanks for the info though.
          Knowing is half the battle, not giving a shit makes up most of the rest of the conflict.

    • by dbIII ( 701233 )
      I got a 28 inch 4k screen because it was the largest I could find that could swivel to portrait mode without an extra stand. MS Windows 7 has problems with the thing around every three weeks and will not work using displayport unless I power off both the computer and the monitor for more than a few seconds. Linux works with the nvidia driver which is a very similar codebase to their MS Windows driver so I don't know what is going on.
      However, with that size or above being able to swivel doesn't really matt
    • Just a few months ago, I replaced my multiple monitors with a 55" Samsung JU7500 TV. Apparently all the 2015 Samsung TVs can do proper 60hz 4K over HDMI 2.0. I picked this particular model for its fast response time: 34ms @ 4K (PC mode) and 21ms @ 1080p (game mode). Color accuracy is obviously not 100% as this is a PVA panel, but the curved screen helps with uniformity, and a little calibration goes a long way. I went SLI 980, mostly to see how Crysis 3 would look (amazeballs). You really don't need to

        Thanks for the info; I was not aware that any 4K TVs were 60Hz. Shame they don't put DisplayPort on them, as that would make it easier.

        You bought a 7500 model; what do you think of the 6500 models? Do they have 60Hz HDMI 2.0 inputs?

        http://amzn.to/1QtpZaV [amzn.to]

        That is a 55" unit for $900, far less than the cost of the unit you bought. It appears to be slightly lower quality, with a 120 refresh rate instead of 240, but that would be fine for me.

        At $900, it costs the same as a 32" Acer 4k computer monitor, making it a

        • I think the 6500 would be more than acceptable. I got very good promo pricing on the 7500 during the holidays, otherwise I would have gone with the 6500 or 6700. I suppose you could buy from a store with a lenient return/satisfaction policy if you need the peace of mind. I based my decision on the detailed reviews from RTings.com. Here's their analysis of the 6500:

          http://www.rtings.com/tv/revie... [rtings.com]

          The lack of DisplayPort is no longer an issue, as there is finally a true DP->HDMI 2.0 adapter (Club

  • Looks like all the connectors are going away and everything is standardizing on USB Type-C connector. Good. Everything piggybacks over USB 3.1 including power. This looks like the best thing.

    • by Parafilmus ( 107866 ) on Wednesday March 02, 2016 @09:15PM (#51626071) Homepage

      Hmm... so you're saying my keyboard's gonna have an AT connector attached to a PS/2 adapter attached to a USB adapter attached to a Type C adapter? Sounds good to me. Bring it on!

    • by Anonymous Coward

      Sounds dangerous. Once all devices use USB with no exception, the USB licensing body could suddenly charge an extra $20 for each device with USB support and there would be nothing we could do about it. While I'm pro standards, I'm not happy with a standard which isn't free. Old standards like the serial and parallel ports were completely free; anybody could make their own device and hook it up to the computer. USB devices need to be signed, and a license to make signed USB devices costs $4000/year, and I think that's for e

      • I work for a business that makes only a couple dozen units a year (scientific instruments), but we don't pay USB licenses; we just use chips or modules that have already paid the fees. And your simple serial and parallel ports were not well standardized: you can find varying voltages and even pin usages on serial connections, which was a huge mess for esoteric equipment that needed to assume some voltage, etc.

    • Looks like all the connectors are going away and everything is standardizing on USB Type-C connector. Good. Everything piggybacks over USB 3.1 including power. This looks like the best thing.

      Ah, the "universal" serial bus. This will be great until USB 4 which is a completely new technology, but will have USB 2 and 3 pins bolted on to maintain backwards compatibility.

    • Looks like all the connectors are going away and everything is standardizing on USB Type-C connector. Good. Everything piggybacks over USB 3.1 including power. This looks like the best thing.

      Yeah great. Now everything depends on whether that cheap Chinese USB cable you bought is close enough to standard, not just the speed of your external hard drive.

    • Isn't it great? Before, we had USB A, USB B, Mini A, Mini B, Micro A, Micro B, Micro B USB 3.0 edition, and Apple's various proprietary connectors over the years. Now we can add one more into the mix that will be deprecated in time, so you get to upgrade your stuff yet again in a few years. It's the same shit every time.
  • I realise there are few affordable 8K displays on the market, but shouldn't a standard assume forward compatibility if it already supports 120Hz at 4K?

    • by Anonymous Coward

      8k at 30 Hz necessarily follows, but not 60 Hz.

      Remember that an 8K display is twice as large in both dimensions as a 4K display, so it needs 4 times as much data per frame. To compensate, you have to quarter the frame rate.
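      In numbers (a trivial sketch, assuming required bandwidth scales linearly with pixels per second):

          pixels_4k = 3840 * 2160
          pixels_8k = 7680 * 4320
          print(pixels_8k / pixels_4k)           # 4.0: one 8K frame carries four 4K frames' worth of pixels
          print(120 / (pixels_8k / pixels_4k))   # 30.0: the 8K rate that fits in the same raw bandwidth as 4K at 120Hz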

  • by gweilo8888 ( 921799 ) on Wednesday March 02, 2016 @09:05PM (#51626027)
    Anybody want to drop a fortune on an 8K monitor only to have it ruined by a shoddy cable [pcworld.com]? Anyone? Bueller?
    • Re: (Score:2, Funny)

      by PPH ( 736903 )

      Type C means Chinese, right?

    • by aliquis ( 678370 )

      Anybody want to drop a fortune on an 8K monitor only to have it ruined by a shoddy cable? Anyone? Bueller?

      Use USB they said - it only requires one cable for everything they said:
      http://greenlightgo.org/educat... [greenlightgo.org]

    • by aliquis ( 678370 )

      My joke could have been fun if I had made it correctly: one article had a lot of cables, but I didn't know whether they were all standard USB ports or not, so I googled it and just took one, but that one didn't even have the Type C port, so... that was pretty stupid. 03:35 local time, so I blame that. Also there were the SuperSpeed cables as well.

      So you've got A, B in normal and mini and micro versions:
      https://upload.wikimedia.org/w... [wikimedia.org]
      And then you've got SuperSpeed A, B and micro-B:
      https://en.wikipedia.org/wiki/... [wikipedia.org]
      http [usb.org]

    • by Nemyst ( 1383049 )
      Monitors of that size wouldn't be using the USB cable for power under most circumstances, so the shoddy power connection wouldn't matter.
    • The laptop wasn't ruined by a USB cable. It was ruined by lack of I/O protection on a port. There's nothing stopping this happening on any monitor.

      • by jcdr ( 178250 )

        "Benson Leung
        +Stephen Warren Our initial failure analysis on my Pixel shows that not everything failed. The main battery charger has reverse-polarity protection and did not fail in this case. The Pericom PI3USB9281 that we use on Pixel for negotiating BC1.2 and Apple charging protocols is NOT tolerant to negative voltages and most definitely was destroyed. Furthermore, something killed the ST PD microcontroller that handles this system's power delivery logic. We are still investigating."

        Found in https://p [google.com]

  • "drive 60Hz 8K displays and 120Hz 4K displays..."

    Is this adequate? Wouldn't you want 240Hz so that you don't get eyestrain? I don't watch TV but I've heard that somewhere.

    • by Osgeld ( 1900440 )

      this is for computer monitors, and almost all of them are 60Hz because no one makes a good computer monitor anymore

      besides, with compression and even USB3 speeds, just because it can drive a 60 or 120Hz screen doesn't mean its video is actually refreshing at 60 or 120Hz; things similar in function have been known to drop frames and kind of smudge the leftovers together for a "tween" effect a la 1990s music videos

    • > Wouldn't you want 240Hz so that you don't get eyestrain? I don't watch TV but I've heard that somewhere.

      In my experience the magic number for a solid frame rate seems to be ~96 Hz .. 120 Hz, especially for CG.

      While it is easy to tell the difference between 30 Hz, 60 Hz, and 120 Hz, I haven't seen any studies comparing 144 Hz and 240 Hz.

      8K @ 60 Hz with HDR (10-bit to 12-bit) is definitely "good" enough for this camper.

  • by NormAtHome ( 99305 ) on Wednesday March 02, 2016 @09:27PM (#51626141)

    I'm guessing that most 32-inch 4K displays use an LG panel, but a lot of user reviews I see of the LG, Dell and Asus models complain of moderate to severe backlight bleed, not to mention other assorted issues, e.g. not being able to display at 60Hz via DisplayPort using the correct certified cables; some people say DisplayPort on these monitors is broken and that even with certified cables the computer and monitor don't see each other all the time. For what people are paying, there should not be these problems.

    • by Anonymous Coward

      For what people are paying for there should not be these problems.

      When people pay for a product, it should do what it says it does without causing problems. Period. If a product claims to do something and it doesn't work as it claims, it's a problem regardless of how much people pay for it. If a product only had to work if the cost is at least $500, then we would get swamped with $400 non-working hoaxes. The law regarding false advertisement and similar doesn't limit itself to only work on some price scales.

    • by nojayuk ( 567177 )

      I've been using a Dell 32" 4k display for about 15 months now, running it at 60Hz over DisplayPort from a budget AMD video card, the cheapest I could buy at the time that had DP output. I don't game on this monitor, other than Spider and the occasional sudoku puzzle.

      I used the DP cable supplied by Dell with the monitor (UP3214Q) and it's worked perfectly, although it took a little time to set it up and get it to run at 60Hz since it defaulted to 30Hz out of the box for compatibility reasons. No backlight ble

      • I'm glad someone is having a good experience with these, but it really seems to be hit and miss; you can read reviews of LG, Dell and Asus monitors with plenty of people complaining about the issues I mentioned. Some people seem to be fine, like yourself, but there are also plenty of people with issues. If I'm going to lay out $1,200, I want something that's guaranteed to be problem free.

    • by dbIII ( 701233 )

      some people say displayport on these monitors is broken and even with certified cables the computer and monitors don't see each other all the time

      A weird thing is I'm getting an occasional displayport detection problem (once every three weeks or so) with MS windows but not on linux so there may be some issues that still have to be sorted out. Annoying but if I reboot I can use the monitor in MS windows.

      • I should have said that from my limited experience and from what others have said at least some of the displayport problems are in software.
        Another is how some monitors work badly directly plugged into a machine but work perfectly if a Matrox two screen displayport splitter box is plugged into the same screen. It's looking a lot like problems in software, probably a step above the manufacturers drivers (because on linux the module supplied by nvidia does all of the work and communicates directly with X - n
    • Are you guessing or do you know?

      I have multiple 32" 4k monitors, the Acer version, and they all work perfectly...

      32" Acer 4k B326HK
      http://amzn.to/1pouMDM [amzn.to]

      What monitor are you using that you're having trouble with?

        • I'm not guessing: multiple bad reviews on various sites for all of the brands I mentioned, plus the model you have, say these monitors still need some work. I'm not laying out $900-$1,200 for a monitor that potentially has serious issues.

        On Amazon people have these complaints about the B326HK:

        -This panel has a forced overdrive setting that you cannot turn off. Whenever you move a window/anything around there will be a very noticeable almost flashing purpleish shadow following anything you move around (this dr

        • Two thoughts:

          1. You can return it at no charge within 30 days to Amazon if the specific unit you buy has any of those issues.

          2. People with problems complain, people without problems use the product.

          Like I said, I own 5 of those monitors, they all work perfectly as far as I can tell. My wife has one on her desk and I have two machines at the office that have 2 each on them and they are lovely to use.

          I can tell the uniformity and color accuracy is not as good as the Dell 30" professional monitors, but I was

    • by AmiMoJo ( 196126 )

      The two biggest panel manufacturers are LG and Sharp. However, the panel is separate from the controller board which is the part that has 60Hz issues and the like. The cheapest ones don't even have scalers.

      • I mentioned the panel because the most commonly talked-about problem with the LG, Dell and Asus 32-inch curved models seems to be serious backlight bleeding, which I figure is a problem with the panel. I'm sure (as you say) that some of the other issues, such as the 60Hz issue and the intermittent link problems, are controller related.

  • This is a very small step in the right direction.

    The bandwidth requirements of modern monitor cables are insane. We are up to nearly 40 Gbps between a PC and a 60Hz 4K monitor.

    The strange thing is: it has already been proven that 50 Mbps is more than enough (with compression). There are three orders of magnitude in between.

    IMHO the next monitor cable should be something like CAT6A and the GPU and the monitor should speak over an IP connection. This would greatly simplify a lot of things.

    • Not exactly what you are getting at, but the main technology that separates DisplayPort from previous display standards is that the video signal is in fact packetized (using what they call "micro-packets"). This is how they've been able to get up to 8K/60Hz now without changing the cables or the connector. It's not quite as high level as an IP connection, but it's more akin to PCI Express than just a raw stream of bits over a wire.
  • I don't think any DP1.3 devices are consumer-available yet, and they're already releasing the DP1.4 spec. Not that it's their fault, but I wish we could have faster movement on the hardware side.

    • I am with you; I would love to see more DisplayPort connectors out in the consumer field. The connector is better than DVI/HDMI and it's an open VESA standard. Unfortunately it seems the media companies and the consumer electronics manufacturers decided on HDMI due to content protection and wanting to control the signal path (never mind that DisplayPort has and does support HDCP), even though HDCP hasn't prevented anything and is really just a huge PITA for anyone working with it in a commercial AV setting
