
Multi-Display Gaming Artifacts Shown With AMD, 4K Affected Too

Vigile writes "Multi-display gaming has found a real niche in the world of high-end PC gaming, starting when AMD released Eyefinity in 2009 with three-panel configurations. AMD expanded to six-screen options in 2010, and NVIDIA followed shortly thereafter with a similar multi-screen solution called Surround. Over the last 12 months or so, GPU performance testing has gone through a sort of revolution as the move from software measurement to hardware capture measurement has taken hold. PC Perspective has tested AMD Eyefinity and NVIDIA Surround configurations at 5760x1080 with this new technology and found substantial anomalies in the AMD captures. The AMD cards exhibited dropped frames, interleaved frames (jumping back and forth between buffers), and even stepped, non-horizontal vertical sync tearing. The result is a much lower observed frame rate than software like FRAPS would indicate, and these problems will also appear with the current top-end, dual-head 4K PC displays, since they emulate Eyefinity and Surround for setup."
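
For readers unfamiliar with capture-based testing: instead of counting frames where the game hands them off (as FRAPS does), a capture card records the actual display output, and each frame's visible scanlines are tallied so that dropped frames and slivers ("runts") can be discarded from the count. A minimal sketch of that accounting, in Python with made-up numbers (the real Frame Rating pipeline identifies frames by a colored overlay on the captured video):

    # Sketch: deriving an observed frame rate from hardware-capture data.
    # Assumes upstream tooling has already reduced the captured video to
    # (frame_id, scanlines_visible) pairs -- the values below are hypothetical.
    captured = [(1, 540), (2, 540), (3, 1080), (5, 12), (6, 1068), (7, 1080)]

    FRAMES_RENDERED = 7          # what a FRAPS-style counter reported
    WINDOW_SECONDS = 7 / 60.0    # span of this sample at a 60 Hz capture
    RUNT_THRESHOLD = 21          # frames under ~20 scanlines add no real motion

    seen = {fid for fid, _ in captured}
    dropped = FRAMES_RENDERED - len(seen)    # rendered but never hit the display
    runts = sum(1 for _, lines in captured if lines < RUNT_THRESHOLD)
    observed = len(seen) - runts

    print(f"FRAPS-style rate: {FRAMES_RENDERED / WINDOW_SECONDS:.0f} FPS")
    print(f"Observed rate:    {observed / WINDOW_SECONDS:.0f} FPS "
          f"({dropped} dropped, {runts} runt)")
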
  • by Anonymous Brave Guy ( 457657 ) on Tuesday September 17, 2013 @11:19PM (#44880809)

    AMD also seem to have some serious problems, which appear to be worsening with each new driver, on their premium workstation cards when driving multiple displays. We've seen numerous video playback issues, including glitches away from the video area itself, on multi-display configurations. The most likely culprit at the moment seems to be changes in the GPU memory timing. I really hope they fix this soon, because our "professional" workstations are giving our professionals headaches right now.

    • by gagol ( 583737 )
      It is sad how video cards have become gaming toys, with the "pro" version being the same hardware with some features left uncrippled... I remember when we had those Matrox cards to go with our video editing workstations. Those things were stable as hell. Too bad they did not do well in the 3D realm.
      • by Taco Cowboy ( 5327 ) on Wednesday September 18, 2013 @12:01AM (#44880969) Journal

        I remember when we had those Matrox cards to go with our video editing workstations. Those things were stable as hell

        Back then there were more vendors competing fiercely in the market, and all of them were on their toes as they knew even one slip could turn out to be totally fatal.

        Nowadays, other than AMD and Nvidia, what other serious players do we have?

        None.

        With the market turned into a duopoly, neither player has much urge to bring new and innovative features to their products.

        How many times have we heard horror stories brought on by their crappy drivers?

        Other than lamenting online, users (whether casual gamers or professionals) have no option but to wait for a newer version of the drivers, or roll back to one that worked.

        ps. I still have several of those Matrox cards with dual video outputs.

        • by aXis100 ( 690904 )

          Matrox are still making some serious professional 2D video cards; my favourite at the moment is a low-profile quad-head card we use with our operator workstations. They are no good for 3D graphics, but in many situations that's perfectly fine.

        • by Bigbutt ( 65939 )

          Plus, bringing up crappy video drivers invites all sorts of fanboi responses.

          My dual AMDs were pretty much crap, blue screening on startup pretty much from the start, and even having the company check them found no issues with the hardware. One update bricked the system and required a full reinstall of Windows XP.

          I finally replaced them with dual nVidias which also had crappy driver issues from the get go. I stumbled on a forum comment suggesting I use the 306 drivers and the system has been stable ever since (I

        • by Kjella ( 173770 )

          With the market turned into a duopoly, neither player has much urge to bring new and innovative features to their products.

          If AMD doesn't get any more urges soon, it might end up being a monopoly. Here's Anandtech's take [anandtech.com] on the server market right now:

          At the end of last year, AMD was capable of mounting an attack on the midrange Xeons by introducing Opterons based on the "Piledriver" core. That core improved both performance and power consumption, and Opteron servers were tangibly cheaper. However, at the moment, AMD's Opteron is forced to leave the midrange market and is relegated to the budget market. Price cuts will once again be necessary. Considering AMD's "transformed" technology strategy, we cannot help but be pessimistic about AMD's role in the midrange and high-end x86 server market. AMD's next step is nothing more than a somewhat tweaked "Opteron 6300". Besides the micro server market, only the Berlin CPU (4x Steamroller, integrated GPU) might be able to turn some heads in HPC and give Intel some competition in that space. Time will tell.

          I think we all know the FX-8350 is no match for Intel's high end in the desktop market either, and they're struggling with power efficiency in the laptop market. AMD is exiting all the markets where they're exclusively competing with Intel and entering all the markets where they're competing with Intel and half a dozen ARM competitors. As the saying goes, out of the frying pan and into the fire.

        • Other than lamenting online, users (whether casual gamers or professionals) have no option but to wait for a newer version of the drivers, or roll back to one that worked.

          No, I think we have at least one other option: next time we're specifying new workstations, we can just use (relatively) cheap gaming cards, instead of paying a factor-of-several premium for workstation cards. The latter are often the same basic hardware, but cost more because their "certified" drivers supposedly have better performance and guaranteed compatibility with major content creation applications. Why pay the premium if the reality is that the premium drivers are no better (or, in this case, much worse)?

          • I wholeheartedly agree with the anonymous brave guy. As a consumer, I don't need server-class or high-end workstation-class equipment for my personal machines, and I can typically buy hardware that is just as perceptibly stable to me. This is different if you're specifying out requirements for an aviation console or emergency services ground vehicle console or some other high-criticality use case, but there you also want a metaphorical chain of support to a vendor that you can yank on if it hits the fan in those cases.
      • Re: (Score:2, Insightful)

        by Anonymous Coward
        I hate to break this to you, but video cards have always been gaming toys. From the days of hi-res monochrome modes, to CGA, EGA, VGA and then ever faster and faster cards, the driving force was always games. I've always kept on the cutting edge with video cards, from Hercules and ATI in the early days, to Tseng, Matrox and 3Dfx in the 90s, to Nvidia from 2000 to the present. Know why? Games.
        • by Dunbal ( 464142 ) *
          To be honest the Tseng also worked wonders for Windows 3.1, not just games :) Ahh the bad old days... God I've spent so much money on computers and computer stuff. Sigh.
      • These days, you can't get more features out of your gamer card by tricking the card or driver software into believing it is a "Pro" card any more. If you buy a Pro card now, it's usually based on the previous generation of chipset, with a well stabilized and thoroughly tested driver, compared to the very short time to market that top range gamer cards get. The big problem is that newer chipsets often are run on the same driver and iterations between the chipsets are often nothing more than a die shrink size
      • by AmiMoJo ( 196126 ) *

        Actually there is more to it than just artificially crippling them. The pro cards go through more extensive testing to make sure that their output is pixel-perfect. It is debatable how much difference a very slight rendering error or a discoloured pixel makes when working in a CAD package, especially when the screen is updated rapidly anyway.

        The pro cards are also calibrated and guaranteed to produce accurate colours, whereas the consumer-grade ones are not. Of course these days that isn't much of

        • Actually there is more to it than just artificially crippling them. The pro cards go through more extensive testing to make sure that their output is pixel-perfect.

          That's the sales pitch. I'm still waiting for any practical evidence that a meaningful amount of extra testing actually happens, or produces measurably better results if it does.

          Historically, a lot of the practical difference between workstation and gaming cards has been in their floating point precision and performance, and that is definitely an area where major product lines have been artificially nerfed. Sometimes this has been embarrassingly obvious, for example when a new, high-spec gaming card that sh

    • Re: (Score:2, Informative)

      Take a gander through the links at https://www.google.com/search?q=\device\video5+Nvlddmkm [google.com]

      Nvidia's 32x.xx drivers have actually been destroying hardware
      • by Anonymous Coward
        yeah don't worry about AMD problems, look over there at nVidia problems instead! seriously can we not have a discussion without some idiot trying to turn it into a flamewar? if you want to discuss nVidia start a different thread instead of hijacking this one.
        • by Dunbal ( 464142 ) *
          See if you were running linux instead of windows then you wouldn't have these.... lol. Welcome to slashdot :P
        • by Mashiki ( 184564 ) <mashiki@gmail.cBALDWINom minus author> on Wednesday September 18, 2013 @06:37AM (#44882187) Homepage

          As a useful point, this has been an ongoing issue with Nvidia drivers since about the 290 series, and in the last three releases on 400, 500 and some 600 series cards the drivers were so bad that they caused hardlocks across the board. Either the drivers have been crap, or they've been causing hardware lockups, or there are the various unconfirmed reports of them nuking hardware. In fact, it got so bad about 6 months ago that Nvidia was looking for people in the continental US to send their entire rigs in to their hardware labs for testing. So, people dismissing this as a "flamewar" or some other asinine thing need to realize that there are driver issues on both sides. Sometimes, however, the issues are more serious than reported for one side or the other. And between the two, Nvidia has the more serious driver issue, and that's coming from someone whose last 6 cards have all been Nvidia cards made by EVGA, three of which had to be RMA'd because of a sudden hardware failure after a driver update.

          Thinking on this a bit more, it reminds me of how Nvidia was at one point blaming the driver reset issue only on "bad configurations" and "PSU power issues", until it was found that undervolting or (mainly) overvolting the cards solved the problem. This was especially true on the 500 series cards, and came of course after they had adjusted the voltage supplied to the cards downward in order to make them run cooler.

      • How does that help us with the problems with AMD cards?
        • by Anonymous Coward
          I believe the point was they both are sucking
        • AMD has long had driver performance issues compared to nVidia. Their hardware started really kicking some ass with the 4000 series and was just dominant with the 5000 series, but the software side has had some issues. I'm not sure what the issue is; maybe they need more people, maybe they need better people, maybe they need a better process. Whatever the case, they end up having more issues. Stuttering and rendering partial frames has been one (that they have largely cleared up with single display setups),

          • See, people say AMD has the driver issues, but the only issues I've ever had with drivers in the past 7 years or so was with an Nvidia card (and, actually, that was more a problem with the game). OTOH, the only video card hardware failure I've had was also Nvidia. Really, I think it's a case of YMMV. Some people have no problems with AMD, some have tons. Some have no problems with Nvidia, some have lots. I personally buy AMD stuff in part to help keep competition alive (and because their stuff is usually pr

            • I have tried a small handful of ATI/AMD cards over the years and have never had good results. Performance in Windows was always mediocre, and I had massive stability problems or features not working on Linux. I will admit my last attempt was several years ago (less than 4, though). I'm not talking about embedded or mobile chipsets either, but dedicated AGP or PCIe.

              (before you go all AMD fanboy on me (I'm hoping you wouldn't, but just in case), I should mention I've been using AMD processors since I abandoned my old Pentium)

            • OTOH, the only video card hardware failure I've had was also Nvidia.

              With video card failures, be they AMD/ATi or nVidia, you have to realize that the only part those companies make is the actual GPU. They don't make the actual card, so the cooling system, caps, RAM and all the other components are made by others and assembled by an OEM; there are many points of failure that are completely unrelated to AMD/ATi and nVidia.

    • by cheater512 ( 783349 ) <nick@nickstallman.net> on Tuesday September 17, 2013 @11:38PM (#44880891) Homepage

      I've got 5 monitors connected to 2 ATI cards (Linux + Xinerama).

      The most interesting artefact I've seen is that some apps can corrupt the cursor, so the pointer is a little bit of random memory contents.
      But only on some monitors. Move it to another monitor and it may come back; move it to the original monitor and it dies again.

      There must be some really fun bugs in their drivers that rear their heads with massive setups.

      • by gagol ( 583737 )
        Have you tried the radeon driver?
        • I'm running 5760x1200 across three monitors on an ATI Flex card using the radeon driver. No problems here. But then again, I don't game, I don't run multiple GPUs in a CrossFire setup, and I don't get near the ATI binary drivers, so it's all good.

          • I'm running 5760x1200 across three monitors on an ATI Flex card using the radeon driver. No problems here. But then again, I don't game, I don't run multiple GPUs in a CrossFire setup, and I don't get near the ATI binary drivers, so it's all good.

            3 monitors probably works a treat; have you tried with an even number, though?

            When I tried running this a few years back, it annoyed the crap out of me that alert boxes would always end up centered over all the displays, so bang on the boundary of two monitors. What I wanted was two separate displays I could drag windows between, but with everything defaulting to appear on the primary monitor like it did under Windows (the arithmetic for that is sketched below).

            Not bothered experimenting with multiple monitors since as it was such an arse last time. Have th
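
            The behaviour being asked for is simple arithmetic once per-monitor geometry is available instead of one big virtual screen. A sketch in Python, with hypothetical monitor tuples rather than any real toolkit's API:

              # Monitors as (x, y, width, height) offsets in virtual-screen pixels.
              monitors = [(0, 0, 1920, 1200), (1920, 0, 1920, 1200)]  # hypothetical pair
              PRIMARY = 0

              def center_on_virtual(w, h):
                  """What naive toolkits do: center over the combined desktop,
                  which drops the dialog onto the bezel between two monitors."""
                  total_w = max(mx + mw for mx, my, mw, mh in monitors)
                  total_h = max(my + mh for mx, my, mw, mh in monitors)
                  return ((total_w - w) // 2, (total_h - h) // 2)

              def center_on_primary(w, h):
                  """The wanted behaviour: center on the primary monitor only."""
                  mx, my, mw, mh = monitors[PRIMARY]
                  return (mx + (mw - w) // 2, my + (mh - h) // 2)

              print(center_on_virtual(400, 300))   # (1720, 450) -- straddles the bezel
              print(center_on_primary(400, 300))   # (760, 450)  -- all on monitor 0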

        • No I haven't. I was going for an obscure setup (3 monitors on one card, 2 on the other) and I wanted it running quickly (ooh shiny) so I just went with the binary driver.
          Probably should give the radeon driver a whirl when I get some time.

        • I can confirm this happens even on dual-monitor setups with the default driver. It is extremely common when playing a full-screen game on one monitor and leaving the other up for your background stuff, even with the cursor stuck to the gaming monitor. This happens to me when playing Dota 2.

          It is common to the point where there are threads about it scattered around the internet.

      • by mkairys ( 1546771 ) on Wednesday September 18, 2013 @12:10AM (#44881005) Homepage

        I've got 5 monitors connected to 2 ATI cards (Linux + Xinerama).

        The most interesting artefact I've seen is that some apps can corrupt the cursor, so the pointer is a little bit of random memory contents. But only on some monitors. Move it to another monitor and it may come back; move it to the original monitor and it dies again.

        There must be some really fun bugs in their drivers that rear their heads with massive setups.

        I actually get this exact same problem on my Windows 7 desktop (3 monitors). The primary display cursor will sometimes have fragments of the cursor graphics or loading animation displayed, but moving the cursor across each screen quickly and back again can sometimes resolve it. Interesting that it's a problem on both platforms.

        • by Stalks ( 802193 ) *

          I too get this issue.

          It is repeatable by moving the cursor along the bottom edge of a monitor boundary and bringing it up at the other side a few times.

          This is also the fastest way of returning the cursor back to normal.

          I have had this issue for nearly 3 years with countless driver updates, no fix in sight.

        • Maybe it's an issue with an odd number of displays? Can you guys try reducing/increasing the number of displays to four and see if it has similar issues?

          We have numerous workstations using AMD video cards and two displays with no issues.
      • That cursor corruption bug is actually very, very old and seems to have resurfaced as of 13.4 or so. They were _supposed_ to fix it with the last patch (it's in the patch notes), but I still get it...
      • by antdude ( 79039 )

        I have the same problem, but on my very old Windows XP Pro SP3 with a dual screen setup (19.5" CRT TV + 19" LCD monitor) and an ATI Radeon 4870 video card (PCIe; 512 MB of VRAM). ATI/AMD's software is buggy. I had to downgrade back to the old ATI Catalyst driver v9.4, since newer drivers cause Windows XP's clock to slow down with DVI, and rare, random hard lockups with videos. :(

      • You don't need multi-monitor to corrupt the mouse pointer on ATI cards; a problem very similar to what you describe sometimes happens when you play certain games in full-screen windowed mode. It fixes itself after a restart, or when you open a new app that replaces the mouse cursor, like the Windows 7 Magnifier.

      • I had an artifact like that pop up on my desktop, except that was under Windows. More peculiarly, it corrupted the pointer slightly differently - columns were out of order - and it did so even after changing the pointer. And just like yours, it was only on one monitor, even though both my displays are being driven by one card. I eventually fixed it by disabling then re-enabling the affected monitor in the Catalyst control panel. I'm sure a reboot would have worked, but who wants to do that?

  • AMD Experience (Score:2, Informative)

    by Metabolife ( 961249 )
    I started off with AMD cards maybe 6 years ago before I tried an NVIDIA card. It really is just a smoother experience overall. I don't know what it is, but I've been shying away from building any new systems with AMD lately.
    • by gagol ( 583737 )
      I had an ATI Rage II with my P2 166MHz. When using hardware acceleration for MPEG(1) videos, it played with inverted hue; even the video bundled with the drivers did not play correctly.
      • by gagol ( 583737 )
        It was a 233MHz, sorry; it was in 1998, and 15 years is a long time! The 166MHz was my previous Pentium...
    • Comment removed based on user account deletion
    • by afidel ( 530433 )

      I've had both over the years, and both have bugs and issues, though the ATI driver folks are certainly more consistently stupid, and they're the only ones to leave me so ragefaced that I decided to buy a new card instead of dealing with a bug (the stupid card re-queried the EDID table from the monitor/TV at every boot and overrode the existing settings, so even if you forced it to use 1080p, if the monitor reported 640x480 it would reset to that every single boot), something not even the fine folks at 3dfx had managed.

    • I've run ATi cards off and on since the 8500. Never had issues.
    • I started off having problems with ATI drivers twenty years ago with Windows 3.1 and the Mach32. Even Radius could make more stable accelerated video drivers. Hell, so could S3.

      Today, people are still having serious problems with their ATI video drivers; now they're just called AMD.

      ATI can't code their way out of a nutsack.

    • My favorite AMD card story is my Radeon 6750. In an AMD Phenom II system with an AMD chipset motherboard, the thing was nothing but trouble. The video drivers would blue screen the computer every few days, plus tearing problems playing videos and other random glitches. Well, the time came to replace the motherboard and CPU, and I went with Intel. I wanted to replace the graphics card too, but lacking the money at the time I held my nose and put the AMD card in the new PC... and haven't had a problem with it since.

  • getting worse? (Score:2, Interesting)

    by gerardrj ( 207690 )

    I was playing flight sims on my Quadra 900 in the early 90s with 4 displays. The resolutions and detail may be higher today, but I never had any issues or failures of the system. F/A-18 Hornet was my favorite.

    • Wow, another F/A-18 Hornet 3.0 player! I still play it occasionally today - I haven't found anything like it. Know of anything comparable that is newer (other than the updated version of Hornet, which has a jittery cockpit view)?

      </offtopic>

    • It's not that it's getting worse; it's that it's getting more complex.
  • That pretty much hints at a driver issue, or bad GPU sync.

    • They were able to fix the sync on single display setups, but for multi-display they couldn't; my guess is it would cost too much FPS. Nvidia does the same job in hardware on their video cards, which they have had since around the GeForce 8000 series. It's something AMD will have to do in the future, but how long it will take them to get it working is another question. Seems like I heard it took Nvidia about 2 years of work to do it on their end.
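
      To make the software-versus-hardware distinction concrete: frame pacing just delays presents so frames reach the display at an even cadence, trading a little latency (and peak FPS) for smoothness, which fits the guess about FPS loss above. A toy sketch in Python; real drivers do this below the graphics API, and NVIDIA's frame metering does it in silicon:

        import time

        TARGET_INTERVAL = 1.0 / 60      # aim frames at an even 60 Hz cadence

        def present_paced(render, last_present):
            """Render a frame, then hold it back so presents stay evenly spaced."""
            render()
            early = TARGET_INTERVAL - (time.monotonic() - last_present)
            if early > 0:
                time.sleep(early)       # the pacing delay: smoothness for latency
            return time.monotonic()     # timestamp of this present

        last = time.monotonic()
        for _ in range(3):
            last = present_paced(lambda: time.sleep(0.005), last)  # fake 5 ms render
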
      • by Khyber ( 864651 )

        2 years of work? I bet you ten to one it was two years of negotiating with Matrox for the patent licensing.

  • FUD, Nothing but FUD (Score:1, Interesting)

    by Anonymous Coward
    Hmm...AMD is about to drop Hawaii and retake the single GPU crown and GUESS WHAT! Another round of FUD from the company that cancelled their next premium GPU in the name of funding also-ran ARM devices and niche handheld gaming devices. Fuck off, Jens. You played your hand; now reap what you sow.
    • Except it's true and measurable.

      Even I, on my shitty 4870 with two monitors, have problems under Windows 8. Everything is fine with one monitor active, but turn on dual monitors and all of a sudden I get flickering artifacts in 3D games on the main monitor.

  • by Anonymous Coward

    Note that the words "driver" and "version" don't occur on the page. There is a known issue that AMD has been working on that sounds a lot like this one. It's been known for months, and they've got a "two phase" plan to attack it, the first phase of which is implemented in the current beta driver set.

    The timing of this article is very suspect. They're either reporting on a new problem (and totally failing to provide any relevant data on their configuration), or they're simply regurgitating an already known issue, like doin

    • by Vigile ( 99919 ) *

      Maybe if you read the story, you'll find that "driver" and "version" are mentioned for both the AMD and NVIDIA setups.

      This is not a "bug" but a substantial issue with advertised features.

  • by Beardydog ( 716221 ) on Wednesday September 18, 2013 @01:11AM (#44881223)
    The biggest problem with multi-monitor gaming is that it's just plain garbage in any kind of "surround" configuration. Apart from Fisheye-Quake and some fancy-pants flight sims and racing games, arcing three or more monitors does nothing but waste power and processing capability to render a smeared-out mess on every monitor but the one in the center. Most games aren't even mathematically capable of producing a 180-degree FOV. I've never been quite sure who should get the ball rolling in that department, but I've just decided it should be Valve. I don't have a good reason. Get on it, guys! Ubiquitous support for rendering games to multiple viewports.
    • by Sabriel ( 134364 )

      What are the odds that VR gear like the Oculus Rift will keep multi-monitor gaming from becoming more than a niche market?

      (and with VR, you can render additional informational displays _within_ the game)

      • I heard you like monitors, so I've rendered some monitors on your monitors so you can monitor while you monitor.
      • Multi-monitor gaming is *already* a niche market. Most gamers have one display, or perhaps one gaming monitor and one or two monitors that aren't used for the game (for media players, chat or, if playing Eve, a few Excel docs).

        The problem is that there are so many things you need:
        3-6 identical monitors, or monitors that are very closely matched in one dimension and in pixel density
        Monitors need to have small or nonexistent borders
        A mount capable of holding them all in exactly the right spots
        A video card (or c

        • Yes. My guess is that most people with multiple monitors have multiple different monitors. That's at least true in my case, where I ran a 17" 1280x1024 LCD at home for many years on my primary system before finally picking up a 22" 1920x1080, and now run both.

          Multiple different monitors doesn't make for a good multi-display gaming setup.
    • Multi-monitor setups typically assume that all of your monitors are arranged in a single plane, with the scene rendered as if your nose were some distance away from the exact center of your primary display. Instead, each screen should be rendered from a separate camera, with a POV that is off-center.
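
      A minimal sketch of that per-screen-camera idea (Python with numpy; the panel angle and FOV are illustrative, not taken from any engine): yaw a separate camera to each panel's physical angle and render each viewport with an ordinary narrow frustum, instead of smearing one ultra-wide frustum across all three.

        import numpy as np

        def yaw(deg):
            """Rotation about the vertical axis."""
            r = np.radians(deg)
            c, s = np.cos(r), np.sin(r)
            return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

        PANEL_FOV = 60.0                            # per-panel horizontal FOV
        panel_angles = [-PANEL_FOV, 0.0, PANEL_FOV] # left, center, right panels

        forward = np.array([0.0, 0.0, -1.0])        # player's look direction
        for a in panel_angles:
            # Each viewport gets its own camera: same position, rotated forward
            # vector, ordinary 60-degree projection -- no edge stretch.
            print(f"panel {a:+5.0f} deg -> camera forward {np.round(yaw(a) @ forward, 2)}")
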
    • by Luckyo ( 1726890 )

      That depends. If you can adjust the field of view in game, or the game automatically adjusts it for you, it's a tremendous advantage, as it does in fact give you a wider field of view.

      If not, then it is indeed useless.

      As a point of comparison: it's considered cheating in most first- and third-person shooter multiplayer games to increase your FoV beyond a certain limit. This is so because it gives you vastly superior awareness of your surroundings, making it much harder to surprise you with flanking. Multi-monitor set

      • As a point of comparison: it's considered cheating in most first- and third-person shooter multiplayer games to increase your FoV beyond a certain limit.

        An attitude which I never understood. Games designed to enforce a 90-degree FOV fail to take into account that our peripheral vision encompasses about 150-160 degrees for most people.

        This is so because it gives you vastly superior awareness of your surroundings, making it much harder to surprise you with flanking.

        Well, that's sort of the point o

        • by Luckyo ( 1726890 )

          Judging by your response, you do not understand the issue at all. Our peripheral vision and our field of view are in fact irrelevant in the discussion of game balance/fairness.

          The point is that it's possible to project a much wider field of view onto the screen, up to a full 360 degrees, giving you complete awareness of your surroundings. It would be uncomfortable initially, but after you train your eyes and brain to accept it, you would become vastly superior in any game where

          • by sgtrock ( 191182 )

            Oh, I understand the issue perfectly. I've been playing FPSes for about 20 years. Yes, at times I've played on organized ladders (sometimes with a great deal of success). :-) I contend that FOV is in fact the heart of the issue for game balance/fairness, or we wouldn't be having this debate.

            Older games that allowed complete freedom in defining the FOV were generally limited in the ladder play that I participated in. However, the limits were always larger than 90 degrees.

            My bone of contention isn

            • by Luckyo ( 1726890 )

              I know one person who played with a 120-degree horizontal FoV per monitor on a three-monitor setup. He basically had a 360-degree panoramic view compressed into approximately 160-170 degrees around himself.
              It was almost impossible to surprise him in games where he would hack the FoV to be like that. He would see someone approach in his peripheral vision even if you came from behind. It was utterly silly, and for him it was playable enough to be worth it. I could never get over the whole fishbowl look, but it worked for him.

              • by sgtrock ( 191182 )

                I don't play FPSes on consoles so I can't speak to what makes sense for FOV there. I've never been willing to give up the fine degree of control and responsiveness that you get from the keyboard+mouse combination.

                I agree with you that three-monitor setups with 120 degrees per monitor do cross the line. That's a bit much to be able to accept. :-)

          • I played Quake Team Fortress (the original, for Quake 1) quite competitively. There was no zoom key for sniping and the like; you just had to play with FOV. You made some binds to toggle FOV levels as you saw fit. This led me to try bigger FOV numbers, and that worked too. So I had 4 FOV buttons: 10, 30, 90, and 160. 90 was where I played most of the time; 30 and 10 were for sniping, which I did rarely. 160 was for flag defense, which is often what I was assigned to. I could watch an entire flag room fro

            • by Luckyo ( 1726890 )

              You'd have to get used to the disorienting "fake zoom" effect and the fishbowl effect from widening the FoV to see the room. But once you do get used to it, it's going to be a great way to play.

      • I'm not just looking for an advantage (and importantly, I'm not talking about competitive games; I know how important FOV equality is at that level), I'm just looking for immersion. Skyrim with an ultra-wide FOV lets me see approaching enemies a little sooner, but it looks absolutely atrocious. Beyond that, human vision can cover a 270-degree field if you allow eye movement (but not head movement). That's 90 more degrees than any shooter will give you.
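
        The numbers in this subthread all fall out of one formula: a planar projection keeps the vertical FOV fixed and widens the horizontal FOV with the aspect ratio, and it can approach but never reach 180 degrees because the tangent blows up. A quick check in Python (the 16:9 baseline FOV here is an assumption):

          import math

          def hfov(vfov_deg, aspect):
              """Horizontal FOV implied by a vertical FOV for a planar projection."""
              v = math.radians(vfov_deg)
              return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

          VFOV = 60  # roughly 92 degrees horizontal on a 16:9 screen
          print(f"16:9 single screen: {hfov(VFOV, 16 / 9):5.1f} deg horizontal")
          print(f"48:9 triple-wide:   {hfov(VFOV, 48 / 9):5.1f} deg horizontal")
          # No finite aspect ratio reaches 180 degrees, and 270 is impossible on
          # a single plane -- hence fisheye projections or multiple viewports.
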
    • by Nemyst ( 1383049 )
      Actually, the logical candidate is id. Carmack is now working at Oculus, but you can bet he's still influencing id's graphics development, and the Oculus Rift needs games to work with very wide FOVs in order to look good.
      • by Zencyde ( 850968 )
        I'm borrowing an Oculus Rift and, honestly, it's not worth toying with on anything that lacks native support. I agree with what you say entirely.
  • I'm running what's called a "5x1P" array (that's 5 portrait monitors arranged horizontally) and have been using Eyefinity since it was released with the Radeon HD 5870. Multiscreen is kind of the thing over at the Widescreen Gaming Forum. I gotta say, I've been experiencing little issues here and there for a while. Problems with tearing seem to originate more from using alternating port types. When grouped in, for instance, all DisplayPort, there is no tearing. I imagine a lot of the issue arises from having
