Dell Brings 4K InfinityEdge Display To XPS 15 Line, GeForce GPU, Under 4 Pounds

MojoKid writes: There's no doubt that Dell's new XPS 13 notebook, when it debuted earlier this year, was very well received. Dell managed to cram a 13.3-inch 3200x1800 QHD+ display into a 12-inch carbon fiber composite frame. Dell has now brought that same InfinityEdge display technology to its larger XPS 15, which the company boasts has the same footprint as a 14-inch notebook. But Dell didn't just stay the course with the QHD+ resolution from the smaller XPS 13; the company instead is offering an optional UltraSharp 4K Ultra HD panel with 8 million pixels and 282 pixels per inch (PPI). The 350-nit display allows for 170-degree viewing angles and covers a minimum of 100 percent of the Adobe RGB color gamut. Dell also beefed up the XPS 15's internals, giving it sixth-generation Intel Core processors (Skylake), support for up to 16GB of memory, and storage options that top out with a 1TB SSD. Graphics duties are handled by either integrated Intel HD Graphics 530 or a powerful GeForce GTX 960M paired with 2GB of GDDR5 memory. And all of this squeaks in at under 4 pounds.
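For the curious, the summary's display figures are easy to check; a quick sketch (assuming the commonly quoted 15.6-inch panel diagonal, which the summary itself does not state):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K UHD panel on an (assumed) 15.6-inch diagonal
pixels = 3840 * 2160
density = ppi(3840, 2160, 15.6)

print(f"{pixels / 1e6:.1f} million pixels")  # ~8.3 million
print(f"{density:.0f} PPI")                  # ~282 PPI
```

Both numbers line up with the quoted "8 million pixels" and 282 PPI.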
  • by Anonymous Coward on Thursday October 08, 2015 @07:00PM (#50690001)

    This is an incredible achievement. That's like 6 USD! Where do I sign up?

  • by rmdingler ( 1955220 ) on Thursday October 08, 2015 @07:00PM (#50690003) Journal
    Isn't this story supposed to be a different color on the front page?
  • 4KXPS15GPU4 (Score:4, Funny)

    by turkeydance ( 1266624 ) on Thursday October 08, 2015 @07:01PM (#50690009)
    thanks for my new password.
  • Input devices (Score:5, Insightful)

    by dfsmith ( 960400 ) on Thursday October 08, 2015 @07:05PM (#50690027) Homepage Journal
    All it needs is a TrackPoint instead of a touch pad/screen and I'll have found my next laptop. A matte screen would be nice to have. Lenovo machines are not going in the right direction....
    • by imidan ( 559239 )

      I like touchpads better than trackpoint, personally, but I've used both. For me, a glossy screen is a deal breaker. Matte finish or nothing!

    • So, the 13" does have an AG (= anti-glare) version, but unfortunately this cannot be combined with an i7, 16GB RAM, or a 1TB SSD. If you want those, they force you to take the high-res glossy screen. Who thinks of these things??

      I haven't seen the AG version yet, so cannot comment on how it compares with a real matte screen.

    • It'd be beautiful if you could get aftermarket keyboards with a trackpoint added. Probably not possible for most models (I would have expected the falling cost of silicon to make it common to embed the controller in the keyboard FRU and use a lower pin-count USB/serial/I2C/whatever connection to the motherboard, rather than leaving the keyboard passive and running all the lines from the switch matrix; but most laptop keyboard connectors continue to be matrix-type with the actual keyboard c
  • by Anonymous Coward

    Love the design of these, but while they have lots of pixels, they're not great displays. These are decent TN panels, but they're not in the league of IPS displays in terms of uniformity (let alone color accuracy). Uniformity is super nice in a laptop display.

    And honestly, at 15", 1440p would be plenty sharp. This just seems like stats for the sake of stats. A 1440p display would likely be kinder to the battery anyway.

    • These are decent TN panels but they're not in the league of IPS displays in terms of uniformity

      Where does it say that this is a TN display? I checked the article and also the Dell website but couldn't find any mention of a non-IPS display. The XPS has always been Dell's premium line and has used IPS displays.

      • If it was an IPS (or PLS or some other equivalently premium panel type), I would expect them to say so in the specs and/or product overview. It seems unlikely that they'd fail to brag if their tiny 4k panel was IPS.

    • Well, according to TFA, if you opt for the HD panel (presumably 1080p), you get 17 hours of battery life. And that definitely wouldn't suck.

    • Windows DPI scaling still sucks at non-integer multiples. 3840x2160 means you can run 1920x1080 @ 2x, while 3200x1800 and 2560x1440 mean you have to have things unreasonably small at 1x, annoyingly large at 2x, or blurry somewhere in between.

      • In fairness to Windows, non-integer-multiple resizing simply isn't possible to do well unless all your graphics are vector (and even then, the designer's care and attention can have a strong influence on whether the result actually looks good to people at different scales; but at least there is a mathematically 'correct' answer).

        If you have bitmap elements, integer-multiple resizing is both relatively trivial and possible to do 'correctly'. Non-integer multiple, like lossy compression, can be done in surp
      • 1920x1080 at 15" is arguably small, though. I find a 15.6" 1080p laptop to be painful. The same size at 4K with scaling is likely easier to read, but there's that trend of makers pushing the highest res number they can regardless of the end-user experience. Or maybe some people only use emacs and xterms, know to configure a web browser for a default zoom level, or use Metro apps.
          3200x1800 would give an equivalent 1600x900, which in my opinion feels right (Apple uses an equivalent 1440x900).
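The integer-scaling point being made in this sub-thread can be sketched numerically; the helper below is hypothetical, just arithmetic over the panel resolutions mentioned:

```python
def effective_resolution(width, height, scale):
    """Logical desktop size after DPI scaling (what UI elements are laid out in)."""
    return round(width / scale), round(height / scale)

panels = {"4K UHD": (3840, 2160), "QHD+": (3200, 1800), "QHD": (2560, 1440)}
for name, (w, h) in panels.items():
    for scale in (1.0, 1.5, 2.0):
        print(name, f"@ {scale}x ->", effective_resolution(w, h, scale))

# 3840x2160 @ 2x gives a clean 1920x1080 (every logical pixel maps to a 2x2
# block of physical pixels); 3200x1800 @ 2x gives 1600x900. Anything between
# 1x and 2x forces non-integer bitmap resampling, hence the blur.
```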

  • by Chas ( 5144 ) on Thursday October 08, 2015 @07:44PM (#50690221) Homepage Journal

    Maybe not with Intel graphics. But, if Dell's previous problems with mating NV graphics are anything to go on, this machine, while looking pretty and sporting phenomenal stats, will probably also have massive thermal issues resulting in instant system shutdowns.

    As sexy as this sucker is, I'd prefer not to be the guinea pig.

    Still, 10 hours of battery life? SEXAH! Oh no! A display with a ridiculous resolution doesn't give me 17 hours of battery life! DARN!

    • At least nVidia has significantly improved their heat/power troubles with Maxwell, and the 960m isn't that powerful.

      I've said it before, but thin and powerful notebooks like this and the MBPs make me wish for reasonably priced Thunderbolt GPUs. This model even has the shiny 40Gbps TB3 port for one.

      Actually, what I really want is for Microsoft to stick a TB port on their next Xbox and let me use it as a GPU.

      • Thunderbolt GPUs are limited to only x4 PCIe 2.0 or 3.0

        • Yes, and that's not a big deal for most games and applications.

          • x4 2.0 is where the performance drop becomes noticeable, at around 15%.

            Also, x4 is the max for TB, so let's say a base case of x3.5-4.0.

            • No need to infer a base case for TB. Unlike most other protocols, the specified speed is the actual throughput and not the speed of the physical layer. So when they claim a bandwidth of 20Gb/s you actually get 20Gb/s - same as if you were using an internal port.
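A back-of-envelope comparison of the raw link rates being argued about here (figures from the PCIe and Thunderbolt specs; actual PCIe throughput tunnelled over Thunderbolt is lower in practice, so treat this as an upper bound):

```python
# Effective data rates in GB/s, accounting for PCIe line encoding overhead
pcie2_x4 = 4 * 5e9 * (8 / 10) / 8 / 1e9     # 5 GT/s/lane, 8b/10b   -> 2.00 GB/s
pcie3_x4 = 4 * 8e9 * (128 / 130) / 8 / 1e9  # 8 GT/s/lane, 128b/130b -> ~3.94 GB/s
tb2 = 20 / 8                                # 20 Gb/s link -> 2.5 GB/s
tb3 = 40 / 8                                # 40 Gb/s link -> 5.0 GB/s

print(f"PCIe 2.0 x4:   {pcie2_x4:.2f} GB/s")
print(f"PCIe 3.0 x4:   {pcie3_x4:.2f} GB/s")
print(f"Thunderbolt 2: {tb2:.2f} GB/s")
print(f"Thunderbolt 3: {tb3:.2f} GB/s")
```

So an x4 3.0 tunnel does fit inside a TB3 link's raw rate, which is the grandparent's point; whether the controller actually exposes that full bandwidth to PCIe traffic is a separate question.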
      • I have yet to hear any clear explanation for why Intel appears less than cooperative about the idea of Thunderbolt being used for GPU purposes. There have been a few heavily integrated, close to model-specific releases; but the "here is a box with an x16 (mechanical) PCIe slot inside, and a Thunderbolt port" market is pretty slim, with the exception of some very, very expensive card cages from outfits like Magma, clearly aimed at audiences with expansion cards that make gamer toys look disposably cheap.
    • by ADRA ( 37398 )

      My GF plays games like Skyrim for hours on my couple-year-old XPS 15, and though the fan is certainly in full swing, there's essentially never a BSOD / hard crash. Hell, I used to play games on my yet older and far larger 2011 XPS 15, and though it was heavy and hot as lava, I never got the stability issues you claim. I can't talk about other categories or other vendors, but I've generally been very happy with my XPS 15s, even at the high price point they sell for. Oh, and the touchscreen is fully functional but a

  • What "either integrated Intel HD Graphics 530 or a powerful GeForce GTX 960M" means is that the nVidia driver will make regular windows and apps like Firefox/Chrome use the slow Intel card for all your regular stuff. Google Maps or anything that uses WebGL will slow to a crawl. Only games are "allowed" to run on the real GPU.
    At least, that's how the last laptop I got a year ago with a setup like that worked...
    I have a Core i7-4500U, 16GB RAM, and a GT735M, and it is absolutely painful to use certain things like Google Maps.

    • by msobkow ( 48369 )

      I read that as being a choice between two video options, not as an active split between the two at the same time.

      • by iserlohn ( 49556 )

        Both should be active; it's just Nvidia Optimus as usual.

      • Will depend on the OS. The MacBook Pros have had this for some time and MacOS will use the dedicated GPU in certain situations. Depends on many factors - what the application requires, is the computer plugged in, how much battery life is available... Apple has more control of software and hardware so implementing this sort of solution is easier for them. I've heard some complaints but not too many. Do not know how Windows manages this. And Linux? Without capable hardware in the hands of developers on

    • Can you not disable the Intel chip in BIOS? It's the only way I could get FreeBSD to recognize my nVidia card.

      • There have been several different flavors of Intel Integrated/Nvidia combinations on the market; with slightly different requirements and options depending on the details of how they are implemented.

        My memory is a little fuzzy; but I think that the earliest implementations had actual 'video out' from both the IGP and the GPU, with switching silicon on the motherboard that sent one or the other to the LCD. Those offered the most visible control over which graphics device was in use(the one that wasn't was
      • by Ark42 ( 522144 )

        On my laptop, no. You CAN disable the nVidia GPU though.

    • Only games are "allowed" to run on the real GPU.

      Anything you tell it to will run on the GPU.

      • by Ark42 ( 522144 )

        The nVidia driver actually greys out and prevents you from selecting the nVidia GPU for apps on its known-list. Firefox and Chrome are on the list of programs that can only use the Intel GPU. You can always copy Firefox.exe to Firefox2.exe and then it's not on the known-list. You can browse to it from the nVidia control panel thing, then set that to use the nVidia GPU. Unfortunately, it tends to crash the whole OS a lot if you do that, which seems pretty ridiculous. I tried both Firefox and Chrome and eve

        • Mine's never greyed anything out.

          Unfortunately, it tends to crash the whole OS a lot if you do that.

          That's probably why it's greyed out then...

          I've had the same experience though - not the whole OS, but instability when running Firefox on the GPU.

          • by Ark42 ( 522144 )

            Basically, the end result is that I paid extra for an nVidia card when I bought this, thinking it would *replace* the Intel one, but it did not. Several desktop computers I built from parts all work just fine with nVidia cards, and don't have an Intel GPU or funny Optimus drivers getting in the way of things. Google maps always runs super fast and smooth, and nothing ever crashes. Next time I buy a laptop, I'm going to pay extra attention, and if you can't entirely 100% disable the Intel GPU, then there is

            • You should be able to disable Optimus in the BIOS, and I think that leaves you with just the nVidia GPU. Or your BIOS might give the choice of which to use exclusively.

              I tried it once, briefly, to see if it gave me smoother 60fps YouTube video - it didn't, but it could have been any number of things beyond that, and it involved reinstalling drivers (of which nVidia gives me a confusing number to choose from), so I might have got things into a mess. I haven't reinstalled since, and I'm still slightly suspic

              • by Ark42 ( 522144 )

                I looked into this before, and in the newer setups like my laptop, it seems common that your choices for GPU in the BIOS are Intel-only, or Hybrid. You cannot select just the nVidia one. There is probably some reason in the hardware that it's not possible now.

    • Google Maps is really awful, since the "let's make it 10x slower" update. Maybe WebGL itself isn't ready for wide consumption except in contrived setups, but how much GPU power do you need for a 2D application? Google Earth runs fine on 10-year-old integrated graphics, and butter smooth on old low-end graphics cards.

      The "right" solution would be for Google Maps to improve through Intel driver updates, browser updates and Google writing code that works better.

      • by Ark42 ( 522144 )

        Even with the crummy update that Google didn't need to do, Google Maps runs significantly faster and smoother on way older, slower hardware (custom built desktops) where there is not a hybrid GPU setup. Having a 100% dedicated nVidia card that everything always uses is great. Having an Intel GPU that is used for anything at all makes having the nVidia GPU a pointless waste of money when buying a laptop.

        • I get it but I'm a bit surprised. I think of Haswell graphics as powerful, though 15W Haswell surely is significantly slower than 15W Broadwell or Skylake, or 37W Haswell.

  • 3 = 4 for large values of 3. It is a 3K display, not 4K. They forever messed up hard disk capacity by conflating k = 1000 in SI units with K = 1024 in computer parlance. The same thing is going to happen to displays too.
    • K was never uniformly 1024 in computer parlance. You're simply ignoring history and the computer industry outside your own experience if you believe it.

      K as Ki (i.e. 1024) was always the case for RAM, because powers of 2 arise naturally from the way it's addressed.

      For everything else, not so much. Baud rates were always in kilo, not kibi, i.e. 1000s of symbols per second and this rather naturally translated to kb/s not kib/s. Using kib makes no sense for serial protocols. And basically everything usin

      • Seriously a downmod? Looks like we have the rabid defender of the kibi-as-kilo brigade active here.

      • Windows seemed fond of displaying things in KB (that are actually KiB), e.g. a 720,043 KB file. That was another source of failed CD-R burns, if you didn't account for the difference between a file size in MB (MiB) and thousands of KB (KiB).

        Network speed and hard disk size are arbitrary, likewise e.g. a sound file. But I'm still partial to K = 1024, as even then buffer sizes and sector sizes are in "binary" K.

  • by Anonymous Coward

    The best laptop screen resolution ever is 1920x1200.

    Of course 3840x2400 would also be accepted :)

    It's all about the ratio: 16:10.

    Excellent for real work - not just video!

    • I had 2 old Dell Precision M4300s with 1920x1200 displays - those were beautiful. One is now dead, but the other is still serving my daughter. An 18-year-old's eyesight allows for really small font sizes, so the number of Facebook posts shown at once is incredible (she does lots of work on the laptop too, to give her the credit she deserves).
    • Yes... Going 16:9 is fine on a 27" screen but 16:10 is much better for laptops.
  • And yet all of this will be covered with crap in the form of McAfee and a hundred other apps you don't want, but take hours to remove. Or, you can try installing Windows from scratch, which also takes hours, and then your fancy new hardware won't work until you find Dell's special drivers for each device.

    My company's purchases are too small for a business account, so we end up buying consumer hardware. We were only buying from the Microsoft Store, since that at least is junkware free. Now, the Microsoft Sto
    • by Anonymous Coward

      Getting rid of a "Get Skype" button and a "Get Office" button adds $200 to the cost of your laptop?

      While I agree that all the pre-installed stuff is annoying, it takes less than 5 minutes to clean bloatware off a standard Windows 10 install. And surely there's some program to automate the process.

      Having a McAfee program available to pre-install doesn't really slow down the computer, anyway.

      • "Available to pre-install" is not the same as "pre-installed". Unless something major has changed recently, Dells come with McAfee actually installed. Under Windows 8.1, removing it was a multi-step process (run the uninstaller, run MCPR, then repair Windows Update which McAfee somehow breaks). That's not even counting all the other junk, like say Wild Tangent.
    • You don't get the "Browbeat your rep" option; but I'm pretty sure that Dell will sell you Optiplex and Latitude systems in quantity 1, if you have a credit card. I think even Precisions and at least the more boring Poweredge stuff should be available as well.

      You obviously don't have to go with Dell; but unless they've changed something recently, buying small quantities of business-class machines should be no more difficult than buying consumer grade.
      • Many of the Latitude series also come with McAfee pre-installed.

        But you're right, there's actually exactly one Latitude that falls near the requirements, the Latitude 14 5000. It has a 14" screen, and costs $1000, making it $100 less than a Macbook Air. Of course, it still comes with an Office "trial", which would likely have to be uninstalled before I could install Office through our subscription. I'm not sure what else would be pre-installed.

        As for alternate vendors: Lenovo's obviously out. I generally as
  • by rlk ( 1089 ) on Thursday October 08, 2015 @08:38PM (#50690503)

    and then it will be a *real* beast.

  • by labnet ( 457441 ) on Thursday October 08, 2015 @09:57PM (#50690777)

    The XPS 9530 is my main machine (the predecessor to this update), which also runs a 3200x1800 IGZO display.
    So it seems the main change is making the bezel smaller. Meh.

    The great thing about these latest XPSes is the high-res display and support for external 4K displays. It was sad that the PC industry got stuck for 10 years at 1920x1200 (then 1920x1080), and it was only the tablet/phone industry that dragged it into the high-res age.

    I do a lot of CAD and some programming, and have a 4K 27" sitting above the Dell 15", and am able to use both displays at 100% scaling with no problems.
    It's a great productivity improver, if your eyes are good enough. I find a lot of people who look at my displays say they couldn't deal with the text size.

    The one thing I dislike about the XPS 15 is the lack of a native Ethernet port, and it uses a different size power plug from what Dell has used over the last 10 years.

  • by Overzeetop ( 214511 ) on Thursday October 08, 2015 @10:02PM (#50690797) Journal

    A laptop you can do real work on, with a display aspect ratio that's only meant for watching movies.

    Bring back 4:3, make it 3:2, or - hey, how about 1.41:1 to match ISO 216/DIN 476 sizing (and just call it an even 5K at 16:11.33)?
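For the curious, the √2 proposal above works out as follows (the 5120-pixel width is a hypothetical "even 5K", not anything Dell ships):

```python
import math

ratio = math.sqrt(2)           # ISO 216 / DIN 476 paper aspect, ~1.414:1
width = 5120                   # hypothetical "5K" horizontal resolution
height = round(width / ratio)  # 3620 lines
print(f"{width}x{height}, {width / height:.3f}:1")
```

That lands within rounding error of the 16:11.33 figure quoted (16/11.33 ≈ 1.412).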

    • by Tomahawk ( 1343 )

      I've gotten used to 16:9 monitors at work, to the point that 4:3 looks wrong to me now.

      My monitor at home is 16:10 (1920x1200). I don't really notice the difference, but at times those extra few lines do help.

      That said, I don't code much any more. When I do, though, the 16:9 or 16:10 monitors just don't cut it. Even turning them sideways means they are too narrow (I'm not an 80-character-per-line type). So I understand where you are coming from.

      But, for most of what I do day to day, whether in work

  • My last Dell laptop sounded like a hair dryer. Never again.

  • For those, like me, that don't know what this means, here's a nicely written article explaining it: []

    Wikipedia also cover the topic: []

  • I've been using my XPS 15 9530 for about a year now and it was one of the best gifts I've ever given myself. Someone was concerned about thermal issues. Between the keyboard and the display hinge is a vent that runs almost the entire width of the chassis. The vent is split into three sections, so I am guessing it is two inlets and an outlet or vice versa. The laptop can get warm when playing Fallout or synthesizing a large FPGA design. I can hear the fans kick in when it warms up, but they aren't overly lo
