
Apple Announces New iMacs With Better Screens And Modern Processors; Refreshes MacBook Lineup (arstechnica.com) 134

Apple today announced updates to its iMac and MacBook lineups at WWDC, giving its all-in-one desktops and laptops more powerful specifications and the latest Intel chips. From a report: Apple is bringing Intel's 7th-generation Kaby Lake processors to the new iMac, along with what Apple calls "the best Mac display ever," offering 500 nits of brightness, or 43 percent brighter than the previous generation. The 21.5-inch model can now be configured with up to 32GB of RAM, while the 27-inch goes up to 64GB, twice what had previously been offered. The new iMacs are also getting two Thunderbolt 3 USB-C ports, making them Apple's first desktop computers to embrace the port standard. Graphics cards are getting a spec boost in the updated iMacs, too. The entry-level 21.5-inch model will have an Intel Iris Plus 640 GPU, while the 4K 21.5-inch models will get Radeon Pro 555 and 560 graphics cards. Meanwhile, the 27-inch 5K model will have a choice of Radeon Pro 570, 575, and 580 graphics cards, topping out at 8GB of VRAM. The 21.5-inch iMac will start at $1,099 and the 4K 21.5-inch model at $1,299.

As expected, Apple also refreshed the MacBook lineup. From a report: Today Apple provided a minor but wide-ranging refresh to its modern MacBooks and MacBook Pros, adding new processors from Intel and making a handful of other tweaks. The new processors are from Intel's "Kaby Lake" family, and some of them have been available for the better part of a year. Compared to the outgoing Skylake architecture, Kaby Lake introduces a gently tweaked version of Intel's 14nm manufacturing process, provides small boosts to CPU clock speeds, and supports native acceleration for decoding and encoding some kinds of 4K video streams.

  • Will the 18-core processor be the new one that Intel recently announced or the older 18-core processor for servers?
    • Yes. The i9s Intel announced are just cut down Xeons. They're the same fucking thing.

      • by Anonymous Coward

        Not quite. They are (like all the previous enthusiast chips, which used to be branded i7) based on the Xeon E5 / E7 die. The desktop chips, meanwhile, share a die with the E3 Xeons.

        They have several things missing: ECC support, multiprocessor support, probably something related to the data bus. But they are carefully selected to be capable of very high clock speeds, and are unlocked to allow full overclocking: Xeons have nothing like that. Last year's 6950 was a 10 core that was a lot faster than the equ

        • The iMac Pro uses ECC RAM, it was in the keynote.

        • "They are" implies that they have working chips and partner boards at the moment. There is probably an engineering sample of the chips running around somewhere, but the boards they have can't run the higher-core-count chips at full clock, and they're going to have to release a v2 of the 2066 socket to meet the power demands of the 14-, 16-, and 18-core chips. You'll be lucky to see the 18-core released by January, and probably considerably later.

  • Pros still need a real computer that lets us add and remove PCIe cards.

    • I think the external PCIe enclosure you may have seen is the answer to that.

    • by slack_justyb ( 862874 ) on Monday June 05, 2017 @02:31PM (#54553119)

      Really? I'm not denying that, but I am questioning it. I've been programming for some time now and I can't remember the last time I needed to crack open the case to change out an expansion card, of all things. I know a couple of folks in the graphic arts department and likewise, most of their editing and asset management hasn't required a change of things that would typically go into a PCIe slot. So I am curious as to which fields require a constant refresh of what's in the PCIe slots?

      Now if we are talking gaming, on the end-user side, I can see that. So with a flexible enough definition we can call them professionals? I'm a little out of the gaming loop so I don't want to grant a title to gaming that it doesn't have, but at the same time I don't want to snub a legitimate group there. But game development, which I don't do (sorry, I mostly deal with standard-grade C++ and database programming), maybe there's a need for it there?

      I'm just struggling to put a solid finger on who exactly needs a constant refresh of cards but at the same time doesn't need a refresh of CPU/RAM/etc at the same time. Is this a common thing in that industry? Not hating on your comment or anything but it now has my curiosity piqued.

      • Apple's prices for RAM / HDD / video cards suck, to the point where for the upgrade price you could just buy the bigger part outright; but with Apple you have to pay the upgrade price and you don't get to keep the old one.

      • My big concern about Macs in general isn't the quality of the upgrades, but the lack of them over the past few years.

        The 2002-2012 Macs were really the premium systems you could get. But after that they just kinda lagged, updating some of the specs to make sure they weren't obsolete.
        I was a Mac user, then I switched over to Lenovo ThinkPads, because if I am going to get a boring old laptop, I might as well get one that is solid. For me to switch back, Apple will need to show evidence of a long term com

      • by enjar ( 249223 ) on Monday June 05, 2017 @03:36PM (#54553615) Homepage
        GPU capability has been outstripping CPU capability for some time, and GPUs are easily upgraded as PCIe devices. In terms of compute, the nVidia Tesla C1060 GPU was introduced in 2009 and could do 77.6 GFLOPS of double-precision computation. There have been five generations since (Fermi, Kepler, Maxwell, Pascal, Volta). Pascal is the most widely available and can do something like 5000 (yes, 5000) GFLOPS of doubles. On the consumer-grade gaming cards in the GeForce line there have been pretty much the same generational leaps, plus the development of high-DPI screens, 4K monitors, etc. Given that companies will buy PCs for a three- or four-year lifetime, depending on how things fall you could update the GPU 2-3 times and extend the life of the machine, especially if you also do something like move from a spinning HDD to an SSD as well.

        In comparison, processor speed, core counts and RAM amounts have increased only modestly, and for many users the currently available amounts of RAM are still OK.

        • GPU capability has been outstripping CPU capability for some time

          I get that GPU speeds have been on the rise, but a GPU cannot do everything a CPU can. So upgrading a GPU isn't exactly like upgrading a CPU. Additionally, a GPU can only help with software specifically written to use that kind of acceleration. Each element in a GPU is by itself slower than a CPU core, so the speed bump only comes when a task can be spread across as many elements as possible. That is, while 77.6 GFLOPS might be what it can do, you have to be able to spread your task out enough to get to that va
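
          To make the parallelism point concrete, here is a minimal CPU-side sketch (an analogy, not a GPU benchmark, and not from the thread): the same arithmetic done one element at a time versus as one wide, data-parallel operation. It assumes only that NumPy is installed; the array size is an arbitrary choice.

              import time
              import numpy as np

              n = 5_000_000
              a = np.random.rand(n)
              b = np.random.rand(n)

              # One element at a time: every step pays the full per-element overhead.
              t0 = time.perf_counter()
              slow = [a[i] * b[i] + 1.0 for i in range(n)]
              t_loop = time.perf_counter() - t0

              # The same arithmetic expressed as a single wide operation over the whole array.
              t0 = time.perf_counter()
              fast = a * b + 1.0
              t_vec = time.perf_counter() - t0

              print(f"per-element loop: {t_loop:.2f}s   vectorized: {t_vec:.3f}s")

          The advertised peak (77.6 GFLOPS then, multiple TFLOPS now) is only reachable when the workload looks like the second form, spread across all of the hardware's lanes at once.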

          • by enjar ( 249223 )
            Where I work we have code and software for GPGPU computing using CUDA. I have servers that have been in service for about four years now and are approaching end of life; they have had three generations of GPUs in them. More recent developments in machine learning/neural nets/machine vision are using the Pascal cards (GTX 1080, Titan, 1080Ti) to do that kind of work. For $1000 or less you can get 12 teraflops of single-precision math out of those things. That was a significant improvement from the earli
            • That's very interesting, thank you. So GPUs play an important role in neural nets? I can see where that would be very important to people like Facebook and what not. Additionally, I would assume that they have big GPU setups in the workstations to allow for slicing and dicing of data sets from some server.

              I guess being mostly in a text editor I don't see the fuss over 4K monitors, but I figured that for people with spreadsheets and presentations and handling assets like our digital media department does

              • by enjar ( 249223 )
                GPUs play a very large role in deep/machine learning, which includes among other things:
                • image classification (facial recognition, what's in an image, detection of cancer from mammograms), automated driving and the like
                • natural language processing (Siri, Google assistant, Amazon Echo)

                As for 4k monitors, with pretty much every reference being online, more screen area is great. A developer in our organization might have an IDE open, Stack Overflow, some other browser tabs, mail client, our application, etc.

        • by Anonymous Coward

          Question: Will these new Macs finally support OpenGL 4.5? As far as I can tell [apple.com], the highest version that Apple currently supports (as of Feb 9, 2017) is OpenGL 4.1.

          OpenGL 4.1 was released almost 7 years ago on July 26, 2010, and OpenGL 4.5 was released almost 3 years ago on August 11, 2014. That means Apple is stuck 13-14 GPU generations behind Windows and Linux, which have been using OpenGL 4.5 for the past 5-6 GPU generations.

          p.s. Don't tell me to use Metal/Vulkan, and don't tell me to check for OpenGL ex
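
          For anyone who wants to see what a given Mac actually reports, here is a minimal sketch (not Apple tooling, and only under the assumption that the third-party glfw and PyOpenGL Python packages are installed): it requests a forward-compatible core-profile context and prints the version string the OS hands back. On the macOS of this era, the answer tops out at 4.1.

              import glfw
              from OpenGL.GL import glGetString, GL_VERSION, GL_RENDERER

              glfw.init()
              # macOS only provides a modern context for a forward-compatible core profile.
              glfw.window_hint(glfw.CONTEXT_VERSION_MAJOR, 4)
              glfw.window_hint(glfw.CONTEXT_VERSION_MINOR, 1)
              glfw.window_hint(glfw.OPENGL_PROFILE, glfw.OPENGL_CORE_PROFILE)
              glfw.window_hint(glfw.OPENGL_FORWARD_COMPAT, glfw.TRUE)
              glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no need to show a window
              win = glfw.create_window(64, 64, "gl-version-check", None, None)
              glfw.make_context_current(win)
              print(glGetString(GL_VERSION).decode(), "|", glGetString(GL_RENDERER).decode())
              glfw.terminate()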

      • I'm just struggling to put a solid finger on who exactly needs a constant refresh of cards but at the same time doesn't need a refresh of CPU/RAM/etc at the same time.

        Programmers for games, 3D animators, and anyone doing any GPU-intensive activities. GPUs are making decent strides in performance while CPUs are remaining relatively lame in comparison. As we load GPUs up more and more, they are finding a new purpose as one of the most critical components in a computer. /Disclaimer: I'm on my 3rd GPU upgrade on my 6-year-old CPU and RAM combo, which I still don't feel the need to touch.

        • Programmers for games, 3D animators

          I assumed as much, but I would have figured that the bulk of the rendering would be done on a farm while the modeling would be done on the workstation. But I assume that we've progressed to a point where workstations are now doing a lot more work than just the basic modeling, and the farm is doing even more complicated things?

          Thanks!

          • Oh rendering still is done on a farm, but don't underestimate just how much effort goes into making that GPU fan spin while modelling.

            But I came up with another one looking farther down in the replies. There's some bitching about HEVC vs AV1 and hardware decoders. ... Video editors, once the bread and butter of Apple. They may have some desire to either upgrade their GPU for something which processes CUDA code faster, or something more dedicated like this: http://www.advantech.com/produ... [advantech.com], though I note t

      • It IS a common thing in film production; being able to add a video I/O box is a huge asset for a computer being used for media work. A good graphics card isn't always enough, you also need to be able to get an SDI I/O option in order to use software like PreLight, for example.

        For serious media production, you need a LOT more computing power than for programming... for working with, say, 8K footage in real time while grading, or with dual 4K streams at 120fps... hence Baselight X includes a couple of

    • Apple promised a refresh of Mac Pros in 2018 [digitaltrends.com] and indicated it would be a "modular" system.

      • Apple promised a refresh of Mac Pros in 2018

        What they actually said was "not this year." And while it may well be in 2018, they didn't say that.

        • Good point. It looks like a lot of the press are just calling it the "2018 Mac Pro", even though Apple never actually gave a firm date.

          • Yes. The iMac "pro", which they showed yesterday, will appease a reasonable portion of the people waiting (and waiting) on a new Mac Pro. But not all, of course.
            • From what I understand, they also refreshed the current Mac Pro model as well, so people who happen to need high-end Mac hardware *now* at least have some better options.

    • by Lumpy ( 12016 )

      No, we really don't. The only reason I built an ATX machine this last time with the 7700K is because you have no other choice but to do a full PC.

      The motherboard came with everything I would ever need outside of the pair of GTX 1080 Ti video cards, and I will not upgrade them, ever. When that time arrives, the whole box gets replaced. I'm betting the screws will never be removed unless I have SSD drive failures.

    • For what? Bragging rights? Pros need computers that WORK. They don't get paid for wasting their time getting their machines to work; they want to be using their machines TO MAKE MONEY. If I get a new workstation, I want to be spending my time in Photoshop, not hoping that the drivers for the card I just put in will actually work, or finding out that the expansion card I put in threw everything else out of whack.
  • by Anonymous Coward on Monday June 05, 2017 @02:22PM (#54553037)

    I don't give a crap if the new MacBook is a hair slimmer... give me back some usable ports. I really don't want to buy (and then have to search for) an adapter every time I want to use a peripheral.

  • "The [entry level] 21.5-inch iMac will start at $1099" :

    Entry level for $1110

    LMFAO

  • by pablo_max ( 626328 ) on Monday June 05, 2017 @02:30PM (#54553105)

    So... for over $3k (27inch) Apple has seen fit to grace this thing with a 580 card? Something in the range of a NVIDIA GTX 1060, which can be had for about 260 bucks?
    I guess they really have given up on the desktop market.

    • Re: (Score:1, Interesting)

      by Anonymous Coward

      Something in the range of a NVIDIA GTX 1060, which can be had for about 260 bucks?

      I'm no Apple fanboy (I own nothing of theirs), but are you comparing gaming benchmarks? I assure you that the target market for these machines is users of applications that would perform vastly better on a Radeon Pro 580 than on a GTX 1060.

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        I love how any post defending Apple has to be prefaced by "not an Apple fanboy". It just proves Apple puts out so much crap that you would have to assume only someone blindly in love with the company would defend them.

        • by Anonymous Coward

          It's probably more down to the tediously predictable nature of Slashdot posts these days.

          (I love the captchas though.)

    • I guess they really have given up on the desktop market.

      It's more that Apple has never really gone after the gaming market.

      • Desktop does not automatically equal "gaming".

        We are in the early days of VR and MR. Both of these things require heavy graphics cards to keep you from puking all over your shiny new headset.
        Even Microsoft has clearly seen that VR is here now and is working to push productivity apps to VR.
        It is an exciting time for computing. Unfortunately, Apple has decided that "Sheeple TM" will not be attending this future.

        • by Anonymous Coward

          Even Microsoft has clearly seen that VR is here now and is working to push productivity apps to VR.
          It is an exciting time for computing. Unfortunately, Apple has decided that "Sheeple TM" will not be attending this future.

          Apple made the right call. VR is the technology of the future, and always will be.

          Sure, Microsoft can waste a lot of time and treasure creating this:
          https://www.youtube.com/watch?v=VFkyV7d5t8o [youtube.com]

          Apple is going to focus on things people want to buy; overpriced mediocre hardware with brand appeal.
          Nobody looks cool in a VR headset.

        • by Anonymous Coward

          By the time VR/AR becomes actually viable for anything other than geeks and games, we'll be enough years down the track that the standard built in GPU in a MacBook at the time will cope with it just fine.

        • by Teckla ( 630646 ) on Monday June 05, 2017 @07:46PM (#54555469)

          We are in the early days of VR and MR.

          I'm going to go out on a limb and predict VR is going to be the new 3D TV.

    • So... for over $3k (27inch) Apple has seen fit to grace this thing with a 580 card? Something in the range of a NVIDIA GTX 1060, which can be had for about 260 bucks? I guess they really have given up on the desktop market.

      It's not that; they've never been for what YOU consider the desktop market, that very small slice of users who rip open their machine and replace a graphics card every time a new generation of games comes out. Apple has never been for the hardcore gaming market, and there really isn't any point, since even Blizzard is retrenching on Apple support. The Mac desktop is focused on the productivity and artistic creativity markets.

    • by AHuxley ( 892839 )
      But Metal 2 will make sure the CPU can do so much more. The low-end GPU will not have to work as hard and it will all be OK.
  • I am still holding on to my Mid 2010 Mac Pro with its 2xCPUs, 2xSSDs, 3xHDs, Blu-ray, etc. It even has an HD-DVD drive which can play my HD-DVDs (bought for nothing after the "defeat" of HD-DVD), but that requires a boot into Windows, which I haven't done for at least a couple of years, I admit... I am also holding on to my Late 2012 Mac Mini with a quad-core i7 and 16GB RAM. These were my favorite Apple lines; however, I can't upgrade, since with a newer version I would get a downgrade in either performance or expandabil

    • There was no Mac mini on that "Macs" keynote slide. They've already said there's a new Mac Pro being designed, but I think the mini is pretty much dead. I guess we'll know once their store comes back online. Either the Mac mini will have been silently updated to a Kaby Lake CPU, it will still be the same as the 2014 "upgrade", or it'll simply be gone.

      Hold on to your quad-core i7 from 2012 because so far, it's the best Mac mini ever made.

      • The last Mac I'll buy for the foreseeable future.

        Even the craziest Apple fans I know are hesitating to upgrade their desktop and laptops. They still use the phones though.

      • Ridiculous, isn't it? The Mini is a great little machine and pretty affordable too. I do my iOS development on one, but it's starting to show its age...
      • Same boat as you: a 2012 quad-core, 16GB mini. I upgraded it to an SSD after purchase and it runs like a champ.

        Apple has zero interest in selling a $600 box that you can hook to your own $300 4k 28" monitor, they want you to buy a $3000 iMac.

    • by Ichijo ( 607641 )

      Yes, a Mac Mini Pro with the same specs as the iMac Pro would make a very nice desktop machine. But of course they would charge $3,000 for the base version!

      • by Jeremi ( 14640 )

        Yes, a Mac Mini Pro with the same specs as the iMac Pro would make a very nice desktop machine. But of course they would charge $3,000 for the base version!

        I think a large part of the iMac Pro's price has got to be that 5K screen. A display-less Mac Mini could probably be produced and sold much more cheaply, and historically has been.

  • The iMac Pro had better have an easy-to-open back or it's a sad joke at $5K.

    What is the point of 128GB of RAM and 2 storage slots when you have to buy them at Apple's prices and can't easily put in your own?

    • I know that on the iMacs it has always been easy to add/change the RAM yourself. I've had two different models and on both it was a panel that popped out that gave access to RAM. However that doesn't guarantee that Apple will do the same with the Pro.

  • Snooze Fest (Score:5, Insightful)

    by JustAnotherOldGuy ( 4145623 ) on Monday June 05, 2017 @02:48PM (#54553227) Journal

    As above, this is just the Apple Snooze Fest.

    Minor boosts in some specs, no compelling or interesting new features, but a new, higher price. No ports, no expandability, and lord have mercy on your soul if you ever need to get it repaired.

    • by AmiMoJo ( 196126 )

      Some of it sounds like complete bunk. The screen gets stupidly, retina-burningly bright. Er... Okay. I'll turn that down to 20% so I can stand to look at it for more than 20 seconds.

  • And no touch screens either?

  • by Gabe Ghearing ( 3618909 ) on Monday June 05, 2017 @03:16PM (#54553441)
    I was hoping they'd add a 32GB option to the highest-end MacBook... That's gotta come in the next year or so. It seems that there aren't suppliers for 32GB LPDDR3 setups (LPDDR3 being super-low-power RAM), and Intel CPUs won't support LPDDR4 until the next generation (Cannon Lake). LPDDR4 uses a lot less power and supports much higher densities than LPDDR3.

    Intel was supposed to have Cannonlake out last year...
    • Agree agree agree. I want my 64GB

  • by davidwr ( 791652 ) on Monday June 05, 2017 @03:18PM (#54553463) Homepage Journal

    I typically upgrade the RAM and storage at least once during the life of my computers.

    Apple is nice and worth paying for but without a way to upgrade it 2-3 years from now, I'll get a non-Apple notebook the next time I need one.

    • by epine ( 68316 )

      I'm with you, 100%

      New Kaby Lake iMacs arrive from Apple [arstechnica.com]

      New iMacs will be upgradeable to 64 GB of RAM on 27-inch configurations and 32 GB RAM on the 21.5-inch models.

      Does "upgradeable" merely mean ka-ching ka-ching pre-sale "configurable" at 4x street price?

      If so, buh bye, sweet spot. No sale.

      Once upon a time I would have trusted Ars to know the difference. These days, I'm not so sure.

    • by pubwvj ( 1045960 )

      I used to feel this way too, but I have concluded that I prefer the more integrated approach. I simply buy a computer with the maximum storage, RAM and other capabilities, and then use it to get my work done. In the past upgrading made sense, but now the maximum configuration isn't really all that much more expensive, and I would prefer the more efficient, sleeker and more rugged design that the loss of expandability gives. Added storage is external over very fast busses. Internal storage is enormous nowadays.

  • by UnknownSoldier ( 67820 ) on Monday June 05, 2017 @03:26PM (#54553525)

    * MacBook Pro -- STILL limited to 16 GB ? Really?
    * It supports a wide color gamut -- what about 9-bit / channel or 10-bit / channel ???
    * Radeon GPU? UGH, where is the nVidia GPU option to run CUDA code?

    * Sooo, what happened to the Mac Pro ? Thanks for giving us the finger Apple.

    * No new Mac Minis ?

    * iMac Pro -- at $4,999 isn't this just another Mac Pro ?

    * If Apple was serious about games -- they could EASILY blow Microsoft and Sony out of the water. WHERE is the gamepad??? Or the ability to use Android / PS4 gamepads?

    * HomePod -- You guys don't understand bass at all. I have a 12" driver on my sub. Why would I downgrade to a wimpy 4" driver ???

    I love my MacBook Pro -- but Apple really is becoming more clueless.

    • I have a 12" driver on my sub. Why would I downgrade to a wimpy 4" driver ???

      Most people (especially those with spouses) go for practical and small rather than trouser-flapping loud. Why do you think Bose is so popular? Because people didn't want huge high-end super-fidelity floor-standers, but small "milk carton" speakers that manage to deliver decent (good enough) sound quality. If the Siri part of this speaker actually delivers, it might actually sell. Not holding my breath though, if HomeKit thus far is any indication of how well Apple understands integration in the home.

      F

    • by timholman ( 71886 ) on Monday June 05, 2017 @05:44PM (#54554593)

      * iMac Pro -- at $4,999 isn't this just another Mac Pro ?

      It's worse than that - it's a betrayal of everything that Apple claims to believe in.

      Have you looked at photos of it? It has legacy ports, including an SDXC slot, four USB 3 ports, an Ethernet port, and a headphone jack. What were the Apple engineers thinking? The iMac Pro should have nothing except USB-C ports, with lots of optional dongles, just like the MacBook Pro.

      The iMac Pro is doomed from the start. It's like putting propellers on a space shuttle. I can't imagine anyone paying good money to have their pristine new iMac Pro marred by those disgusting outdated openings on the back.

    • If Apple was serious about games -- they could EASILY blow Microsoft and Sony out of the water. WHERE is the gamepad??? Or the ability to use Android / PS4 gamepads?

      If they were serious about games, they'd also release a desktop machine without discrete graphics and without a built-in monitor. And they'd support remotes on Macs again, and allow you to run an AppleTV-like interface, so you could use the computer as a console. Hell, they could just partner with Steam and use their gamepad and Big Picture mode.

      They're not serious about games.

      • Why would they be?

        • Well first, I'm not saying they should be. I use my Mac for work, and I'd rather have them continue to refine the desktop experience than make a half-assed play for the gaming market.

          However, they took a chunk of time in the conference to talk about 3D performance and VR, which would easily lead someone to think they're interested in gaming. That's not quite how I took it, but I could understand someone making that assumption.

          Also, there is a strategic reason for Apple to go after the gaming market: it'

            • There is no point in going after a desktop gaming market. That ship sailed long ago, when developers realised that the only market worth developing commercial games for is Windows. Blizzard was the last major company to do cross-platform development, and even they stopped doing dual development for any new games, only supporting their existing Warcraft, Hearthstone and Diablo markets. Nothing that Apple can do will change that, so what would be the point?
            • Meh. People also said there was no point in Apple going after the desktop market because Microsoft had that sewn up. Also, there was no point in going after the smartphone market because companies like Motorola were too entrenched to compete with. There was no point in making tablets, since Microsoft had tried and failed, showing that nobody wanted a tablet. For that matter, when Microsoft made the first XBox, people thought that was silly because Sony and Nintendo were too unstoppable.

    • by AHuxley ( 892839 )
      People who need 10-bit / channel can afford to buy a real computer, real software and ensure color works from capture to edit to sale/release.
      They need a real CPU, a real OS, lots of ram, fast storage and a supported display. The software has to work with the 10-bit / channel and so does other hardware.
      The new Apple products are good for consuming really nice looking video work not made on Apple computers.
      Smart people need hardware and software that support everything: their capture gear, their software, and their color workflow.
    • *where is the nVidia GPU option to run CUDA code?
      >> They have Metal 2. Maybe get a LINUX box and save money.

      * Where is the Mac Pro?
      >> The iMac Pro IS the Mac Pro.

      * Or the ability to use Android / PS4 gamepads?
      >> Emulating other platforms just gives a new platform for the other platforms' crap -- it's not going to make money.

      * HomePod - 12" sub
      >> Get the excelsior dominator 5000 with 50" subs to blast your inferior 4" neighbors into jello!

    • Apple is not only not serious about games; Steve Jobs was downright hostile to them. Apple is today more receptive to gaming... but only on iOS devices.
  • It doesn't really say... are the iMacs still using mobile CPUs?
    • by larkost ( 79011 )

      My guess is that the 21" iMacs, and the bottom-end 27" iMac are using Mobile versions, since they are limited to 32GiB RAM. The upper 2 models of iMac 27" are definitely using a Desktop part since they support 64GiB of RAM. That logic goes even more for the iMac Pro since it can address 128GiB of RAM.

      On the GPU front it is always a little hard to say, since Apple usually negotiates custom versions of the chips used that typically sit somewhere between the retail Desktop and Mobile chips (so more powerful th

  • by andrewa ( 18630 ) on Monday June 05, 2017 @03:38PM (#54553641)

    I already committed to a Dell XPS 32GB for my next laptop, though; I didn't have any faith in Apple being "courageous" enough to compromise on making a model that's slightly thicker than a previous model.

      I already committed to a Dell XPS 32GB for my next laptop, though; I didn't have any faith in Apple being "courageous" enough to compromise on making a model that's slightly thicker than a previous model.

      Eat your words, then... they're doing so with the iPhone 8.

  • Looks like the Air lineup is unchanged. Which on one hand, means they get more underpowered (by comparison) every year, but also means that they don't get the "who needs more than a single USB-C port?" wrecking ball.

    So at the moment I consider that a win. For me I think it hits a perfect balance of size, weight, compute power, and battery life. At least for the computing load of non-power-users. And is now the only Apple notebook not stricken by the USB-C-is-all-you-need syndrome.

    • They went from 1.6GHz to 1.8GHz, option to upgrade to 256 or 512GB SSD even on the low-end model, but still stuck at 8GB.

      • They went from 1.6GHz to 1.8GHz, option to upgrade to 256 or 512GB SSD even on the low-end model, but still stuck at 8GB.

        Thanks for that, it wasn't obvious on their web site that anything had changed at all. It would have been nice to go beyond 8GB, but that's a relatively small price to pay to preserve a usable array of ports.

  • I can't believe Apple "innovated" a black screen inside the car. Joking aside ...

    * The wider color gamut on the MacBook Pro is nice -- I just wish they'd said whether it supports native 9-bit / channel or 10-bit / channel, or whether it's stuck with 24-bit color.

    * Nice to see Apple finally taking 120 Hz seriously on the iPad Pro.

    * 20 ms latency on the Pencil is a good step in the right direction. Technically you want a sub-8 ms time (1000 / 120 = 8.33 ms).

    * The iMac Pro looks nice ... but it's expensive as hell at $4,999.

    • by Anonymous Coward

      Considering OS X and Photoshop CC for Mac have both supported 10-bit color since 2015, it's a pretty good bet that all their new systems with DCI-P3 wide-gamut displays are taking advantage of those bits...

      Their Tech Specs page for the Pro ( https://www.apple.com/ca/imac-pro/specs/ ) mentions "1 Billion Colors" for 5K and 4K UHD, and only "Millions of colors" for 4K, which I'd take as confirmation.
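
      The arithmetic behind those two spec-sheet phrases is just levels-per-channel cubed. A quick standalone sketch (it assumes nothing about Apple's actual panels beyond the wording above):

          per_channel_8bit  = 2 ** 8     # 256 levels per channel
          per_channel_10bit = 2 ** 10    # 1024 levels per channel
          print(f"{per_channel_8bit ** 3:,}")   # 16,777,216    -> "millions of colors"
          print(f"{per_channel_10bit ** 3:,}")  # 1,073,741,824 -> "1 billion colors"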

  • Are they joking? (Score:5, Interesting)

    by ilsaloving ( 1534307 ) on Monday June 05, 2017 @03:55PM (#54553793)

    Apart from the iMac Pro (which I'm afraid to see the price of...), their 'improvements' are no less a joke now than they were last year.

    They barely count as incremental, and nothing about them is truly important. Their $4000 machine still only has 16GB RAM and a 4GB graphics card, and still no way of connecting to *anything* externally without buying an armload of attachments.

    I miss the days when their "Pro" laptops actually were. Apple is probably the single best example of what happens when a company replaces an engineer CEO with an MBA. Cook needs to be fired and replaced with someone who can provide actual leadership instead of just coming up with new ways to milk the dongle dollar.

  • Wow! They decided to use modern processors! No wonder the geniuses at Apple make the big bucks!
  • If I were any more underwhelmed by this, the paramedics would be rushing me to the hospital for a heart attack.
  • It looks like Apple stopped selling them (new). :(

  • Only 2 Thunderbolt buses in the iMac Pro? Over 4 ports they should have the PCIe lanes for 4 buses. This does bode well for the next Mac Pro.

  • $1,400 for a 64GB upgrade? WTF???
