I'm very interested in this observation... it strongly suggests that 16Gb RAM is insufficient for a tablet, which would be quite remarkable given that 16Gb is still seen as entirely reasonable for most PCs... [Yes, I'd have to concede that isn't a like-for-like comparison, but it does give an interesting perspective].
The 16GB limit means software can be written that consumes that much and still delivers a good experience. With hardware pushing boundaries, it's a kind of build-it-and-they-will-come system. No one writes software that can't run on the most decked-out hardware their customer base has. Expanding the performance envelope just means opening the door to new tools in the future, not unlocking tools that were already written but can't run on anything yet.
The same near-top-line CPU in both tablets and desktops. It's a recognition that the space between a tablet and a full-blown workstation isn't in the processor anymore. It's pretty much the size of the screen, and the presence or absence of a keyboard and mouse.
The obvious thing is to let you link two iPads and use them as a single device, with different apps on different screens. It wouldn't matter if the second iPad just became a slave device showing video and sending back touch events. There are surely those who would buy a second or even third iPad to use it that way. (Not me, but we all like to mock those who buy every possible Apple product, and this would be a good way to milk them further.)
Even internal to Apple most of their design studio uses Windows workstations because that’s where the CAD software they use runs best (much of it doesn’t run at all on Macs). It’s kind of their dirty little secret.
People have done comparisons of the uncooled M1 (MacBook air) vs actively cooled M1 (mini). There were performance differences, but they were surprisingly small. As in "cooling barely budges the needle" small.
Ah, good point. But now I need someone to argue with over it or all the fun is gone. And I want the full half hour, not just five minutes, and a connected series of statements intended to establish a proposition, not just contradiction.
The iPad Pro is almost a notebook. It does tablet consumption and basic editing nicely: light, compact. Like a multi-tool, versatile but not going to beat a specialist tool. You pay a premium, but if it fits your workflow, the design does well for its intended use. Wish I could rationalize one. But I use a Mini for basic tablet stuff; for most work I prefer dual screens, more processing power, a keyboard, and a trackball.
I have to say they did it right with the iPad.
Put a keyboard on it and you end up with a small ultrabook; of course the OS is not the same, but the idea is there.
I could crack and buy one, but the price is always excessive.
Too bad.
But I must say that, for once, it makes me want one.
16 gb max? (Score:0)
Re: (Score:3)
I haven't written software for iOS, so I'm not familiar with either how much RAM applications typically require, but also whether or not iOS supports any form of virtual memory management to
Re: 16 gb max? (Score:2)
Re: (Score:-1)
Tell that to the knobs doing Javascript, HTML5 and other Web stuff, where Chrome tabs wind up at 4+ GB each. The web developers don't give a rat's ass, because someone's RAM and storage starved Mac can't run their code.
8GB and 16GB is bordering on shameful. One row of tabs in a Web browser can force one of those machines to swap, much less if one decides to run an actual application.
For a low-end Mac Mini or a MacBook Air, 16GB is fine. For anything else, the world has moved to 1 TB SSDs, and 32 Gigs of
Re: (Score:0)
You're visiting some retard sites if your tabs are using more than one gig per.
Re: (Score:0)
I'm very interested in this observation... it strongly suggests that 16Gb RAM is insufficient for a tablet, which would be quite remarkable given that 16Gb is still seen as entirely reasonable for most PCs...
How is this any different from the new iMac or last year’s Macbook Pro? If this is supposed to be a “Pro” level system then expecting to do things like edit multiple 4k streams, process large multi-track or visualize architectural models is what users would expect. If you’re just doing web browsing, email and some basic office tasks then 8GB is probably plenty.
Re: (Score:2)
Aggregate, I can give a relevant observation.
I just bought a new M1 MacBook Air last week. The price was agreeable, and I have a mild addiction to buying cool laptops.
Right now I'm sitting at about ~11GB used, and about ~5GB cached files.
So that's pretty good.
I've got a web browser (Firefox), an iTerm window with a dozen or so tabs, iMessage.
Fire up MatterMost, Zoom, Adium, and I'm now over 13GB, with my cached files correspondingly down to around 3GB.
That's far too much memory pres
Re: (Score:2)
On the mainframe systems I've used over the years, the OS optimizes resource consumption to get the maximum out of the machine at all times. So if you're just running a couple of appliations, then why not let the OS deploy spare RAM to act as cache. If you launch more applications or open apps draw more memory, you can just release and overwrite the cache occupancy on a least-recently-used basis.
In other words, in an optimized OS we might expect to see the memory occupancy as close to 100% all the time.
Re: (Score:2)
Interesting, but is that "memory pressure", or is it opportunistic consumption?
Well the full 16GB is "used" at all times, when the applications are using 11GB the OS consumes the remaining 5GB for cache, as the memory usage of the applications increases the cache is reduced.
On the mainframe systems I've used over the years, the OS optimizes resource consumption to get the maximum out of the machine at all times. So if you're just running a couple of appliations, then why not let the OS deploy spare RAM to act as cache.
That's not a mainframe thing, macOS does that for example, that's what the OP pointed out:
Fire up MatterMost, Zoom, Adium, and I'm now over 13GB, with my cached files correspondingly down to around 3GB.
In other words, in an optimized OS we might expect to see the memory occupancy as close to 100% all the time.
Yes, that's what he said he's seeing on macOS: 11GB applications + 5GB cache = 16GB, 100%; then 13GB applications + 3GB cache = 16GB, still 100%.
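The eviction behaviour described in this subthread (the OS opportunistically filling otherwise-idle RAM with file cache, then releasing it on a least-recently-used basis as applications demand more) can be sketched as a toy model. This is an illustration, not how any real kernel is implemented; the class and numbers mirror the 11GB+5GB / 13GB+3GB figures quoted above:

```python
from collections import OrderedDict

class LRUFileCache:
    """Toy model of RAM split between applications and an LRU file cache."""

    def __init__(self, total_gb):
        self.total_gb = total_gb
        self.app_usage_gb = 0.0
        self.cache = OrderedDict()  # path -> size in GB, least recently used first

    def cache_size_gb(self):
        return sum(self.cache.values())

    def _evict_until_fits(self, needed_gb):
        # Drop least-recently-used cache entries until needed_gb fits in RAM.
        while self.cache and self.app_usage_gb + self.cache_size_gb() + needed_gb > self.total_gb:
            self.cache.popitem(last=False)

    def read_file(self, path, size_gb):
        """Reading a file caches it; the cache grows into any free RAM."""
        if path in self.cache:
            self.cache.move_to_end(path)  # mark as most recently used
            return
        self._evict_until_fits(size_gb)
        self.cache[path] = size_gb

    def launch_app(self, size_gb):
        """Applications win over cache: evict LRU entries to make room."""
        self._evict_until_fits(size_gb)
        self.app_usage_gb += size_gb

ram = LRUFileCache(16)
ram.launch_app(11)                  # browser, terminal, iMessage...
for i in range(5):
    ram.read_file(f"/tmp/f{i}", 1)  # cache opportunistically fills the rest
print(ram.app_usage_gb, ram.cache_size_gb())  # 11.0 5
ram.launch_app(2)                   # MatterMost, Zoom, Adium
print(ram.app_usage_gb, ram.cache_size_gb())  # 13.0 3
```

Either way the machine reports "full" RAM; the difference is that cache pages can be dropped instantly, while application pages would have to be compressed or swapped.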
Re: (Score:2)
I'm very interested in this observation... it strongly suggests that 16Gb RAM is insufficient for a tablet
It's true. I don't know what I would do with a tablet with 2GB RAM any more.
(It's 2021 and you still don't know how to use b/B... WTF)
Re: (Score:3)
What tablet had 16Gb ram ten years ago? The Galaxy Tab, announced in Oct. '10, had 512Mb.
Re: (Score:0)
Re: (Score:2)
You know what those are called?
Tablets.
Re: (Score:2)
It's 16GB of RAM (memory); you're thinking of storage, and the article states that these iPads can have up to 2TB of that.
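As an aside on the Gb/GB nitpick upthread: a gigabit (Gb) is one eighth of a gigabyte (GB), so taken literally a "16Gb" tablet would have only 2GB of RAM. A trivial sketch (the helper name is illustrative, not from any library):

```python
def gigabytes_from_gigabits(gigabits: float) -> float:
    """Convert gigabits (Gb) to gigabytes (GB); one byte is 8 bits."""
    return gigabits / 8.0

print(gigabytes_from_gigabits(16))  # 2.0 -- "16Gb" read literally is just 2 GB
```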
how hard to sideload finder? (Score:2)
how hard to sideload finder?
Re: (Score:2)
What is "sideload finder"?
Re: how hard to sideload finder? (Score:2)
Big deal (Score:-1, Troll)
underwhelming (Score:2)
Re: (Score:3)
They think their customers are "going to love it" (tm) and "can't wait to see what their customers do with it" (tm).
Like putting gold in an Etch A Sketch (Score:1)
Like putting gold in an Etch A Sketch
The lines are blurring (Score:4, Insightful)
Re: (Score:1)
Its a recognition that the space between a tablet and a full blown workstation isnt in the processor anymore.
It certainly shows how far the manufacturers of desktop CPUs have fallen.
Re:The lines are blurring (Score:5, Insightful)
Everyone was afraid that Apple was going to turn Macs into iPads.
This makes me think Apple are going to turn iPads into Macs.
Re: (Score:3)
I don't care what they do with the iPad operating system as long as they fix the god-awful multi-tasking interface. It's the least discoverable and jankiest way to try to do two things at once on a computer. Truly execrable.
Re: (Score:1)
Re: (Score:2)
There's still the difference between iOS and macOS and I wonder which way that'll end up.
Re: (Score:3)
IMHO there is no point in putting the M1 in an iPad if the end goal is not to run macOS at some point.
Re: (Score:2)
Well, that's the question.
Either iOS evolves into a desktop OS, possibly a crippled one.
Or macOS evolves into a mobile OS, basically: Same OS under the hood, but different interfaces.
Or the two merge at some point and become compatible but different "flavours" of the same underlying code.
Re: (Score:0)
Everyone was afraid that Apple was going to turn Macs into iPads. This makes me think Apple are going to turn iPads into Macs.
You mean an ARM SoC with non-upgradeable soldered memory and storage? That’s the tablet setup that they are now stuffing into their laptops and desktops.
Re: (Score:3)
Its a recognition that the space between a tablet and a full blown workstation isnt in the processor anymore. Its pretty much the size of the screen, and the presence or absence of a keyboard and mouse.
No. For the difference between a tablet and a normie desktop PC you'd maybe be right, but there's a shitton more to an actual workstation than a processor, a GPU, screen size, and input devices. Unless you use your workstation as a PlayStation, that is, but even that term has a distinctly high-performance meaning these days.
Re: (Score:2, Insightful)
What do you think a workstation is?? I don't think the iPad can hold a candle to my workstation on, you know, workstation tasks. It has a weedy little CPU, a weedy little GPU, and not much storage. Because it's a tablet, not a workstation.
My workstation has a 12-core Ryzen, 64GB of RAM, a 2080 Ti, and a big spinning disk in addition to the flash storage.
Re: (Score:2)
That's about 1% of the market and, yes, it's a pretty darn important part. And buying Apple for those applications is probably not the way to go.
For the other 99%, Apple products are great.
Re: (Score:0)
Um, that's Apple's market. With the Mac Pro and Final Cut Pro, quite a lot (maybe even the majority) of high-end professional video editing happens on the Mac. Lots of science happens on the Mac too.
Up until now, buying Apple for those applications was absolutely THE way to go. Looks like Apple may be trying to destroy their pro market, we'll have to see where the Mac Pro goes next.
Re: (Score:0)
Re: (Score:0)
And you know this how? If it's because you read it somewhere, post the damn link
Re: (Score:0)
And you know this how?
First hand experience.
Re: (Score:0)
Sure. That's super-convincing, that is
Re: (Score:2)
Yep, that's what a workstation is for. For me: video editing, deep learning, and other miscellaneous computation tasks. My work workstation is similar, except no video editing and much more compiling. It's faster than the much newer MacBook Pro I have for work and, importantly, doesn't sound like a tornado when it gets under load.
My 10-year-old ThinkPad W510 gets more use because I do more websurfing than video editing, and the workstation is not located in a convenient place for doing such things casually.
For th
Real work is done on non-workstations (Score:2)
..., if you're a hardcore video editor doing massive rendering tasks, computational engineer, or theoretical scientist, yup you need every bit of power you can get and a dedicated GPU with as many cores as you can afford.
That's about 1% of the market and, yes, it's a pretty darn important part. And buying Apple for those applications is probably not the way to go.
For the other 99%, Apple products are great.
Actually, to contradict the both of you somewhat...anyone doing serious scientific computation in the modern era will be using a grid somewhere quite distant from their workstation, and can use a fairly low-spec machine on their desktop. I haven't asked a computer near me to do anything tough for nearly a decade.
Powerful GPUs, big memory and big storage in a workstation are the domain of video editors, gamers and dilettantes.
Re: (Score:2)
The architecture is the same but the form factor is not. With active cooling, a bigger power supply, and access to bigger storage and more bulky peripherals, the desktop will still be a much more capable machine and have different scenarios for usage.
Re: (Score:2)
My take is that the M1 is designed around cooling efficiency. If you have hundreds of watts to burn and can tolerate a 12 pound heat sink, go Intel or AMD with a dedicated graphics card. They're designed for that scenario. You'll get more computing power AND heat your room for the winter.
Apple is getting a lot of Slashvertising here (Score:2, Funny)
With "Apple Introduces M1 Chip-Powered iMac" followed almost immediately by "The New iPad Pro Features Apple's M1 Chip", and "Apple Announces $29 AirTag, a New Tile-like Item Tracker" followed almost immediately by "Tile Bashes Apple's New AirTag as Unfair Competition", they've captured almost half the front page already. Why not "This Week's Apple Product Announcements", which covers all of that?
Re:Apple is getting a lot of Slashvertising here (Score:4, Funny)
Because it makes more threads for you to complain in!
Re: (Score:2)
Re: (Score:2)
https://g.co/kgs/DspcT2 [g.co]
Re: (Score:2)
Re: (Score:2)
Yes it is!
Re: (Score:2)
Re: (Score:2)
Yes it is!
Re: (Score:2)
Re: (Score:2)
Yes it is!
Re: (Score:2)
Re: (Score:2)
... yes it is!
Re: (Score:3)
Re: (Score:2)
I'm sorry, the five minutes is up.
Re: (Score:2)
Re:Apple is getting a lot of Slashvertising here (Score:4, Insightful)
It's almost like there was an industry event on this week. But surely that can't be it.
Look, you really need to get over yourself. If hearing updates from the big names in tech worries you that much, then rather than only turning off Slashdot on April 1st, mark your calendar for the four times a year Apple hosts events. You may also want to turn off Slashdot when major Microsoft events are on, or when E3, MWC, or any other major event takes place. God forbid you see a news site actually repeat what was said at such an event.
I for one am impressed you wasted so much time posting in something you weren't interested in, and off topic at that. Maybe it's time to find a hobby.
Re: (Score:-1)
I for one am impressed you wasted so much time posting in something you weren't interested in, and off topic at that.
And we're all impressed you wasted so much time being an asshole, and an off topic asshole at that.
Re: (Score:2)
And we're all impressed you wasted so much time being an asshole, and an off topic asshole at that.
There's a difference. I enjoy being an arsehole. I get up early because there are otherwise not enough hours in the day in which to be an arsehole. I certainly don't go around being an arsehole and then at the same time complain about being one. That would be kind of stupid and a waste of my time which is better spent being an arsehole to stupid people on the internet.
I'm glad you're impressed. I'll consider that validation of all my good work, even if you meant it in jest while (as expected) completely missing the fucking point.
Re: (Score:0)
I enjoy being an arsehole.
Quite apparent. It's also a sign of a personality disorder.
I get up early because there are otherwise not enough hours in the day in which to be an arsehole.
Yes, we're familiar with your posting history. You are on this site every. fucking. day. proving you're an asshole. Not only is it indicative of a personality disorder, it shows you have no life to speak of. Sad.
That would be kind of stupid and a waste of my time which is better spent being an arsehole to stupid people on the internet.
Were you the inspiration for this? [xkcd.com] If not, you're just another run-of-the-mill asshole on the internet who says plenty of stupid shit himself. Completely unremarkable.
I'm glad you're impressed. I'll consider that validation of all my good work, even if you meant it in jest while (as expected) completely missing the fucking point.
Missed point? My point was and is that you're an asshole. You have 100%
seems like they are homgenising their entire produ (Score:2)
iPad Pro with $999 imac-esque stand (Score:2)
Sorry, now I've given them new ideas....
Needs SD Card Slot and Removable SSD (Score:2)
Tweener (Score:2)
Re: (Score:0)
You've totally and utterly misunderstood the "pro" here -- the iPad Pro is used by people whose jobs and workflows are enhanced by mobility. Sports coaches. Field engineers. Medics. Artists. Real estate agents. And on and on. They will use iPads to analyse athletes' movements, get an exploded AR view of a machine they're repairing, document wound progression, draw an image on-site, create a floorplan. All of those tasks are much better done with an iPad than with a desktop computer.
Re: (Score:3)
They will use iPads to analyse athletes' movements, get an exploded AR view of a machine they're repairing, document wound progression, draw an image on-site, create a floorplan.
LOL. You watch too many commercials, dude.
Re: (Score:0)
Those are all routine use cases. There are multiple competing apps for each of them, and plenty of enterprises using them.
Where have you *been* for the past five years?
Re: (Score:1)