Apps Reportedly Limited To Maximum of 5GB RAM In iPadOS, Even With 16GB M1 iPad Pro (macrumors.com) 159
Despite Apple offering the M1 iPad Pro in configurations with 8GB and 16GB of RAM, developers are now indicating that apps are limited to just 5GB of RAM usage, regardless of the configuration they are running on. MacRumors reports: The M1 iPad Pro comes in two memory configurations; the 128GB, 256GB, and 512GB models feature 8GB of RAM, while the 1TB and 2TB variants offer 16GB of memory, the highest ever in an iPad. Even with the unprecedented amount of RAM on the iPad, developers are reportedly severely limited in the amount they can actually use. In a post on the Procreate Forum, the developer behind the graphic and design app Artstudio Pro reports that apps can only use 5GB of RAM on the new M1 iPad Pros; attempting to use any more will cause the app to crash: "There is a big problem with M1 iPad Pro. After making stress test and other tests on new M1 iPad Pro with 16GB or RAM, it turned out that app can use ONLY 5GB or RAM! If we allocate more, app crashes. It is only 0.5GB more that in old iPads with 6GB of RAM! I suppose it isn't better on iPad with 8GB." Following the release of its M1-optimized app, Procreate also noted on Twitter that, with either 8GB or 16GB of RAM available, the app is limited in the amount of RAM it can use.
Who is surprised by this? (Score:5, Interesting)
The obvious rationale for this is that Apple would like an app's performance to be consistent across all of its (iPad) devices. Reasonable people might disagree on whether that is worth the trade-off of reduced performance on higher-RAM tablets. However, Apple making that kind of decision on behalf of developers and users is exactly par for the course.
Re: (Score:2)
There is no "problem" here.
(a) You can run more than one program at a time.
(b) Each individual program is limited to 5GB.
(c) If you write programs for a toy computer and you can't do it in less than 5GB, then *YOU* are the problem.
(d) The only people complaining about this are shitty, incompetent developers.
Kinda ridiculous (Score:5, Insightful)
This week at work I went to install a program that checks an input string against a list of a million "disallowed" strings.
I had glanced at the code and it looked like the programmer was reasonably competent.
When I ran it, I saw it used about 300MB of RAM and took over 80 milliseconds for each lookup.
I spent two hours writing my own version that uses 12 MB of RAM (for the .Net library) and takes 0.6 milliseconds.
So it's over 100X faster and uses 25X less memory, simply by having a clue how to program. As in, actually knowing WTF I'm doing.
I could have made it faster by using a binary index file for the textual list, but I decided I liked to have all the data easily human-readable in any text editor.
If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be fucking clueless about programming.
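The lookup the parent describes is a plain set-membership problem; a minimal sketch (in Python rather than the .NET mentioned above, with invented names purely for illustration) might look like:

```python
# Hypothetical sketch: exact-match lookup against a blocklist.
# Loading the list into a hash set gives O(1) average-case lookups,
# at the cost of holding each string in memory exactly once.

def load_blocklist(lines):
    """Build a set from an iterable of disallowed strings, ignoring blanks."""
    return {line.strip() for line in lines if line.strip()}

def is_disallowed(blocklist, candidate):
    """Constant-time membership test against the loaded blocklist."""
    return candidate in blocklist

blocklist = load_blocklist(["badword", "forbidden", "  spaced  "])
print(is_disallowed(blocklist, "badword"))   # True
print(is_disallowed(blocklist, "allowed"))   # False
```

A million short strings in a set like this fits comfortably in tens of megabytes, consistent with the numbers the parent reports.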
Re: (Score:3, Insightful)
If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be fucking clueless about programming.
Or they might be writing an audio app that needs to load gigabytes' worth of audio samples. The lack of imagination I see on Slashdot is staggering, as is the hubris.
Re: (Score:2)
Yup, or 3D rendering, people regularly use more than twice as much RAM as that limit for semi-complex scenes.
Now, I have written software to process very large audio files that did it by streaming chunks of data, but it's always a trade-off: use less memory, get less speed. What works for offline processing does not necessarily work for real-time processing, etc.
Sure, it's a tablet, but it's an interesting design decision... if it *is* a decision, and not some bug.
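The chunked-streaming trade-off described above can be sketched in a few lines; this is a hypothetical illustration of the technique, not any particular audio library:

```python
# Hypothetical sketch: processing a large file in fixed-size chunks
# instead of loading it all into RAM. Peak memory stays at one chunk's
# worth, at the cost of repeated I/O (the memory-vs-speed trade-off
# described above).

import io

def peak_amplitude(stream, chunk_size=4096):
    """Scan a byte stream chunk by chunk, tracking the maximum byte value."""
    peak = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        peak = max(peak, max(chunk))
    return peak

data = io.BytesIO(bytes([10, 200, 55] * 10000))
print(peak_amplitude(data))  # 200
```

For offline processing this works well; for real-time playback the repeated reads become latency, which is exactly the tension the comment points out.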
Re: (Score:2)
I'm a regular user of Blender and I, again, can't see any situation where anyone is going to use a tablet to do 3D rendering. People that do 3D rendering are going to use desktops and laptops with gigabytes of RAM and powerful CPUs and GPUs. I just don't see a tablet having that kind of processing power. I could see you using a tablet to display your rendered work.
Re:Kinda ridiculous (Score:4, Insightful)
the SSD is fast enough to stream the fscking audio samples real time.
it's high time for these people to learn how to code properly, instead of just porking out
Re: (Score:2)
the SSD is fast enough to stream the fscking audio samples real time.
Sure, if that's the only thing it's doing. If not, then you're going to want to preload to make sure you're adding absolutely zero latency. If you're using the audio samples in some way that is unpredictable, and doing other I/O at the same time, you're going to need to cache.
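The "preload the hot set, cache the rest" approach described above is essentially a bounded LRU cache; a hypothetical sketch (class and names invented for illustration):

```python
# Hypothetical sketch: a bounded LRU cache for audio samples, so the
# hot set stays in RAM while cold samples are (re)loaded on demand.

from collections import OrderedDict

class SampleCache:
    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader           # called on a cache miss
        self.cache = OrderedDict()     # insertion order tracks recency

    def get(self, name):
        if name in self.cache:
            self.cache.move_to_end(name)    # mark as most recently used
            return self.cache[name]
        data = self.loader(name)
        self.cache[name] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return data

loads = []
cache = SampleCache(2, lambda n: loads.append(n) or f"pcm:{n}")
cache.get("kick"); cache.get("snare"); cache.get("kick")  # third call is a hit
cache.get("hat")                                          # evicts "snare"
print(loads)  # ['kick', 'snare', 'hat']
```

The capacity bound is the knob: larger means fewer refaults but more resident RAM, which is precisely where a hard 5GB per-app cap starts to bite.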
Re: (Score:2)
the SSD is fast enough to stream the fscking audio samples real time.
it's high time for these people to learn how to code properly, instead of just porking out
Or we could use hardware that common people have available to them rather than specifying bizarrely high minimum system requirements, like an SSD, for a basic app to be functional. I buy RAM because it's a fast form of memory, and I expect you developers to use it competently, not pussyfoot around with a slower solution. If you were a remotely competent programmer you'd look at the available memory and only revert to a slower way of doing something when the faster method is exhausted.
You sound like my wife. Spen
Re: (Score:2)
"The lack of imagination I see on Slashdot is staggering, as is the hubris."
That's exactly what I thought when I read your post. It's as though you think that everything must be loaded into memory at once, and that doing such things was impossible prior to having more than 5GB available to applications.
The OP is right, the problem here is clueless programmers, and you appear to be one of them.
Re: (Score:2)
Or they might be writing an audio app that needs to load gigabytes' worth of audio samples. The lack of imagination I see on Slashdot is staggering, as is the hubris.
I can't think of any situation where anyone is going to load gigabytes' worth of audio samples on a tablet, into RAM. Maybe into storage, but not RAM. Anyone who is going to load that much in audio samples into RAM is going to use a real computer with gigabytes of RAM.
Re: Kinda ridiculous (Score:3)
wtf. loading assets into ram is exactly how all programs work dude. do you think photoshop works on the image on the hard drive directly? or loads the paintbrush tool from your Program Files folder every time you want to use it?
You are precisely the type of moron we are talking about. Hopefully you are not a software Developer.
Like Google Maps, Photoshop "Tiles" image-segments into and out of RAM. It is the only practical way to manipulate gigantic image files.
And no, Photoshop does not load the Paintbrush Tool from disk every time you use it. It manages a small "Tool Cache" to keep the most recently-used Tools in RAM for quick access. But if you switch to a couple of different Tools and then back to the Paintbrush, you may notice
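The tiling scheme described above (only the touched image segments resident in RAM) can be sketched roughly like this; the class, tile size, and backing store are invented for illustration, not Photoshop's actual design:

```python
# Hypothetical sketch of tiled image access: only the tiles actually
# touched are resident in RAM; everything else stays in slow storage
# (simulated here by a plain dict).

TILE = 4  # tile edge in pixels (tiny, for illustration)

class TiledImage:
    def __init__(self, width, height, backing_store):
        self.width, self.height = width, height
        self.store = backing_store     # stand-in for on-disk tiles
        self.resident = {}             # tiles currently paged into RAM

    def _tile_key(self, x, y):
        return (x // TILE, y // TILE)

    def get_pixel(self, x, y):
        key = self._tile_key(x, y)
        if key not in self.resident:   # page the tile in on first touch
            self.resident[key] = self.store.get(
                key, [[0] * TILE for _ in range(TILE)])
        return self.resident[key][y % TILE][x % TILE]

img = TiledImage(16, 16, {(0, 0): [[7] * TILE for _ in range(TILE)]})
print(img.get_pixel(1, 2))   # 7  (from the stored tile)
print(img.get_pixel(9, 9))   # 0  (default, untouched tile)
print(len(img.resident))     # 2  tiles resident, not the whole image
```

Peak RAM scales with the number of touched tiles rather than the full canvas, which is how editors handle images far larger than available memory.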
Re: (Score:3)
Honestly, I don't have a clue, but this discovery comes from a company that makes image manipulation software. I always thought this kind of application could use quite some RAM due to having to load, display and manipulate multiple layers of high-resolution images.
A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel). 5 GB should be plenty. Video, on the other hand... there's hardly a limit for how much you could use if you don't use streaming from storage.
Re: (Score:2)
A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel).
Camera megapixels are different from display megapixels. There's only a single colour per pixel requiring, in this case, 14-bits. Call it half of your estimate.
Re: (Score:2)
A picture from Sony's flagship A1 camera is 50 megapixels. Even without compression, that should be just 200 MB (4 bytes per pixel). 5 GB should be plenty.
Per layer. And then there's the memory consumed by whatever image manipulation operations you're executing. And hey, if you don't want to experience degradation, you might want to actually increase the resolution before certain types of manipulation...
Video, on the other hand... there's hardly a limit for how much you could use if you don't use streaming from storage.
Everything is faster with more memory for caching.
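The per-layer arithmetic in this subthread is easy to sanity-check, assuming 4 bytes per pixel (8-bit RGBA):

```python
# Back-of-envelope memory math for the figures discussed above,
# assuming 4 bytes per pixel (8-bit RGBA) and uncompressed layers.

def layer_bytes(megapixels, bytes_per_pixel=4):
    """Uncompressed size of one full-resolution layer, in bytes."""
    return megapixels * 1_000_000 * bytes_per_pixel

one_layer = layer_bytes(50)            # one 50 MP layer
print(one_layer / 1e9)                 # 0.2  -> 200 MB per layer
layers_in_5gb = (5 * 10**9) // one_layer
print(layers_in_5gb)                   # 25 full-resolution layers fit in 5 GB
```

So roughly 25 uncompressed 50 MP layers fit under the cap, before counting undo history, scratch buffers for filters, or any upsampled working copies.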
Re: Kinda ridiculous (Score:2)
"makes shitty image manipulation software"
There fixed that for you.
Re: (Score:2)
Congratulations. In software however, not everything is a first year student problem.
Re: Kinda ridiculous (Score:2)
For a problem like that, the obvious solution is a hashset. Nothing really novel.
Re: Kinda ridiculous (Score:2)
can't you do this with one line or bash written in 30 seconds
Re: (Score:2, Interesting)
Your average Apple enthusiast knows dick about shit when it comes to computing. That's why they love Apple so much despite all the artificial limitations it puts on your computing. Even Microsoft does less to get in your way, although they do plenty of other offensive crap.
You can make Linux have the same look and smell as OSX down to the mousing and keyboard behavior, with literally all the same functionality, but with the added element of choice. The only benefit to OSX over that, which admittedly is an i
Re: Kinda ridiculous (Score:2)
But people who spend a lot of energy defending Apple's lock-in and walled garden approaches to computing are ignorant at best,
Wrong.
Embedded Developer (and Windows Application Dev.) with over 4 decades of paid Dev. experience here. I both understand the reasoning behind, and agree with, nearly all of Apple's decisions regarding their various "ecosystems" (although I hate that term as applied to computing). I gladly accept the Walled Garden on iOS/iPadOS (and I think that will be relaxed on iPadOS as it matures away from iOS), and know how to do as I please with macOS, and when it is appropriate and safe to use Apple's wel
Re: (Score:2)
Now you lick the boots harder.
Nicely formed, erudite response to my counter-argument.
But I expected no less.
Re: (Score:2)
The average person knows dick about shit when it comes to computing. The average person just wants to run their applications. They don't care about having choices in the operating system. They don't care about being able to compile stuff themselves. They just want to run their apps.
The first car that I owned had a manual choke (look it up if you are too young to know what one is). It gave me much more control over the running of my car than modern cars do, but you never see me, or anybody else, lamenting t
Re: (Score:2)
If someone is writing software for an iPad and they're using more than 5 GB of RAM, more than 5 BILLION bytes, they just might be f...ing clueless about programming.
Sigh. Said with the same mindset that said way back when "640K ought to be enough for anybody." (No, Bill Gates didn't actually say that but he still gets credit for it).
Re: (Score:2)
If someone is writing software for an iPad and they're using more than 5 GB of RAM
I'm glad you know how to write a string compare function. I expect a little more functionality than that from my devices, and your list of "a million" is laughably small compared to the datasets that ordinary programs, such as a simple image editor, routinely work with.
Re: Kinda ridiculous (Score:2)
No just you.
Re: (Score:3)
The obvious rationale for this is that Apple would like an app's performance to be consistent across all of its (iPad) devices.
The obvious rationale for this is that iDevices are meant to be toys, and if you do real work Apple wants real money out of you, so you should buy a Mac.
Apple making that kind of decision on behalf of developers and users is exactly par for the course.
Yes, and the course is full of alligators and sand traps which Apple put there in order to make sure you have only one path to the green.
Re: Who is surprised by this? (Score:2)
The obvious rationale for this is that iDevices are meant to be toys, and if you do real work Apple wants real money out of you, so you should buy a Mac.
Wrong.
Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.
However, it is obvious to anyone with half a brain that one of the biggest reasons Apple split off iPadOS was that they see that Class of products more as fulfilling a "general-purpose computing device" role, and as time goes on, we will start to see some decidedly "non-Toy" Applications and Dev. Tools (Swift Playgrounds notwithstanding) specifically for iPadOS.
I have a feeling that WWDC 2021 will have some quite
Re: (Score:2)
Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.
Well, they aren't, unless you place artificial limitations on them. And the problem isn't even that they do that. The problem is that they don't let you turn them off. I get why people want the walled garden, the illusion of security is very appealing. But insisting that locking you into it is for your own benefit is just internalization of abuse.
Re: (Score:2)
Apple thinks Smartphones are essentially sophisticated Appliances. I wholeheartedly agree.
Well, they aren't, unless you place artificial limitations on them. And the problem isn't even that they do that. The problem is that they don't let you turn them off. I get why people want the walled garden, the illusion of security is very appealing. But insisting that locking you into it is for your own benefit is just internalization of abuse.
There are ample alternatives for the adventurous. Go forth and be Happy (or is that Hacked?).
And the Security of iOS/iPadOS and the App Store is very real. Nothing is 100% perfect; but the relative malware percentages between iOS/iPadOS and Android speak (quite loudly and clearly) for themselves.
Re: (Score:2)
However, Apple making that kind of decision on behalf of developers and users is exactly par for the course.
Apple providing hardware that is entirely useless on the other hand is not par for the course. Their normal MO is to not provide hardware that people actually need.
Re: Cannibalizing product range (Score:3)
The obvious rationale for this is that Apple would like an app's performance to be consistent across all of its (iPad) devices.
Or, the rationale might be that Apple wouldn't like the iPad product range to cannibalize potential sales of M1-based MacAir.
If their Pro tablet becomes too good, fewer people would be likely to buy the entry-level laptop(*)
(* - basically the same innards, just a slightly larger screen without touch, and a permanently attached keyboard).
Except for the fact that the M1 MacBook Air is undoubtedly a lower-margin product than the iPad Pro.
A base model M1 MBA (8GB/256GB), with 2 USB4/TB3 Ports, 13.3″ 2560-by-1600 display, keyboard, etc., and with the ability to run full macOS plus at least a growing subset of iPad Apps, is US$999; whereas a 12.9″ base-model WiFi-only M1 iPad Pro, with 256 GB SSD, 1 USB4/TB3 Port, no Keyboard, and iPadOS only, is US$1199. The iPad, however, does have that spectacular display, plus touch ability.
Re: (Score:3)
Unlikely. Apple's always been protective of the performance of its i-devices (with the strange exception of the battery-protection slowdowns on older devices it was doing for a while until everyone started yelling at them about it). Its backgrounding policy used to be very strict, designed to stop the slow-downs that were besetting Androids at the time. It's relaxed that to a small extent in the era of faster ARM chips but it's still fairly adamant that certain behaviors will never be accepted for backgrounde
Re: (Score:2)
A lot of games benefit from more than 5GB of RAM. Fast storage helps but only to a certain extent.
The other issue is that people paying out to get 16GB of RAM expect that it gets used in a meaningful way. I'm sure iOS keeps apps in memory so the user can switch faster, but they probably expected better in-app performance too.
Re: (Score:2)
I assume that there's something like that going on here. Keep in mind 5 Gb is actually plenty for MOST usecases.
Yeah, 625kB should be enough for anybody!
There are exceptions where extensive RAM usage is advantageous, and I expect at some point Apple will probably deal with these via policy
(Ie we know what's good for you better than you do)
Face it, Apple's policies are NOT for your benefit. They are for Apple's.
Re: (Score:2)
625kB? WTH?
Re: (Score:3)
5 Gb
vs
5 GB
Are we nerds or not?
Re: (Score:2)
I didn't see the 5Gb in the parent post. /facepalm
Re: (Score:2)
If their Pro tablet become too good, less people would be likely to buy entry-level laptop(*)
Yeah, it's not like they would ever just put less RAM in the thing and make a larger margin off it.
Come on guys, stay consistent!!
Comment removed (Score:3, Insightful)
Re: (Score:2)
Well, it seems this is the first time any iOS device is ahead of the competition for the amount of RAM. So yes, it is unprecedented, even though the iPad Pro is blurring the lines with the MacBook.
Re:Unprecedented (Score:4, Informative)
Microsoft sold versions of the Surface Pro 4 [wikipedia.org] with 16 GB of RAM starting something like 5 years ago. The Surface Pro 7+ is now available with 32 GB of RAM.
I suppose you could argue that it's not exactly the same market, but it is pretty close.
Re: (Score:2)
I was talking about Android competitors to iOS phones/tablet.
Of course there are PCs with more RAM.
Re: (Score:2)
I was talking about Android competitors to iOS phones/tablet. Of course there are PCs with more RAM.
There are already Android phones with 18GB of RAM though.
Re: (Score:2)
I suppose you could argue that it's not exactly the same market, but it is pretty close.
It's the same market approached from a wildly different side. Microsoft trying as hard as possible to limit the possibilities for customers used to a highly advanced and versatile OS, Apple trying as hard as possible to advance the possibilities for customers used to a horrendously crippled toy, and they are meeting in the middle carrying a lot of baggage.
Re:Unprecedented (Score:4, Interesting)
Well it seems the first time any iOS device is ahead of the competition for the amount of RAM. So yes, it is unprecedented, even though the iPad Pro is blurring the lines with the Mac Book
Sure, because the only thing that matters for computer performance is the amount of RAM. As for the iPad Pro, it's not going to replace the MacBooks any time soon for people doing any serious everyday work, and that's not because the iPad is lacking in raw CPU power but because the iPad UI just plain sucks for doing serious work and the whole concept of a tablet doesn't really work all that well for most types of day to day work. By the time you've turned your iPad Pro into something that is half way usable by adding a keyboard you've basically turned it into a MacBook Air with a touch screen and a shitty, limited UI. Now cue a big noisy clown posse of Apple haters cracking shitty jokes ...
Re: Unprecedented (Score:2)
"By the time you've turned your iPad Pro into something that is half way usable by adding a keyboard you've basically turned it into a MacBook Air with a touch screen and a shitty, limited UI. Now cue a big noisy clown posse of Apple haters cracking shitty jokes"
Same exact thing with Android. Yeah, I had a hackneyed (highly portable) practice programming environment set up with a rooted Kindle Fire and a Bluetooth keyboard that was just barely suitable for what I was intending it for, but I then bought a "r
Re: (Score:2)
but they expect you to only use these machines to consume and throw real money after fake online baubles.
Old man yells at cloud.
Re: (Score:2)
The UI of iPads is actually not shitty or limited (versus macOS/OS X).
Or do you have something special in mind?
Re: (Score:2)
The equivalent of right clicking has an inbuilt delay. The equivalent of click and drag has an inbuilt delay. Selecting text has an inbuilt delay and interacting with any small menu items is a chore.
Never mind the fact that you have to lift your hand off the physical keyboard, onto the screen, to select anything, and the screen then becomes grimy and covered with oily residue. You can use the virtual software keyboard but that takes up screen real-estate, it's slow, glitchy, and using it covers the screen with even more oily residue and gunk.
Re: Delays everywhere! (Score:2)
Never mind the fact that you have to lift your hand off the physical keyboard, onto the screen, to select anything, and the screen then becomes grimy and covered with oily residue. You can use the virtual software keyboard but that takes up screen real-estate, it's slow, glitchy, and using it covers the screen with even more oily residue and gunk.
You do realize, of course, that if your fingers are oily enough to leave "oily residue and gunk" on a touchscreen, that they are leaving that same gunk on a keyboard and mouse, or even stuff like a remote control, right? Just because you can't see it as well, doesn't mean it is clean.
Pro Tip: Wash your greasy hands after eating things like Cheeseburgers, French Fries and Cheetos if you don't want to leave "oily residue and gunk" on stuff you touch.
Re: (Score:2)
Have you ever used one of these things? All the things I listed require press and hold; you have to hold for a specified amount of time, and that is a delay. This is true of both Android and Apple.
Word ... That delay is possibly the most irritating thing about using any tablet. With a physical keyboard the response is instant, navigating is quick and precise because the trackpad is right there in front of the keyboard a minuscule hand movement away, you can do a ton of stuff with keyboard shortcuts, doing any of this does not cover the screen in oily gunk ... the list goes on. I use tablets, I use smartphones but I would not want, for example, to code on such a device for hours on end although I'll a
Re: (Score:2)
Then go into settings and change it.
Sorry, there is no delay that is in any way noticeable or annoying.
Re: (Score:2)
At least it is not as annoying as you want to make us believe.
And why you don't use a two-finger press and hold is beyond me anyway.
Unlock (Score:2)
Re: (Score:2)
Bet a jailbreak “fixes” it first
Re: (Score:2)
Or, the makers of Chrome, Firefox, Photoshop et al. could fix the fucking memory leaks in their fucking apps, unfair as that demand may be.
How about the makers of Safari [apple.com]?
Whoops!
Solution is obvious. (Score:5, Funny)
Or thrice?
Deja-Moo (Score:2, Troll)
(2010 Apple iConsumer) "My iPhone4 signal sucks."
(Steve Jobs) "You're holding it wrong."
(2021 Apple iConsumer) "My iPad memory support sucks."
(Tim Cook) "You're RAMming it wrong."
Surely that should be... (Score:2)
Pansies! (Score:5, Insightful)
My first Mac only had 128K of RAM and it was pretty fancy. That was back when real men and women wrote code, close to the metal. These spoiled whippersnappers don't know how good they've got it. "I only get 5 GB, wah!"
Re: (Score:2)
My first computer had 4K of RAM.
Get off my lawn.
Re: (Score:2)
I would assert that a Mac Plus running Word 3.x is faster and more responsive than virtually any computer made today when it comes to UI based stuff. Amazing what people could produce when hardware was a limiting factor.
Re: (Score:3)
My first mac only had 128K of RAM and it was pretty fancy. That was back when real men and women wrote code, close to the metal.
The 128k Macintosh was barely capable of performing its mission because it was underdesigned for it. The system had graphics-only output yet had absolutely zero graphics acceleration hardware, and in fact Macintoshes didn't get any until the Macintosh II line, and even then only if you bought a graphics card that was more expensive than an entire PC (The 8•24 GC). And as a result performance was frankly atrocious compared to the direct (but to be fair, slightly later) competition that used the same C
Re: (Score:2)
My first mac only had 128K of RAM and it was pretty fancy.
Pansy. Back in my day we didn't have any RAM at all. If I wanted to kill someone with a chainsaw with Doom I had to do it the old fashioned way and end up on the 7pm news.
I never understood the "back then we didn't have X" argument. No shit sherlock, you also didn't do Y. Back then you would have posted that message by getting out pen and paper and putting your comment up at the local library.
640k ought to be enough for anybody (Score:2)
...or something.
Wow (Score:4, Insightful)
When I was growing up, programmers often were limited to 64 KILOBYTES or less of RAM. Often "tricks" were employed to pull off some truly amazing shit (check out the C64 demo scene)
Maybe this is a good thing, as programmers will have to be more skilled and efficient with code, rather than expecting the user to throw more ram and cpu at the problem they've created.
Now if only websites were given stricter limitations on resource usage. :-\
Re: (Score:2)
The graphical tricks were very impressive, the sound much less so. Thanks for the volume warning.
Re: (Score:2)
Maybe this is a good thing, as programmers will have to be more skilled and efficient with code
Who said the programmers are being inefficient with code? RAM is used as the fastest available memory storage medium. It exists to speed up the user experience. I highly doubt a single person is hitting the 5GB limit with their app, and rather running into both speed issues and limits in what the app does.
64KB of RAM? I suspect limiting programmers that low may make it difficult to edit 4K 60fps video from the device's own camera.
Websites? It's not websites which use RAM. It's the fact that we expect our br
5 gb ought to be enough... (Score:2)
All memory, or just allocated memory? (Score:2)
Does that 5 GB limit affect purgeable memory or just RAM allocated with malloc/new/CFAlloc/alloc/*?
If you're using more than 5 GB of RAM that isn't purgeable, you're more than likely doing something wrong, but if you can't use more than 5 GB of purgeable memory when it isn't otherwise in use, that's probably a bug.
See also: mmap.
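A rough illustration of why mmap is relevant to the purgeable-vs-allocated distinction above: a mapping is backed by the file, so the OS can drop and refault those pages under pressure, whereas a plain read() copy counts fully against the app. A minimal Python sketch (file name invented):

```python
# Hypothetical sketch: reading a file slice through a memory mapping
# instead of read()-ing the whole thing into an owned buffer.

import mmap
import os
import tempfile

def mapped_read(path, start, length):
    """Read a slice of a file through a read-only memory mapping."""
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            return bytes(m[start:start + length])

# Create a small sample file to map (stand-in for a big asset file).
path = os.path.join(tempfile.mkdtemp(), "samples.bin")
with open(path, "wb") as f:
    f.write(b"abcdef" * 1000)

print(mapped_read(path, 0, 6))   # b'abcdef'
```

Whether iPadOS counts mapped file-backed pages against the reported per-app cap is not established by the thread; the sketch only shows the mechanism being suggested.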
and... (Score:2)
this is A GOOD THING !
time for these people to learn to code lean and mean.
they should use temporary files for things they don't need in ram such as the undo stack, the SSD is sufficiently fast for this to be of absolutely no consequence for the user experience
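The "undo stack in temporary files" idea above can be sketched as follows; this is a hypothetical toy, not a recommendation for any particular app:

```python
# Hypothetical sketch: an undo stack that keeps snapshots on disk
# instead of in RAM, as the comment suggests. Each push writes a
# snapshot to a temp file; undo reads the most recent one back.

import os
import pickle
import tempfile

class DiskUndoStack:
    def __init__(self):
        self.dir = tempfile.mkdtemp()
        self.paths = []                 # on-disk snapshots, oldest first

    def push(self, state):
        """Serialize a state snapshot to its own temp file."""
        path = os.path.join(self.dir, f"undo_{len(self.paths)}.pkl")
        with open(path, "wb") as f:
            pickle.dump(state, f)
        self.paths.append(path)

    def pop(self):
        """Load and remove the most recent snapshot."""
        path = self.paths.pop()
        with open(path, "rb") as f:
            state = pickle.load(f)
        os.remove(path)
        return state

undo = DiskUndoStack()
undo.push({"stroke": 1})
undo.push({"stroke": 2})
print(undo.pop())   # {'stroke': 2}
print(undo.pop())   # {'stroke': 1}
```

RAM held by the stack is now near zero regardless of history depth; the cost is a disk write per undoable action, which a fast SSD can absorb for coarse-grained snapshots but may not for per-stroke history in a painting app.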
Re: (Score:2)
this is A GOOD THING !
TREAD ME HARDER DADDY
Filter error: Don't use so many caps. Filter error: Don't use so many caps. Filter error: Don't use so many caps. Filter error: Don't use so many caps.
What's the problem? (Score:2, Insightful)
Crash? (Score:2)
I don't develop in Swift or Objective-C, but for all I've been able to find out in short time, it should be possible to catch when an allocation fails, and handle it -- avoiding a crash.
Even if a large allocation fails, it might still be possible to do smaller allocations for such as showing an error message to the user.
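Whether an iPadOS app can actually recover this way is exactly what's in question here (iOS's jetsam mechanism often terminates a process outright before any in-app handler runs), but the general pattern the comment describes looks like this, sketched in Python rather than Swift or Objective-C:

```python
# Hypothetical sketch (Python, not Swift/Objective-C): attempt a large
# allocation and degrade gracefully instead of crashing. Note that on
# iOS the OS may kill the process before any handler like this runs.

def try_allocate(n_bytes):
    """Return a buffer of n_bytes, or None if the allocation fails."""
    try:
        return bytearray(n_bytes)
    except (MemoryError, OverflowError):
        return None   # fall back: smaller buffer, show an error, etc.

buf = try_allocate(1024)
print(buf is not None)        # True: a modest allocation succeeds
huge = try_allocate(1 << 62)  # absurdly large: fails on any real system
print(huge is None)           # True
```

The comment's smaller follow-up allocation (e.g. for an error dialog) can succeed precisely because the failed large request was never committed.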
Re: (Score:2)
The behavior sounds a lot like quotas being enabled, which for a iPad makes sense since you do not want any one app to be able to eat up ALL the system resources. That being said, if it is simply a quota, they should really include an option for changing it since there are
Not the right tool for the job (Score:2)
If you _really_ need more than 5GB to run an app, then a tablet is not the right tool for the job.
Re: This may be a good thing... (Score:2)
What kind of ADD-riddled fuckwit is burning 16 gigs on tabs they are not even using?
Computer Torture (Score:2)
With Web browsers easily burning through 16 GB...
(Arrogant Me) "That's bullshit. No way."
(Curious Me)
* Right-clicks Bookmarks *
* Open All in Tabs *
(Browser) "That's 7 fuckloads of tabs. You sure?"
(Curious Me) "Yup, let's do this."
(RAM) "Wait wut are we doing agaAAAUGH! AAAUGH! MAKE IT STOP!"
(CPU) "You're a dick."
(Arrogant Me) "Huh. I'll be damned."
Re:This may be a good thing... (Score:4, Informative)
With Web browsers easily burning through 16 GB on the desktop with just a basic set of tabs, having an OS force designers to actually tighten the belt with their bloated RAM usage is a definite good thing.
It's not the bloated browser that is doing this. It's the bloated websites. You should take a look at the code behind these modern monstrosities you visit sometime.
Re: This may be a good thing... (Score:5, Interesting)
Take a look at the source of a Google Groups page- ooUcFUCKh!
Never mind that this is a Usenet archive with posts made from machines that had far less combined ram and disk byte amount than the pages being served up!
Re: (Score:2)
Look at the source of the Google search page...
Re: (Score:3)
Your GP has it right, but your P does indeed get it wrong.
This is very much the case for "modern" websites. Between the demands of clueless "design" types and the lack of skill of webmonkeys, the sheer volume of garbage underlying even a trivial blog-style webpage was already out of control a decade ago. Then FB and Google got their spyware into the mix, and the data volume and CPU use quadrupled. It's insane. :P
The developers of these "Baby's First Website" toolkits should have known better, and should be shot.
But
Re: (Score:2)
Actually it's not. The code on the website doesn't remotely account for the memory load. The reason for RAM use is features we expect from our browsers. While some people may be happy with Lynx or Firefox 2.1, I for one expect my browser to be capable of running full WebRTC video conferencing, able to act virtually as an OS for groupware programs like Office or Docs, and I expect it to include a high-speed runtime compiler capable of processing Javascript fast enough to play Quake http://www.quakejs.com/ [quakejs.com] at fu