Firefox Too Big To Link On 32-bit Windows 753
An anonymous reader writes "Firefox has gotten so large that it can no longer be compiled with PGO by a 32-bit linker, due to the 3 GB virtual memory limitation. A similar problem occurred last year at the 2 GB limit and was worked around by adding the /3GB switch to the Windows build servers. Now the problem is back, and things aren't quite that simple anymore."
This only affects the inbound branch, but from the looks of it new code is no longer being accepted until they can trim things from the build to make it work again. The long term solution is to build the 32-bit binaries on a 64-bit system.
Trying to do too much (Score:5, Insightful)
Re:Trying to do too much (Score:5, Funny)
And this would be the point where the fact that the Firefox devs have been trying to do too much with a "browser" becomes beyond blatantly obvious.
Firefox is a great operating system, if only it included a decent browser :P
Re:Trying to do too much (Score:5, Informative)
Chrome doesn't even try to build with PGO, last time I checked.
http://groups.google.com/a/chromium.org/group/chromium-dev/browse_thread/thread/533e94237691e2f6 [google.com]
Re:Trying to do too much (Score:5, Informative)
Re:Trying to do too much (Score:4, Funny)
Don't worry, I've modded you up!
Oh, wait...
Re:Trying to do too much (Score:4, Interesting)
Re:Trying to do too much (Score:5, Informative)
An excellent takeaway from this article.
Unfortunately, it's completely incorrect. TFA is talking about the build process on a 32-bit host, specifically that VS builds using profile-guided optimization require more memory than is available in the address space *DURING THE BUILD PROCESS*, not an issue encountered by the resulting binary.
I know you want a chance to get in a quick dig at Firefox, but this isn't the article for that.
Re:Trying to do too much (Score:5, Insightful)
I think it's still indicative of the problem GP mentions. The more code you are trying to pull in, the larger the footprint during the build process. You don't see a 'Hello world' program requiring a 3GB+ build footprint, do you? No, because it's not doing enough to warrant that. Likewise, Firefox apparently *is* trying to do a lot -- more than it used to, at any rate.
Re:Trying to do too much (Score:5, Informative)
Well, you're right that Firefox does need a rather large amount of RAM to build... but it's not just them; all browsers are trying to do a lot nowadays.
Chrome doesn't exactly have light build requirements either. In fact, the Chromium project already seems to have dropped 32-bit build environments:
(From "Build Instructions (Windows) - Build Environment" [chromium.org])
That's why I think that the parent poster's implication that it's due to Firefox becoming "bloated" is basically hogwash. Browsers are more complex than they were in the mid-90s. That's what happens when you add 10+ years of new formats and technologies that must be supported for a browser to be considered "usable". Directing one's ire at Firefox is unwarranted, IMHO.
Re:Trying to do too much (Score:4, Insightful)
Just because Chrome is bloated too doesn't mean Firefox isn't.
Re:Trying to do too much (Score:5, Insightful)
Thank you for posting this. Filling the virtual address space of the linker probably does indicate some problems with the Firefox source code - crazy big translation units, for example - but it doesn't imply anything about the size or quality of the resulting binary.
I thought people on this site were supposed to know something about computers.
Re:Trying to do too much (Score:5, Insightful)
The "translation unit" involved here is the whole binary. We're talking about link-time code generation with profile-guided optimization, not regular compiles.
So it doesn't indicate much of anything about the Firefox source code other than general overall quantity of code being compiled...
Re:Trying to do too much (Score:5, Informative)
> And that is to make lots of parts be separately loaded
> modules
This actually leads to pretty serious performance penalties because of the way that web specs tend to interdepend on each other. It also loses you optimization opportunities.
Pieces of functionality that are not interdependent with other stuff (e.g. audio and video backends, webgl) are in fact either in separate libraries or being moved there.
For the other stuff, there used to be more modules. It turned out that in practice most users needed them at startup (to show the browser itself and restore their tabs), so the only thing multiple modules got you was a slower startup and slower runtime code.
It's not an accident that both Firefox and Chrome ship almost all their code in a single library (binary in the case of Chrome). It turns out that for web browsers specifically this works somewhat better than the alternatives.
Oh, I just checked, and it looks like Opera also links all its code into a single library, at least on Mac.
And Safari links all the core WebKit code into a single library.
Not sure what IE does nowadays, but last I checked mshtml.dll in fact included all the actual browser bits.
Now it may be that all the people involved in all these projects can't design worth anything. Or maybe they did some measurements that you haven't done and found that this approach works better....
Re:Trying to do too much (Score:5, Informative)
Define "too much"?
It's been over a year since Chrome had to turn off PGO altogether and move to 64-bit builders even without PGO, because they ran into this same problem.
So maybe your issue is with the fact that all "browsers" as you call them are trying to do too much? They should drop the fast graphics and jits and video and audio support and all that jazz, right?
The code gets larger, and yet things disappear! (Score:4, Funny)
E.g. where's the status bar in recent firefoxes?
Re:The code gets larger, and yet things disappear! (Score:5, Informative)
Re:The code gets larger, and yet things disappear! (Score:5, Insightful)
If SeaMonkey adopts this silly fade-in fade-out floating toolbar-as-status-bar replacement, that'll be the end of the Mozilla line for me. SeaMonkey has been the sensible browser for Netscape-heads ever since the Mozilla Suite died a death and SeaMonkey came into being.
Re: (Score:3)
What do you plan to use otherwise? Chrome does the same thing, so you're stuck with IE and Safari.
Honestly, I cannot fathom what is preferential about an always-open status bar. For me, the status bar was always of such situational use that the first thing I did on a new browser install was disable it. Having it auto-hide is a much better choice. It's there when you need it (which is to say, rarely), and not there when you don't (which is to say, most of the time). I guess if you really want to see a p
Re:The code gets larger, and yet things disappear! (Score:5, Interesting)
The other day a coworker walked in to ask me what I use as my browser. I said Safari. He asked why not Chrome. I told him I had Chrome installed, and used it occasionally. I couldn't remember exactly why I prefer Safari, so I started both of them up. Safari was done launching in a couple of seconds, almost before I had time to click on Chrome. We continued our discussion of why I don't use Chrome while waiting for it to launch.
Checking the .app size... Safari is 35 MB, Chrome is about a quarter of a gig.
interaction of two things (Score:5, Informative)
Size of the Firefox codebase is one factor of course, but the amount of RAM needed by Visual Studio to compile code with all optimizations turned on (especially PGO, which is extra RAM-intensive at the compilation stage) is also a major factor. Notice that this only happens in the 32-bit Visual Studio builds specifically.
Re: (Score:3, Informative)
Re:interaction of two things (Score:5, Informative)
Sure, except that (especially in C++ code with templates) VS uses FAR less memory than the GNU toolchain when compiling the same code. This isn't a VS problem, it's a Firefox problem.
Re:interaction of two things (Score:4, Interesting)
Nope. It's an MS linker problem. It's not about total memory usage; it's about stupidly memory-mapping ALL of the input files during startup. So it's very simple to check whether you may have a problem: add up the sizes of all the .obj files. If the total is above 1-1.5GB or so, it won't fit, as the linker needs address space for its own transient data, and you need to boot with the /3GB switch. If it's above 2GB or so, then even the /3GB switch won't help you -- you need a 64-bit host.
Too big to link (Score:5, Funny)
Firefox devs requesting immediate RAM bailout.
Re:Too big to link (Score:5, Funny)
Occupy Swap Space, 2011!
be there, or be nullified!
Big deal (Score:5, Funny)
It takes 16GB to compile Android.
Compare Chrome. Is it a plug-in, app, or OS? (Score:5, Informative)
Re: (Score:3)
Argh, I meant BIGGER than Windows!
Re: (Score:3)
"You might as well compare apples with a car."
Not a great analogy. Apple might release a car one day.
Re: (Score:3)
Pretty sure the Prius has the Apple-fan car market buttoned up pretty well at this point.
Re:Big deal (Score:5, Interesting)
Looks like they're having similar problems:
https://code.google.com/p/chromium/issues/detail?id=21932 [google.com]
Re:Big deal (Score:5, Informative)
Oh, also, it looks like he was trying to say that compiling Chromium with PGO would use at least 9GB of RAM but he hit Shift too early
Re: (Score:3)
Specifically, GameboyRMH here is referring to comment #10
https://code.google.com/p/chromium/issues/detail?id=21932#c10 [google.com]
Last paragraph in the TFA is... confusing (Score:3, Insightful)
"First tests indicate that, for example, moving parts of the WebGL implementation to one side could save 300 KB. In a test run, the newer version of Visual Studio required less memory than the one that was previously used, and 64-bit Windows offers 4 GB of address space."
So, first of all, saving 300KB on WebGL seems like a pittance. Then, there's what appears to be the blatantly incorrect statement of 64-bit windows offering 4GB of address space - shouldn't that be way bigger, or am I stupid?
Re:Last paragraph in the TFA is... confusing (Score:5, Informative)
http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx#memory_limits [microsoft.com]
Re:Last paragraph in the TFA is... confusing (Score:4, Informative)
32-bit programs (with the large-address-aware flag set) get 4GB of address space on 64-bit Windows, compared to 2GB / 3GB on 32-bit Windows.
Re:Last paragraph in the TFA is... confusing (Score:5, Informative)
Also MSVC only has 32-bit binaries
Clarification: the only linker for 32-bit targets is itself 32-bit.
The linker which targets 64-bit Windows is itself 64-bit. If they had a 64-bit-hosted compiler targeting 32-bit (they have the reverse, but not 64-to-32), that would solve Mozilla's problem. (Well, for some definition of "their problem.")
Re:Last paragraph in the TFA is... confusing (Score:4, Informative)
I don't understand the issue (Score:5, Insightful)
It sounds like a build-tool chain problem, not an issue with the eventual binary produced for Firefox.
Why not just run the 64-bit tools on a 64-bit platform and have them output 32-bit code, as virtually any cross-platform compiler system can do? I can't imagine worrying about not being able to run the builds on an outdated 32-bit OS as long as I can produce binaries for such platforms.
I shudder to think what it would have been like to develop for some of the military communications systems I worked on for my first job if we had to run the compilers on that pathetically slow mil-spec Sperry AN-UYK system (magnetic core memory -- talk about slowing down the CPU! But it's radiation hard!) We only tested on the Sperry -- all builds and editing were done on a VAX.
In modern terms, could you imagine having to run your editors and compilers on an iDevice instead of OS X?
Re: (Score:3)
In modern terms, could you imagine having to run your editors and compilers on an iDevice instead of OS X?
That depends on how long Apple will continue to sell the MacBook Air instead of replacing it with a high-end iPad, but the purported death of the PC is another topic for another day.
Re: (Score:3)
> Why not just run the 64-bit tools on a
> 64-bit platform
There is no 64-bit version of the MSVC linker, and no plans for one according to public statements from Microsoft.
One can run the 32-bit linker on a 64-bit Windows, which gives you 4GB of address space to play with. That's the medium-term solution for Mozilla, but it does require updating the entire Windows build farm to 64-bit Windows and shaking out any compat issues that result. Doable, but will take a few weeks probably.
Re:I don't understand the issue (Score:5, Informative)
There is no 64-bit version of the MSVC linker
Clarification (I've posted this a few times): there's no 64-bit version of MSVC that targets 32 bits. There is a 64-bit version which targets 64 bits, but that doesn't help the current situation.
The problem is VS's PGO architecture (Score:5, Informative)
Summary should read: Visual Studio is too teh suck to link Firefox on Windows with PGO.
Firefox links just fine with VS, if you don't use PGO. The problem is that Visual Studio's PGO routine loads all our object files in at once, then uses a ton of memory on top of that. And the linker for 32-bit programs is itself a 32-bit program; if there were a 64-bit x86-32 linker, we wouldn't have this problem. But so far, Microsoft has not given any indication that they will release a 64-bit to x86-32 cross-compiler.
Note that Chrome doesn't build with PGO at all, last time I checked.
http://groups.google.com/a/chromium.org/group/chromium-dev/browse_thread/thread/533e94237691e2f6 [google.com]
Note: Visual Studio 2010 seems to help a bit, but not much. We use VS 2005 because it's the last version whose CRT supports Windows 2000 and Windows XP before Service Pack 2.
Re: (Score:3)
> We use VS 2005 because it's the last version whose CRT supports Windows 2000 and Windows XP before Service Pack 2.
That's all very nice, but if you run an obsolete, out-of-support OS, why would you want to run the latest browser? Those users should be more than happy with Firefox 3.6.
Re:The problem is VS's PGO architecture (Score:5, Informative)
Not FF's fault (Score:4, Informative)
The software product I work on also used PGO at one point. Our software is roughly 2-3 million lines of C++ code and links with 40-50 external libraries. Under Windows XP, VS2008 could not link our product with PGO turned on.
Before you complain about the bloat of FF, please understand that PGO uses many, many times more memory than compiling a regular release build. This is a Visual Studio linker problem, not a FF problem.
Old timer chimes in (Score:5, Insightful)
Back in my day we didn't have gigabytes of memory, and disk space was at a premium.
It might seem a bit strange now, but back in the good 'ol days we used to have to break up a project into separate components, just in order to compile it!
This is where your interface and API design skills came in handy. If you could partition some piece of the project off into its own DLL, you could effectively break the project into smaller pieces that could be individually compiled.
That's where the name DLL came from originally: "Dynamic link library". You didn't need to have all the code read into memory when you first executed the application - less commonly used features wouldn't get loaded until they were actually asked for.
It's not like it is nowadays, where you actually need all the code to be available all the time. "Rich user experience" they call it.
I suppose it's just the future overtakin' us. Them good old days is gone forever.
Re:Old timer chimes in (Score:5, Informative)
It might seem a bit strange now, but back in the good 'ol days we used to have to break up a project into separate components, just in order to compile it!
Yep, we used to do this. Then we merged them together, because it greatly improves the startup speed of Firefox.
We have a lot of smart people working on this stuff, believe it or not.
Too funny reading these comments (Score:5, Insightful)
Folks are crawling all over each other to show how ignorant they are. "I ditched firefox for Chrome cause it's lighter!" only to ignore the fact that Chrome also has the same problem with the PGO thing running out of RAM so they don't even bother with trying those optimizations anymore. "Geez Firefox needs more RAM than the kernel to compile? Something's wrong!" Yes if the Linux kernel was built with PGO on VS 32-bit it probably would run out of RAM there too. Then there's the guy that claims this PGO problem is evidence that the Firefox devs need to go back to remedial school. I'm sure he could do it far better (and avoid the PGO linking optimization running out of memory too!).
Hilarious reading. At least I choose to laugh rather than cry at people's inability to read and understand the issues here.
Re:Too funny reading these comments (Score:4, Interesting)
I think you miss the subtext that some of us have. If you HAVE to enable PGO in order to get a decent speed out of your binary in the first place (and that's the ONLY way you think you get that increase), then your code isn't that great in the first place (i.e. you are using lots of inefficient methods on the most-used paths of your code).
The problem isn't DIRECTLY related to the size/quality of the codebase but the fact that they don't even CONSIDER turning off PGO because of the performance drop means they have no idea how to tune the underlying code without using PGO (and PGO-optimised code will NOT necessarily result in the best possible code for any particular user at all!)
Size of build != Size of executable (Score:3)
This is basic stuff to anyone who actually maintains a build, but Slashdot hasn't been a forum mostly populated by engineers for a number of years, now.
This appears to be due to link-time optimization blowing up the resident memory size of the linker, taking it past 3GB (which is already a non-standard hack the 32 bit build has had to do). Firefox is large, yes, but this has nothing to do with the final binary - which appears to be about 100MB total including all libraries in the Aurora builds.
I used to routinely run out of 32-bit address space compiling executables for a 64MB embedded ARM platform. This was due to symbol bloat, not executable size (which was 8MB). I also ran out of space compiling for a DSP with 288KB of RAM and 1MB flash, but that was mostly piss-poor tools (Tasking). In fact, don't Chrome and even the Android sources already require building on a 64-bit host?
Re:Wow (Score:4, Insightful)
Some of us actually think that using a web browser is more important than compiling a web browser.
Seriously. My resource usage rarely goes above 1 GB with multiple applications open. These days, the hard drive is a far bigger bottleneck than RAM. Well, unless you're compiling Firefox, it appears.
Re: (Score:3)
People keep saying stuff like this, but in my experience it's just not true. Right now I'm clocking Firefox at 596,900 kB commit + 75,000 kB for the plugin-container.
I have never seen it above 1 GB, although I've only sampled maybe once/month.
Are you using another measuring technique than I, or are you just trolling?
Re: (Score:3)
Iceweasel, 35 tabs open, 1 week uptime, 352MB of resident memory ;)
I still play 8-bit video games (Score:3)
Re:Wow (Score:5, Funny)
He was just trying to say that 1080p gives him a big endian.
Re: (Score:3)
there aren't any other than Acrobat that support layers that can be turned on or off at runtime (which have been supported by Adobe Acrobat since, I believe, v6 or so).
Why would you want to do that? I've never seen this feature used (I'm not saying it doesn't have a use, just that I can't conceive of one).
Re: (Score:3)
Why would you choose slackware and then seek technical help? Are you sure you're their target user?
Re:Wow (Score:5, Insightful)
No, it has nothing to do with running Firefox. It has everything to do with running Visual Studio's linker.
This matters only to Firefox developers.
Not that they shouldn't care, mind you, as that is some seriously monolithic code. But it won't make any difference to Joe Sixpack.
Re:Wow (Score:5, Insightful)
Not that they shouldn't care, mind you, as that is some seriously monolithic code.
They're talking about using link-time optimisation (LTO). That means that you compile each compilation unit to an intermediate representation and then run optimisations on the whole program. This takes a staggering amount of memory (which is why, until very recently, no one bothered with it outside high-end workstations with very expensive compilers), but can sometimes be worth it. It actually helps modularity, because you can keep your source code nicely separated into independent components without worrying about efficiency, and you can do better data hiding.
As a trivial example, consider an accessor method, something like getFoo() { return foo; }. Without LTO, you'd want to put the declaration of the class and this method in the header so that every compilation unit could separately inline it. This reduces modularity, but it saves you the cost of a function call just to access a field in a structure. With LTO, you can make the class opaque (if you're a C++ junkie, using the Pimpl pattern; if you're a C programmer, by just not declaring the struct in the header). You'll get a single copy of the function emitted, and then the link-time optimiser will inline it everywhere.
Re: (Score:3)
Blame MS.
The sane option for Windows from Vista onward would have been to deny users the choice and install a 64-bit OS whenever the hardware supports it.
Personally I have run 64-bit since XP (although you could only call that a test; Vista x64 was ready for everyone).
Yes, I know: the 1 in 16384 people who insist on having some old POC device. Well, they can keep an old machine around for it if it's that important.
The better option would have been to deny users the choice and only offer 64-bit builds of Vista and 7.
"But it doesn't work with my 14 year old scanner!!!" Then keep using Windows XP?
There were already growing pains with Vista (shitty GPU drivers from AMD and Nvidia, a new audio framework, and users not being administrator by default), so it would have been the perfect time to force everyone to move to x64. They couldn't force people to switch with 7 because they wanted to get people off of Vista ASAP (ev
Its the compiler, stupid. (Score:5, Informative)
Or just, yknow, stop running a bloated resource hog of an INTERNET BROWSER.
Read TFA again. Oh, I know this is /.
So read the summary again.
it cannot be compiled with PGO on a 32-bit linker anymore, due to the virtual memory limitation of 3 GB
It is the compiler which is having resource problems. The profile-guided optimiser needs more than 3GB to be able to do its optimisations. And apparently, the Windows it's running on can't do PAE to use more than 3GB either.
Re:Its the compiler, stupid. (Score:5, Informative)
It is the compiler which is having resource problems. The profile-guided optimiser needs more than 3GB to be able to do its optimisations. And apparently, the Windows it's running on can't do PAE to use more than 3GB either.
PAE allows 32-bit computers to use more than 4GB of RAM, but it doesn't allow Windows to assign more than 3GB to any single process.
Re: (Score:3)
More specifically, Windows requires non-overlapping kernel and user address spaces. So does a regular Linux kernel.
There were patches for Linux ( http://lwn.net/Articles/39283/ [lwn.net] ) that implemented a "4G/4G" system with independent user and kernel address spaces, but AFAICT interest in them was largely lost as most newer systems became x64-capable. I doubt anything similar exists for Windows.
Re:Wow (Score:5, Insightful)
Re:Wow (Score:4, Funny)
He was gonna answer that question but he took an arrow to the knee.
Re: (Score:3, Insightful)
You mean some people still run a 32-bit OS?
Not only that, but apparently Windows cannot use PAE - Physical Address Extension [wikipedia.org] to address more than 4GB (according to the WP entry, PAE is supported, but the 4GB limit is still enforced - due to some obscure licensing problems).
The problem is with virtual memory. A process still uses 32-bit memory addresses to reference memory, which means a process can still only address 4GB. If you were to use more than 32-bit memory addresses to get more memory, suddenly you aren't a 32-bit OS anymore.
PAE only helps the OS manage more than 4GB of physical memory.
Re:No PAE?! (Score:5, Informative)
Not only that, but apparently Windows cannot use PAE - Physical Address Extension [wikipedia.org] to address more than 4GB
Sure it can, you just have to either pay for a server edition or hack the restriction out of the kernel.
But more physical address space doesn't help here; the problem is virtual address space for running an effective but memory-hungry profile-guided whole-program optimisation process. Normally, 32-bit Windows allows a maximum of 2GB of virtual memory per process (and this is one big process we are dealing with). This can be increased to 3GB at the cost of reducing the kernel address space to 1GB.
Going to a 64-bit OS (which allows 4GB of virtual address space for 32-bit processes) will buy them a little time, but it's not a long-term solution. Really they need a 64-to-32 cross toolchain (which, according to other posts here, MS does not offer) if they want to keep using profile-guided optimisation as the codebase grows.
Re:Wow (Score:4, Funny)
Re: (Score:3, Insightful)
No, no it doesn't. But I'm impressed: turning this against MS instead of complaining about the real culprit. Kudos.
Re:whose bloat (Score:5, Informative)
Speaking as someone who regularly does large C++ builds, MSVS is nowhere near the worst culprit here. The linker is essentially doing code generation -- link-time optimization. Why? Because LTO gives a substantial performance benefit. The profile-guided optimization mentioned in the summary gets them about 10% over even non-profile-guided LTO.
One project I've worked on has single files which cause GCC to take over 6 GB to compile when you compile with -O2. Who's bloated now?
(Takeaway: broadly speaking, MSVS is actually very competitive, at least compared with GCC, when comparing similar settings.)
Re:whose bloat (Score:5, Interesting)
Re: (Score:3)
"One project I've worked on has single files which cause GCC to take over 6 GB to compile when you compile with -O2. Who's bloated now?"
emm ... those single project files?
Re: (Score:3, Insightful)
Templates out the wazoo. Don't blame me, I didn't design it. (Or write most of it; I'm only tangentially related nowadays.)
Point being, there are features which, while valuable, are costly. I blame our source, our fondness of templates, and the compilation consequences that templates almost necessitate way more than I blame GCC for having a poor implementation of templates.
But similarly, going "stupid bloaty MSVC, it shouldn't support this feature which can give double-digit percentage speedup because it ta
Re: (Score:3, Informative)
You need to go re-read your source on PAE (or tell them to reread their source).
PAE does not increase the memory space available to a single process, so your statement that "So GCC *could* use 6GB on a 32bits machine" is absolutely false. What PAE allows is multiple processes to, in total, take more than 4 GB. (So you could have a 32-bit machine with 6 GB of RAM and have, e.g., two GCC processes, each taking 3 GB with no paging.)
Re:whose bloat (Score:5, Funny)
Re:whose bloat (Score:5, Funny)
Can we turn this into a competitive sport? Please?
Re:Are you serious? (Score:5, Funny)
Re: (Score:3)
Yes, most people still run 32-bit hardware.
Pretty much anything purchased in the last few years is going to be 64-bit capable.
If you're running 32-bit, it probably isn't the hardware holding you back. It's probably your software.
I'm still having to reload machines with 32-bit Windows XP because we've got software that won't support anything else.
Re: (Score:3)
If "most people still run 32-bit hardware", then surely the "reality of the present" is that 32-bit builds are needed.
If Mozilla abandons 32-bit builds, then whoever eventually steps up to maintain these unofficial 32-bit builds will have the same problem. And all the 32-bit users who go to getfirefox.com will get turned away to some random 3rd party site? I'm sure that will help firefox's popularity.
As you say yourself, 32-bit Windows is far from obsolete. So it would be pretty retarded to just abandon the
Re:Time to move on, perhaps? (Score:5, Informative)
No, the long-term solution involves freezing the 32-bit version as an eternal final-state "stable" branch, and moving on to the 64-bit world.
Um.
Some 90% of our users are on 32-bit Windows. Just because *you're* not one of them doesn't mean that they don't matter.
It's nice that you don't expect us to support your aging XP boxes, but I think you'd find you're the minority in this respect.
(Also, all phones are 32-bit, and will be for at least the next few years.)
Re: (Score:3)
That's 90% of your users. Though Steam says [steampowered.com] that nearly 45% of their users are using a 64-bit OS, and the share is increasing at about 1% a month.
Re:Time to move on, perhaps? (Score:5, Informative)
I am a Mozilla guy.
There's an official 64-bit version for Linux. We've been shipping that since before I can remember. There are also nightly builds for 64-bit Windows, but we're not shipping these even as Aurora at the moment.
64-bit Linux isn't listed on most of our download pages. I'd argue it should be there, but I'm not in charge. :)
Anyway, here are links to get all the builds we produce:
Nightly builds: http://nightly.mozilla.org/ [mozilla.org] (has win-64 builds)
Aurora builds: http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/latest-mozilla-aurora/ [mozilla.org]
Beta builds: http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/9.0b6-candidates/build1/ [mozilla.org] (I don't know the directory for the latest beta build, unfortunately, so you'll have to update this URL each time you go looking.)
Release builds: http://releases.mozilla.org/pub/mozilla.org/firefox/releases/latest/ [mozilla.org]
Re: (Score:3)
Re:Time to move on, perhaps? (Score:4, Insightful)
Possibly not. But part of the reason is that it is not possible [microsoft.com] to set 64-bit IE9 as the default browser on Windows Vista or 7.
That, and no official releases of 64-bit Firefox, Chrome, (Safari?) or Opera for Windows. So it's really a team effort to keep 64-bit browsing off Windows.
Re:Time to move on, perhaps? (Score:5, Informative)
No, you missed a fact.
Visual Studio 2005 is a 32-bit app on any Windows platform.
Visual Studio 2010 is a 32-bit app when running on a 32-bit platform.
Mozilla builds on 32-bit platforms can no longer support the PGO linker.
Visual Studio 2010 is a 64-bit app on a 64-bit platform.
Mozilla builds on 64-bit platforms can PGO link just fine.
When Visual Studio 64-bit compiles a 32-bit app, that app can run only on XP SP2 or later.
Mozilla has millions of users on pre-XP SP2 platforms.
So Mozilla has a choice: change nothing but stop PGO linking the 32-bit versions (sub-optimum for 32-bit users), go forward on a 64-bit only path and disenfranchise the old users from getting any new functions, abandon them completely (which is irresponsible in terms of security), cut back on new features for all, or take an axe to the existing code. Only one is an easy choice.
Re: (Score:3)
Special.
Third choice: not using a broken toolchain.
Re:Time to move on, perhaps? (Score:4, Informative)
Re:Time to move on, perhaps? (Score:4, Insightful)
If you say so, but if MS used PGO for things like MS Office, IE, or even the windows explorer shell I'm sure they'd quickly run out of RAM as well.
So no, Firefox is not more bloated than "*ANY* component within Windows."
Seriously, anyone with the "slightest clue" about what the article is talking about would understand that this issue is not about "deep-rooted problems" in Firefox. And as the other poster mentions, MS knows about this problem and now ships 64-bit binaries of the compiler and linker, so this is less of a problem. However, it only addresses the issue for 64-bit apps; the 64-bit binaries don't appear capable of compiling and linking 32-bit apps, if I read this correctly.
Re:Time to move on, perhaps? (Score:4, Informative)
The MSVC compiler is a 32-bit program
NO:
C:\vs10\VC\bin\amd64>link /dump /headers link.exe
Microsoft (R) COFF/PE Dumper Version 10.00.40219.01
Copyright (C) Microsoft Corporation. All rights reserved.
Dump of file link.exe
PE signature found
File Type: EXECUTABLE IMAGE
FILE HEADER VALUES
8664 machine (x64)
Re: (Score:3)
Atom processors are 32-bit... and they are netbooks, mostly.
Netbook Atoms have been 64-bit since late 2008. Everything except the original Diamondville single-core does 64-bit.
The only current atoms that don't do 64-bit are the ultra-low-power Z series, which are for UMPCs and tablets.
Re:VS 2005? (Score:5, Insightful)
Seems ironic that the FF team is using stuff from seven years and two major versions ago while at the same time bemoaning that anybody might want to keep a version of Firefox for more than 6 weeks - especially enterprise users.
Interesting how they don't practice what they preach.
Re:VS 2005? (Score:5, Informative)
Point taken.
But FYI: We use VS2005 because it's the last version whose CRT supports Windows XP before SP2 and Windows 2000.
We would love to upgrade, and are in fact devoting a lot of engineering time towards figuring out if we can upgrade while maintaining compatibility.
Re:VS 2005? (Score:5, Informative)
One of the biggest problems with newer versions is the runtimes using EncodePointer/DecodePointer, which aren't available pre-SP2.
Try this:
* Install VS2010
* In your project configuration(s), under Configuration Properties->General, set "Platform Toolset" to "v90".
I can use this configuration with VS2010 and the latest Windows SDK and get binaries that work on XP pre-SP2.
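For reference, the setting described above corresponds to a single property in the project file. A minimal .vcxproj fragment might look like this (property names are from the standard MSBuild C++ schema; the surrounding project file is assumed):

```xml
<PropertyGroup Label="Configuration">
  <ConfigurationType>Application</ConfigurationType>
  <!-- Use the VS2008 (v90) compiler, linker, and CRT instead of the VS2010 (v100) toolset -->
  <PlatformToolset>v90</PlatformToolset>
</PropertyGroup>
```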
Re:VS 2005? (Score:4, Informative)
I think what he was trying to say -- in the meanest possible way -- is that the setting you chose tells VS to use the old compiler and linker. It doesn't switch out the CRT -- it switches out the whole toolchain. So using that setting is no different from where we are now, afaik.
Re: (Score:3)
What, exactly, would be wrong with just using gcc for all platforms, like an awful lot of projects do? What's VC++ doing for Firefox that can't be done any other way?
Re:VS 2005? (Score:5, Interesting)
I can't speak to Firefox specifically, but in my hands, for my project, VC++ produces code that is VASTLY superior to gcc. With gcc, I can often get significant speedup by hand-optimizing code; with VC++, my bog-simple code gets automatically optimized better than my most aggressive manual efforts. Like it or not, Microsoft has the currently best compiler.
Re:VS 2005? (Score:4, Interesting)
Then I find that more worrying than anything. The Firefox code is so bad that not only do they have to do profile-guided optimisation, but they need the best compiler with all the options turned up just to get it to run like the stunned sloth that it does on my systems?
It just reeks of horrendous code. Makes me wonder what the hell all those other large open-source projects are doing that's so much better than the Firefox code that they can outperform it using "only" the sub-optimal gcc.
Re:Oh, now we admit it is getting bloated (Score:5, Insightful)
I'd say it nicer but then I'd risk not hitting +5 Funny.
Re:Eg? (Score:5, Insightful)
> 1) What the hell are you doing with your code to be
> that large?
How large? It's a few million lines of C++ code, just like every other browser. What it's doing is implementing all the stuff people want to do on the web.
> 2) What the fuck is your linker doing to do that?
Please read up on link-time code generation.
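For anyone who hasn't read up on it: with MSVC, PGO sits on top of link-time code generation, so the link step is where all the whole-program codegen (and hence the memory use) happens. A rough sketch of the flow, using flag names from the VS2005-2010 era toolchain (source file names and the workload flag are illustrative):

```shell
REM Compile with /GL: objects carry intermediate code, deferring codegen to link time
cl /c /O2 /GL app.c util.c

REM First link: produce an instrumented binary plus a .pgd profile database
link /LTCG:PGINSTRUMENT app.obj util.obj /OUT:app.exe

REM Run representative scenarios; each run writes a .pgc count file
app.exe --typical-workload

REM Final link: redo whole-program codegen guided by the collected profiles
link /LTCG:PGOPTIMIZE app.obj util.obj /OUT:app.exe
```

That final /LTCG link is effectively recompiling the entire program inside the linker process, which is why a 3 GB address space can run out.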
> 3) Why the hell didn't you see this coming and
> prune LONG before you hit the 3Gb limit if you
> already hit the 2Gb limit once already?
This is an excellent question that I too am asking.
> 4) What's the problem with compiling on 64-bit
> computers only,
None, except updating a large build farm from 32-bit to 64-bit can't happen overnight. Needs some staging, testing, etc.
> You're honestly telling me that Firefox is more
> complicated and needs more memory to compile
> than, say, LibreOffice?
Have you tried to compile LibreOffice with LTO and PGO turned on?
> The Linux kernel?
Quite possibly. The kernel is C, not C++; C++ is a lot more of a pain for compilers to deal with.
The whole point of LTO is that you optimize your entire binary as a single object, no matter how your code is structured. It requires more memory, but can produce faster code because the compiler is able to make optimizations it can't make otherwise.
Whether that's "crappy" is a matter of your priorities, of course. 10-25% performance improvement tends to be a high priority for web browsers, though perhaps not for KDE or LibreOffice.