'We're Approaching the Limits of Computer Power -- We Need New Programmers Now' (theguardian.com) 306
Ever-faster processors led to bloated software, but physical limits may force a return to the concise code of the past. John Naughton: Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit." We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called "cores" -- in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.
But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart. There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed.
So... (Score:5, Insightful)
they are looking for "real" programmers, like those that used to optimize code, use different algorithms to save RAM, speed I/O, etc.
Basically, they're asking for "real" programmers.
Re: (Score:2)
They do. And while there are some in the younger generations, most of these have been educated to not only accept the bloat, but to see it as normal. There is a reason why C is in the top-3 TIOBE index languages, and it may not be only because of IoT and small embedded systems. I also hear there is a C compiler targeting WebAssembly...
Re:So... (Score:5, Interesting)
In my previous job, I had to deal with new developers who didn't understand that computer resources were not unlimited.
They often caused problems by, say, sending a 100 GB file into a process that was designed to handle a 1 GB file. Then wondered why everything bogged down.
I grew up with computers that had 8-32K, so even though I deal with computers with literally 1000+ times more resources, I still remember that resources are not unlimited and design my software accordingly (and my software tends to not cause problems either).
Re:So... (Score:4, Informative)
It's literally a million times more resources. A machine with 64 GB of RAM has a million times more resources than something like a Commodore 64 (which had 64 KB of memory). To put it another way, if you had 1,000 users hitting your website at once, you could use 64 MB for each request, which is more than most people had for running the entirety of Windows 95, and that's just to serve a single request.
Re: (Score:3)
> I grew up with computers that had 8-32K, so even though I deal with computers with literally 1000+ times more resources,
I grew up in the era of 48K and 64K as well, specifically the Apple ][+, but it is actually much worse than 1000+ times!!!
CPU:
The Apple 2's 6502 ran at 14.31818 / 14 = ~1.022 MHz. Log2( 1.022 ) = ~0.03.
Today's CPUs run at, say, 4.5 GHz. Log2( 4500 ) = ~12.1.
CPUs are over 12 orders of magnitude faster! i.e. 1.022 MHz * 2^12 = ~4189 MHz.
RAM:
The Apple ][+ had 64K (48 KB RAM + 16 KB Language
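For what it's worth, the clock-only arithmetic above checks out. Here is a tiny C sketch that recomputes it (the 4.5 GHz figure is just an assumed modern boost clock, and the real gap is far larger once IPC, caches, and core counts are included); build with something like cc clockcheck.c -lm:

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double apple2_mhz = 14.31818 / 14.0;  /* NTSC-derived 6502 clock, ~1.022 MHz */
        double modern_mhz = 4500.0;           /* assumed modern boost clock, ~4.5 GHz */
        double ratio = modern_mhz / apple2_mhz;
        printf("clock ratio: ~%.0fx, i.e. %.1f doublings\n", ratio, log2(ratio));
        return 0;
    }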
Re:So... (Score:5, Informative)
No [wikipedia.org]:
An order of magnitude is an approximate measure of the number of digits that a number has in the commonly-used base-ten number system.
It's a mix of when it does and doesn't matter... if I'm storing one counter on a CPU with a 64-bit native word, then let's not think too hard about whether it needs to be a short or int or long. If I'm storing a hundred million of them, okay, let's think about it.
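A minimal C sketch of that trade-off (the hundred-million figure is just the hypothetical above): for a single counter the width is noise, but multiplied out the choice of type is suddenly hundreds of megabytes.

    #include <stdint.h>
    #include <stdio.h>

    #define N 100000000ULL   /* one hundred million counters */

    int main(void) {
        int lone_counter = 0;   /* a single counter: the width doesn't matter */
        (void)lone_counter;

        /* a hundred million counters: the width matters a lot */
        printf("int64_t: %llu MB\n", (N * sizeof(int64_t)) / (1024 * 1024));
        printf("int32_t: %llu MB\n", (N * sizeof(int32_t)) / (1024 * 1024));
        printf("int16_t: %llu MB\n", (N * sizeof(int16_t)) / (1024 * 1024));
        return 0;
    }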
why do Capitalists hate the Market? (Score:5, Insightful)
We need programmers now!
PAY THEM.
Re:why do Capitalists hate the Market? (Score:5, Insightful)
We need programmers now!
PAY THEM.
Now just hold on there, you crazy radical! If we pay them we might not be able to afford a 3rd yacht!!
Re: (Score:2)
And while there are some in the younger generations, most of these have been educated to not only accept the bloat, but to see it as normal
The vast majority of "programmers" are simply writing interfaces to databases. The only difference is what platform is being used. When I was in high school, people were writing GUI interfaces in MS Access and calling it working software. Now it's Java/PHP.
I've given up calling myself a programmer since I'm much slower writing code than the usual dev crowd. Mind you I've written multiplayer server software that managed to handle 1000 users on ancient hardware where my competition couldn't manage to g
Re: (Score:3, Insightful)
What's a "real" programmer and what are you optimising for? It's one thing to write small code, but is that the goal? Someone in another thread mentioned that it's sad a modern hello world takes 10MB while we sent someone to a moon on only 64kb. Well that program that sent someone to the moon came close to being lost in time as it only ran on a single model of processor.
If I write hello world in Java it'll take 10MB to execute, but it'll run on any computer, any OS, and some non computer devices as well.
In
Re: (Score:3)
A modern Hello World program maps in program code that's likely already mapped in by many other processes (e.g. the standard C library) and thus doesn't take up additional RAM. It consumes a few hundred bytes of runtime memory in a 4K page for writing its procedure linkage table, plus a stack frame to handle program control.
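As a rough illustration (Linux-specific, and assuming /proc is mounted), a C hello world can print its own memory map to show that most of what it touches is the shared, read-only C library rather than private RAM:

    #include <stdio.h>

    int main(void) {
        puts("hello, world");

        /* Linux-specific: dump this process's own memory map */
        FILE *maps = fopen("/proc/self/maps", "r");
        if (maps) {
            char line[512];
            while (fgets(line, sizeof line, maps))
                fputs(line, stdout);   /* libc.so shows up as a shared mapping */
            fclose(maps);
        }
        return 0;
    }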
Re:So... (Score:5, Insightful)
In the modern world is it more necessary to optimise for speed, or optimise for portability?
Readability should also get some votes.
Re:So... (Score:5, Insightful)
If you wrote a program in 'C' 40 years ago using libc calls, chances are it will still compile and run on just about any platform today with no more than very minor source code tweaks; 40 years ago nobody had even thought of Java. My own experience using Java software is one of frustration with incompatibilities between the different runtime versions. In general I avoid software written in Java.
Java made sense when compilers were proprietary and expensive. Now that compilers are ubiquitous and free, Java is a solution looking for a problem.
Re: (Score:3)
I can agree that Java at least up to 1.5 was changing quite rapidly and it was a challenge to figure out how to handle a new release.
But at least Java has two things that I came to like:
1. The ability to use a "deprecated" declaration so that there's at least a period allowed for migration.
2. That exceptions, aside from runtime exceptions like division by zero and other common mistakes, had to be declared. This is something that C# has completely missed, so there you can get unexpected crashes or just catch "E
Re: (Score:2)
You're missing the point. If you want all your programs to be basic console-based text applications then sure, portability should be possible, though you did mention the word compile, right? A large portion of what we consider waste these days is nothing more than abstraction. You don't need to compile something like Java, or some scripting language, or a web app, because you're fundamentally abstracted from the underlying OS.
My experience with making a C program in Windows years ago was that it doesn't ev
Re: (Score:2)
... In the modern world is it more necessary to optimise for speed, or optimise for portability?
The real answer is "Yes"... ;-)
Re: (Score:2)
It's not a language issue, it's a resource issue. You can write reasonably efficient code in most modern languages given things like JIT compilers etc.
A good example is when programmers duplicate objects without thinking about how large those objects are, keeping all data in RAM before dealing with it instead of iterating over smaller amounts, or reading a whole table into RAM rather than using a WHERE clause.
There is one popular piece of eCommerce software that eats 50 - 75 MB RAM per concurrent user. it
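A minimal C sketch of the iterate-instead-of-slurp point above (the file name argument and the "ERROR" filter are made-up stand-ins for whatever query or filter applies): the working set stays at one small buffer no matter how large the input is.

    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        if (argc < 2) return 1;
        FILE *f = fopen(argv[1], "r");
        if (!f) return 1;

        char line[4096];                       /* one reused buffer */
        unsigned long matches = 0;
        while (fgets(line, sizeof line, f)) {
            if (strstr(line, "ERROR"))         /* the "WHERE clause" equivalent */
                matches++;
        }
        fclose(f);
        printf("%lu matching lines\n", matches);
        return 0;
    }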
Re: (Score:3)
It's about knowing what you're doing, but also about what the computer does with what you tell it.
For example, Microsoft's cool new web technology stack, ASP.NET, works very well and it's easy to code for... but much of that easiness comes from awful programming practices such as reflection. So rather than coding a controller method to accept a set of parameters and compiling the code to turn the HTTP input into a method call (e.g. like RPC systems used to do), they instead rely on reflection to inspect the code at ru
Re: (Score:3, Insightful)
One of my mantras is '!@#$ing Microsoft'
Recently had a teammate show me a simple rewrite of one (Microsoft SQL) line: orig: a is not null or b is not null, new: not (a is null and b is null). Logically equivalent. From two hours to twenty seconds?!?
Not to mention subqueries frequently leading to missed records (fractions of a percent, but when you're dealing with millions/billions of records) ... seriously, WTF kind of crap code in the back-end allows this? To be fair, I've seen the same insanity in Orac
Re: (Score:2)
Whoooosh .. that attitude is also from 20 years ago.
Re: (Score:2)
And yet, the principles of Essbase development, learned ~two decades back, still apply to all the current fancy, trendy BI tools, like OBI (the official descendant) and Tableau (the unofficial, but far friendlier offshoot)
And there are numerous optimization guidelines - both storage and performance - from Assembler that apply to C++.
Those who do not learn from history ... are idiots.
Re: (Score:2)
Real programmers don’t eat quiche. https://www.ee.ryerson.ca/~elf... [ryerson.ca]
Re:So... (Score:4, Insightful)
Yep, as opposed to Javascript kiddies who think pushing a 2gb framework to the client over network is a good idea.
Re: (Score:3)
Problem is, they want filet mignon but pay for a Sloppy Joe.
New programmers (Score:5, Funny)
It's a good thing that we have all those out-of-work coal miners that need jobs, then.
Re:New programmers (Score:5, Insightful)
It's a good thing that we have all those out-of-work coal miners that need jobs, then.
Who ever marked this down, you missed the point. This isn't flame-bait. It's commentary on the idea that programmers are replaceable widgets. That individuals displaced from jobs can just be trained to code, like a dog learning a trick.
Re:New programmers (Score:5, Insightful)
Re:New programmers (Score:4, Insightful)
Many managers think they can slap the wine bottle out of the hands of the vagrant on the front steps of the building and use him to replace a real programmer.
That was basically our hiring method for two years at a Seattle healthcare group I worked at.
Dev 1: I don't like him. He doesn't know anything.
Dev 2: I don't like him. He's a vicious clown with an attitude.
Dev 3: I don't like him. He stabbed Bob in the break room.
Dev 4: I don't like him. His only experience is with MS Access.
Manager: Okay, he's hired!
Re: (Score:3)
There is a saying in German: the knave assumes everyone else is like himself.
Unfortunately, programming is not management. You can't just take any bum from the street and make him do it.
Re: (Score:2)
It's a good thing that we have all those out-of-work coal miners that need jobs, then.
Who ever marked this down, you missed the point. This isn't flame-bait. It's commentary on the idea that programmers are replaceable widgets. That individuals displaced from jobs can just be trained to code, like a dog learning a trick.
Actually, some of the coal miners -can- be trained to code. But the percentage that can is only a bit higher than in the general population.
As usual, the title is wrong (Score:5, Insightful)
You don't just need "New" programmers.
You need Good programmers, who aren't completely dependent on google searches to write a line of code.
Re:As usual, the title is wrong (Score:5, Insightful)
Google is equally indispensable for good programmers. Or do you not recall the good ol' days of rifling through a stack of O'Reilly books?
Re: (Score:2)
LOL@O'Reilly books. All I had was class notes and if I was lucky, a copy of the manual from the manufacturer of whatever I was writing code on.
I asked a professor at a local technical college why they didn't emphasize assembly programming, where you could really gain understanding about what the computer was actually doing. He didn't seem to understand my question.
Re: (Score:2)
Well there are "Programmers" (spelled with a capital "P" as in consummate professional) and there are "programmers" (spelled with alowercase "script-kiddie"). There has been a need for Programmers for some time now as we are drowning in a sea of programmers.
As the old saying went: "Intel giveth and Microsoft taketh away." This is just not sustainable.
Re:As usual, the title is wrong (Score:5, Insightful)
Re: (Score:2)
Not true, you can just use 9x as many programmers and they'll make you a baby in a month.
Not wrong, but... (Score:4, Insightful)
Of course so much CPU power made programmers lazy but let's not forget here that Intel just didn't do their job due to lack of competition.
AMD has proven that there is growth potential both in the core count department and when it comes to IPC.
How much longer that'll allow for Moore's law to drag along is obviously anyone's guess and it would be very interesting to see how much more work could get done if programmers got as creative as they have been in the past...
But let's also not forget that a lot of wasted power is due to security features. We now know what happened when Intel got creative with speculative execution...
Sooooo we'll see, basically?
Re: (Score:3)
How many more cores do you need?
Last year I built an entry-level workstation with a Ryzen 9 3900X. The 24 threads are nice, but are only really utilized in fringe applications like ray-traced 3D rendering (which can be combined with the far more efficient GPU rendering), compiling of larger projects, video encoding, and/or data compression (although that is heavily bottlenecked by I/O limitations of the used
Re: (Score:2)
Of course so much CPU power made programmers lazy but let's not forget here that Intel just didn't do their job due to lack of competition.
I don't think Intel was the only one to blame here. Intel had a problem where they would make architectural improvements and Microsoft would just never bother taking advantage, leaving clock speed as the only differentiator. The classic case of this biting them was the Pentium Pro, which ran faster on 32-bit systems and slower on 16-bit. It sold poorly since Windows 95/98 were still mostly 16-bit code. The situation has only improved because Intel now goes and adds the advanced features to Linux and that mo
Towers of Abstraction (Score:5, Interesting)
Over the last 10 years, the following has happened: (1) The cloud made an "infinite" amount of computing resources available to developers (for a price.) (2) Every single developer has seemingly abandoned platform-native applications for JavaScript monstrosities hosted in web browsers.
All that abstraction and the shoehorning of every programming problem into what was a page rendering engine costs compute resources. Most complex JS projects require libraries on top of libraries on top of frameworks because web development has become so abstract. Shoehorning every single message interaction into HTTPS has consequences as well...native protocols that are optimized for speed are being replaced with flinging of JSON between thousands of endpoints to do a previously simple task.
There's a reason why embedded developers are still sought after... you just don't have the luxury of infinite resources in a tiny low-power ASIC with a few kilobytes of RAM. I thought form factors like the Raspberry Pi would keep things status quo for a while, but increasingly you have devices with smaller batteries and a low power budget, so just being physically small isn't enough.
Re: (Score:3)
Most complex JS projects require libraries on top of libraries on top of frameworks because web development has become so abstract... There's a reason why embedded developers are still sought after...you just don't have the luxury of infinite resources in a tiny low-power ASIC with a few kilobytes of RAM.
I think you've got the right conclusions for the wrong reasons.
Just for the record, JS itself as a language isn't that slow. There's been enough R&D dollars funneled into JIT and other technologies that a JS program does pretty well. We should expect JS code to run sometimes as fast as C++, sometimes half as fast as C++, usually in the middle. It's rarely an order of magnitude slower. https://benchmarksgame-team.pa... [debian.net]
If not the language, then it must be the libraries - the "towers of abstraction" you d
Our Web standards suck (Score:2)
For internal or custom business CRUD applications, what I see is that users want the UI to act like desktop GUIs, but Web standards were not designed for that, and forcing them to be like that has created layered "towers of abstraction" (which often fail over time as browsers change).
For all the talk of mobile business computing, most productivity-oriented applications are still done with a mouse. Mobile-centric standards like Bootstrap waste screen real-e
They don't need "new" programmers (Score:5, Insightful)
What they're asking for is "old" programmers, i.e. those who grew up with extreme hardware limitations.
New programmers should start their education on something small and simple, like an Atmel ATmega328P, i.e. the first generation of Arduinos.
Re: (Score:2)
Sure, but we are talking about teaching people to program efficiently. You won't find a halfway sensible problem that a beginning programmer could face that cannot be solved sufficiently efficiently on contemporary consumer hardware.
Re: (Score:2)
This is why you probably know when to use what sorting algorithm because in this environment, using the right algo for the job mattered.
Stop Writing Bloatware (Score:5, Insightful)
Stop writing bloatware. And by that, I mean stop using 24834 layers of software abstraction to compensate for the fact that you don't know how to write code that does things.
Learn to write code that does things. Don't learn to write code that asks other libraries to do things. Definitely don't write libraries just to ask other libraries to do things.
All modern OSes, including Linux, are guilty of this. There's no reason Ubuntu 19.10 should be sluggish as all hell and take forever to do anything compared to 10.04 on my old Athlon 965 machine.
Re: (Score:2)
Worst example I have come across so far: Pushing about 1MB (no joke) of JavaScript to the client to essentially render a table that would be perfectly fine to be displayed naively in HTML 2.0. Took several seconds to even compile that JavaScript.
Re: (Score:2)
Some sites even scroll with a lag now because of JavaScript. Once in a while I visit Gizmodo (don't know why anymore, it's half sponsored content and social justice). The page has visible lag when scrolling. Turn off JavaScript and it's smooth as ever.
Re: (Score:2)
Most of the time, it's idiotic scripts that try to mess around with the scrolling itself (i.e. javascript-powered "smooth scrolling" or some other bullshit).
Re: (Score:2)
Agreed. Does a modern computer feel any faster than one from 20 years ago? Not one bit. Sure SSDs have made things better but browsing and loading large programs like Photoshop isn't any faster. The hardware is orders of magnitude faster but code is still slow. Go back to the 8088 days and again things are just as responsive.
Re:Stop Writing Bloatware (Score:5, Insightful)
I definitely don't agree with having people not use libraries and write code yourself. I have dealt with too much scientific code that did that. You won't beat BLAS/LAPACK in your own code. Don't even try because you will fail and you will fail by orders of magnitude. The code will be buggier and slower. There are a lot of common high performance libraries that have often been around for decades that you should just not try to write yourself.
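To make that concrete, here is a hedged C sketch of the same matrix multiply written both ways. It assumes a CBLAS implementation such as OpenBLAS is installed and linked (e.g. -lopenblas), and the 512x512 size is arbitrary; on matrices of any real size the tuned library call will typically beat the hand-rolled loop by a wide margin, because of cache blocking and vectorization rather than any bug in the naive version.

    #include <cblas.h>
    #include <stdlib.h>

    /* hand-rolled triple loop, row-major n x n matrices */
    static void naive_dgemm(int n, const double *A, const double *B, double *C) {
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                double sum = 0.0;
                for (int k = 0; k < n; k++)
                    sum += A[i * n + k] * B[k * n + j];
                C[i * n + j] = sum;
            }
    }

    int main(void) {
        const int n = 512;
        double *A = calloc((size_t)n * n, sizeof *A);
        double *B = calloc((size_t)n * n, sizeof *B);
        double *C = calloc((size_t)n * n, sizeof *C);
        if (!A || !B || !C) return 1;

        naive_dgemm(n, A, B, C);                    /* your own code */

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    n, n, n, 1.0, A, n, B, n, 0.0, C, n);   /* the tuned library */

        free(A); free(B); free(C);
        return 0;
    }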
Burger King I.T. is not the answer (Score:5, Insightful)
Here's an idea (Score:2)
As soon as we reach the physical limit ... (Score:4, Interesting)
... we'll go back to tweaking our algorithms. No big deal. Give a programmer a PC that doesn't get any more powerful within a large enough amount of time, and he'll start optimizing. Until then, screw low-level optimisation. I've got pointy-haired bosses who don't want no low-level optimisations and deadlines to meet.
I'm actually looking forward to the time when we are back to squeezing the best out of some low-power Raspberry Pi or something. That will once again make programming a challenge. And then I'll dive into Rust.
For now I'll stick with easy-peasy web languages, thank you.
Blame business (Score:2)
The bottom line is businesses do not want to pay for lean and concise code because it costs money. Businesses are driven to control costs. Thus, they are not interested in stopping bloat. As long as clients continue to buy their products/services nothing will change.
Businesses need to see the benefit of efficient code. Until then nothing will change.
Re: (Score:2)
The rise of cloud is what they will see: because the computing requirements of sloppy code are high, the cost of running it on a per-CPU-hour cloud platform is even higher.
Hiring a programmer to resolve parts of the codebase to perform faster would make a direct impact on the bottom line of a business.
But, unfortunately, much of the sloppy slow code is put there by the same programmers who work for the cloud platforms (hello Microsoft) and I guess they have an incentive to churn code out that performs badl
need a new programming LANGUAGE (Score:2)
Re: need a new programming LANGUAGE (Score:2)
Need programming languages? No, it sounds like we need programmers that aren't just IDE pilots bolting libraries together.
everyone needs to lean to code!! (Score:5, Insightful)
Everyone who actually can code knows the beauty and elegance in efficient, minimal code. Not everyone's cut out for brain surgery; not everyone should be a coder.
Re: (Score:2)
Don't forget the unrealistic deadlines and scope creep; this is another reason why they cut corners and end up with bloated crap.
Economics not Physics (Score:3)
Re: Economics not Physics (Score:2)
That depends on your definition of processor. You can order processors in a 6 pin SOT package that have 24 bytes of data memory and 512 words of program memory. It takes actual skill to get them to do complex things.
Two responses (Score:3, Informative)
1) For things that will NOT be run a gazillion times, it's cheaper overall to write it using efficient-to-program-with but terribly-bloated-at-run-time libraries. Personally, I can live with this.
2) Things that run a gazillion times need to be optimized or you wind up wasting lots of energy - read electricity - as well as a lot of bandwidth. For client-side web code and other code that's going to run in many locations, this almost certainly translates into pollution and other things that hurt everyone.
For #2, I expect in 5-10 years you will see "AI-driven optimization/refactoring" run on code that's expected to be used "a lot" so that the overall energy footprint is reduced. Granted, this is nowhere near "ideally optimized code" but it is going to be a lot better than what's typically done today when it comes to code that uses run-time libraries that are far too "rich" for the task at hand.
The pollution issue will gradually take care of itself as the world moves to non-polluting energy sources, but it's going to be a generation or more before the pollution impact of #2 becomes negligible. In the meantime, "because customers who are dead because they couldn't breathe don't buy our products" is a good reason for companies that put out code that will be run many, many times to want it to be optimized as much as practical.
It is a balance (Score:2)
I worked for a monolithic old company at one point. The history was interesting. They had computers before I was born. A lot of their mainframe coding practices came because of the price of computing power vs the price of the humans that program for it. Some of their legacy code was very difficult to read. This was sort of by design. They were optimizing the code execution time, instead of focusing on making it more readable, maintainable, etc. As processing power became cheaper, their new system reflected
Re: (Score:3)
It's not as simple as that. Let's say your job is designing and building houses. All your life, you have b
Re: (Score:3)
I'm not entirely sure this is even the case - are the metrics between old code and new so different in terms of programmer productivity? I think there is a small difference.
You see, part of the issue is that the old programmers knew their stuff, and could work on an awkward and difficult-to-maintain program just as easily as the modern programmers who throw things together from a Stack Overflow answer and cross their fingers that it works. The old programmers would write complex code, and then document and co
I have an idea (Score:5, Insightful)
Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster.
I've got just the thing: Let's hire programmers based on a census of how many people of various races, ethnicities, sexes, and sexual orientations we have at our company instead of just hiring the best programmers. To make sure this works we need to pass laws that require us to pay all of these programmers the same amount of money regardless of their abilities, experience, and training. As soon as we do that things should clear up quickly.
Re: (Score:2)
This is the most concise description of the madness currently happening everywhere.
Assembler (Score:5, Insightful)
Re: (Score:3)
And note that the x86 instruction set architecture is an utterly obsolete pile of crap.
CPT Grace Murray Hopper USN spoke to the University of Texas at Austin ACM student chapter in 1973. She told the story of a certain installation. They'd started out on an IBM 650, and wrote a bunch of stuff in 650 Autocoder. When IBM brought out the 1401, instead of rewriting everything for the 1401, they wrote a 650 emulator, and kept the old code running. And then she said that the installation was now in "emulation
No, you need OLD programmers (Score:3)
You need the older programmers teaching efficient coding practices.
But, you also need MUCH better memory management in the operating system. Memory allocation functions in high-level languages are convenient but they're also incredibly slow.
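One old-school pattern that addresses this, sketched in C purely as an illustration (the sizes and the bump-allocation strategy are assumptions, not a universal recommendation), is to grab one block up front and hand out pieces of it instead of calling the allocator per object in a hot path:

    #include <stddef.h>
    #include <stdlib.h>

    typedef struct {
        char  *base;
        size_t used;
        size_t cap;
    } arena_t;

    /* bump allocator: no per-object bookkeeping, no per-object free */
    static void *arena_alloc(arena_t *a, size_t n) {
        n = (n + 15) & ~(size_t)15;          /* keep 16-byte alignment */
        if (a->used + n > a->cap) return NULL;
        void *p = a->base + a->used;
        a->used += n;
        return p;
    }

    int main(void) {
        arena_t a = { malloc(1 << 20), 0, 1 << 20 };   /* one 1 MB block, one malloc */
        if (!a.base) return 1;

        for (int i = 0; i < 10000; i++) {
            int *node = arena_alloc(&a, sizeof *node); /* no malloc inside the loop */
            if (node) *node = i;
        }

        free(a.base);                                  /* one free at the end */
        return 0;
    }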
Wow (Score:5, Informative)
"In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. "
Literally nothing about that statement is correct.
1) Bill Gates did not write the BASIC for the TRS-80. Level 1 was based on Tiny BASIC, which was in the public domain, and Level II was a minor port of Altair BASIC.
2) Altair BASIC, aka Level 2, was about 8 kB in size, not 16. If there were expansions on the TRS-80 that brought it to that, they were hardware-related, not BASIC.
The entire argument this article makes, that we need highly crafted software for speed *like these BASICs*, is completely the opposite of what actually happened with these BASICs. They were all optimized for size, not performance, and any number of low-hanging-fruit improvements were ignored to save a few bytes. For instance, one can cache variable names, as in Atari BASIC, or use a fixed table of them, as in BBS BASIC, for a massive speed boost. But that takes maybe 127 bytes. One can pre-process numeric literals, so you don't have to do that at runtime, but that expands them from an average of 3 bytes to 6. While optimizing for size is often the same as performance today, due to caching, it was absolutely not the case in the 70s, and this line of reasoning simply illustrates the author's utter lack of knowledge of the claims he's making.
Re: (Score:2)
The Sinclair ZX81 (sold as the Timex Sinclair 1000 in the USA) actually did that, and programmers who were trying to cram as much code as possible into the tiny RAM would declare variables to hold commonly-used numeric values such as 0 or 1, thus throwing away any speed improvement...
Bill Gates's programming prowess (Score:3)
About Bill Gates writing assembly... there's a story from the early days about how Andy Hertzfeld really wasn't impressed with Bill Gates's abilities as a programmer [folklore.org].
Re: (Score:3, Insightful)
I've read Bill's code - he was a good programmer for someone who was creating software out of whole cloth. My comment would be that some time spent reviewing what other people had coded first for the same problem (before starting his coding) would have improved Mr. Gates' code.
Software complexity (Score:2)
See the Operations and Memory Use (Score:2)
I find it ironic that people largely left assembly language for C for portability, speculating that processor changes were coming that never came.
Object-oriented languages will be the poster boy for wasting memory and CPU cycles, as well they should be, because they hide its use. But even on the Arduino, which is a C variant, I've seen people code by just declaring their variables wherever they are introduced. If the variables aren't created in one general place, you aren't going to be able to devise a system
MenuetOS - All Assembly Operating System (Score:2)
what's old is new (Score:2)
Lotta FUD (Score:5, Insightful)
This is a problem that will naturally take care of itself, the sky is not falling one bit.
A top (if not THE top) rule of optimization is: "don't optimize what you don't need to optimize". The bloat being complained about is mostly pointing out areas in which stuff isn't optimized and doesn't (yet) need to be optimized.
Software will naturally grow to the size of available resources in part because it's inefficient to prematurely optimize where you aren't resource constrained. If time to market is a high priority variable and your desktop app weighs in at 10MB, it probably doesn't make sense to take 25% longer dev time to get it down to 5MB.
Once there's a material benefit/advantage to optimizing away some of this bloat, it'll be taken care of automatically simply because then there is value to doing it. For example, if a mobile app is too hard on the battery, it'll often get poor reviews, causing some number of people to avoid it in favor of another app until the developer fixes the problem - i.e. the app gets optimized for power consumption because there's a good incentive to do it.
In other words (Score:2)
"Writing software is a craft and some people are better at it than others."
So in other words, it's just like every other activity any human engages in. So insightful.
Start with what YOU want from your computer (Score:4, Insightful)
If you want to have the "look and feel" of an overbloated desktop environment where papers fly animated from wiggling folders when you copy your stuff, and especially if you want that in a web browser, you're part of the problem.
Stop with that shit and concentrate on what's required. And suddenly you'll notice that the amount of processing power and ram you have at your disposal is plenty. For nearly anything you could possibly want to do on your desktop.
Let engineers be engineers, force managers to mana (Score:2)
Moore's Law has allowed lazy managers and programmers to cut all corners possible in software development. Managers should get educated about how to run an engineering shop.
4 bite (Score:2)
I've seen people use unsigned chars as loop vars, when that often slows the chip as it needs an extra step to up-promote them. And it doesn't save on stack use generally anyway, as at least 4 bytes are typically reserved on the stack.
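A small C sketch of what that looks like in practice (the function names are invented for illustration): the narrow counter gains nothing on a 32- or 64-bit machine and adds a wrap-around hazard, while a word-sized index is at least as fast and has no surprises.

    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    static int sum_bytes_narrow(const uint8_t *buf, size_t len) {
        int total = 0;
        for (uint8_t i = 0; i < len; i++)   /* promoted for each compare; wraps past 255 */
            total += buf[i];
        return total;
    }

    static int sum_bytes_natural(const uint8_t *buf, size_t len) {
        int total = 0;
        for (size_t i = 0; i < len; i++)    /* natural word-sized index */
            total += buf[i];
        return total;
    }

    int main(void) {
        uint8_t buf[200] = {0};
        buf[0] = 1; buf[199] = 2;
        printf("%d %d\n", sum_bytes_narrow(buf, sizeof buf),
                          sum_bytes_natural(buf, sizeof buf));
        return 0;
    }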
pointless pontificating by a faux-academic (Score:2)
Tight code never stopped mattering, really .... (Score:2)
Truth is, there's a whole lot of sloppy, bloated, inefficient code in use every day because it gets things done that people need to get done. They *tolerate* the performance problems because they don't really have a choice. But really, it needed optimization YEARS ago, in order to work the way it should have really been expected to work.
I give you, as one example that hits close to home: Microsoft Great Plains ERP! That financial accounting package is absolutely awful when it comes to its performance over
We need smarter chips too... (Score:2)
Re: (Score:2)
Given the bloat you have to deal with to just get an OS booted, this IS limited hardware.
Re: (Score:2)
Screw what is considered "limited hardware". We're talking about educating programmers here; their opinion doesn't matter.
Start them on an ATmega328P, which is extremely similar to, and about as limited as, the first home computers.