Programming Hardware Technology

'We're Approaching the Limits of Computer Power -- We Need New Programmers Now' (theguardian.com) 306

Ever-faster processors led to bloated software, but physical limits may force a return to the concise code of the past. John Naughton: Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit." We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called "cores" -- in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart. There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed.

  • So... (Score:5, Insightful)

    by franzrogar ( 3986783 ) on Monday January 13, 2020 @10:54AM (#59615472)

    they are looking for "real" programmers, like those who used to optimize code, use different algorithms to save RAM, speed up I/O, etc.

    Basically, they're asking for "real" programmers.

    • by gweihir ( 88907 )

      They do. And while there are some in the younger generations, most of these have been educated to not only accept the bloat, but to see it as normal. There is a reason why C is in the top-3 TIOBE index languages, and it may not be only because of IoT and small embedded systems. I also hear there is a C compiler for WebAssembly...

      • Re:So... (Score:5, Interesting)

        by rlauzon ( 770025 ) on Monday January 13, 2020 @11:20AM (#59615588)

        In my previous job, I had to deal with new developers who didn't understand that computer resources were not unlimited.

        They often caused problems by, say, sending a 100 GB file into a process that was designed to handle a 1 GB file. Then wondered why everything bogged down.

        I grew up with computers that had 8-32K, so even though I deal with computers with literally 1000+ times more resources, I still remember that resources are not unlimited and design my software accordingly (and my software tends to not cause problems either).

        • Re:So... (Score:4, Informative)

          by CastrTroy ( 595695 ) on Monday January 13, 2020 @12:11PM (#59615820)

          It's literally a million times more resources. A machine with 64 GB of RAM has a million times more resources than something like a Commodore 64 (which had 64 KB of memory). To put it another way, if you had 1000 users hitting your website at once, you could use 64 MB for each request, which was more than most people had for running the entirety of Windows 95, and that's just to serve a single request.

        • > I grew up with computers that had 8-32K, so even though I deal with computers with literally 1000+ times more resources,

          I grew up in the era of 48K and 64K as well, specifically the Apple ][+, but it is actually much worse than 1000+ times!!!

          CPU:

          The Apple 2's 6502 ran at 14.31818 / 14 = ~1.022 MHz. Log2( 1.022 ) = ~0.03.
          Today's CPUs run at say 4.5 GHz. Log2( 4500 ) = ~12.1.

          CPUs are over 12 orders of magnitude faster! i.e. 1.022 MHz * 2^12 = ~4186 MHz.

          RAM:

          The Apple ][+ had 64K (48 KB RAM + 16 KB Language Card)

          • Re:So... (Score:5, Informative)

            by Kjella ( 173770 ) on Monday January 13, 2020 @01:37PM (#59616178) Homepage

            No [wikipedia.org]:

            An order of magnitude is an approximate measure of the number of digits that a number has in the commonly-used base-ten number system.

            It's a mix of when it does and doesn't matter... if I'm storing one counter on a 64-bit native CPU, then let's not think too hard about whether it needs to be a short or int or long. If I'm storing a hundred million of them, okay, let's think about it.
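            To put that in concrete numbers, a minimal C sketch (assuming a typical 64-bit Linux toolchain; the figures are illustrative):

            ```c
            #include <stdint.h>
            #include <stdio.h>

            #define N 100000000UL   /* one hundred million counters */

            int main(void)
            {
                /* One counter: the width is noise, pick whatever is natural. */
                int64_t one_counter = 0;

                /* A hundred million counters: the width dominates the footprint. */
                printf("as int64_t: %lu MB\n", N * sizeof(int64_t) / (1024 * 1024));
                printf("as int16_t: %lu MB\n", N * sizeof(int16_t) / (1024 * 1024));
                /* roughly 762 MB vs 190 MB */
                return (int)one_counter;
            }
            ```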

      • by Thud457 ( 234763 ) on Monday January 13, 2020 @11:51AM (#59615754) Homepage Journal

        We need programmers now!

        PAY THEM.

      • by gmack ( 197796 )

        And while there are some in the younger generations, most of these have been educated to not only accept the bloat, but to see it as normal

        The vast majority of "programmers" are simply writing interfaces to databases. The only difference is what platform is being used. When I was in high school, people were writing GUI interfaces in MS Access and calling it working software. Now it's Java/PHP.

        I've given up calling myself a programmer since I'm much slower writing code than the usual dev crowd. Mind you, I've written multiplayer server software that managed to handle 1000 users on ancient hardware where my competition couldn't manage to g

    • Re: (Score:3, Insightful)

      by thegarbz ( 1787294 )

      What's a "real" programmer and what are you optimising for? It's one thing to write small code, but is that the goal? Someone in another thread mentioned that it's sad a modern hello world takes 10 MB while we sent someone to the moon on only 64 KB. Well, that program that sent someone to the moon came close to being lost to time, as it only ran on a single model of processor.

      If I write hello world in Java it'll take 10 MB to execute, but it'll run on any computer, any OS, and some non-computer devices as well.

      In the modern world is it more necessary to optimise for speed, or optimise for portability?

      • A modern Hello World program maps in program code that's likely mapped in by many other processes (e.g. standard C library) and thus doesn't take up additional RAM. It consumes a few hundred bytes of runtime memory in a 4K page writing its procedural linkage tables, and a stack frame to handle program control.
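        For the curious, a hedged illustration of that point on Linux: a hello world that prints its own memory map, where most of the mapped code (libc, the dynamic loader) shows up as read-only pages shared with other processes. The /proc path is Linux-specific; this is a sketch, not a universal claim.

        ```c
        #include <stdio.h>

        int main(void)
        {
            puts("Hello, world");

            /* Linux-specific: dump this process's own memory map. Most of
               the mapped code (libc, the dynamic loader) consists of
               read-only, shareable pages that other processes reuse. */
            FILE *f = fopen("/proc/self/maps", "r");
            if (f) {
                int c;
                while ((c = fgetc(f)) != EOF)
                    putchar(c);
                fclose(f);
            }
            return 0;
        }
        ```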

      • Re:So... (Score:5, Insightful)

        by Paradise Pete ( 33184 ) on Monday January 13, 2020 @11:34AM (#59615658) Journal

        In the modern world is it more necessary to optimise for speed, or optimise for portability?

        Readability should also get some votes.

      • Re:So... (Score:5, Insightful)

        by jrbrtsn ( 103896 ) on Monday January 13, 2020 @11:40AM (#59615706)

        If you wrote a program in 'C' 40 years ago using libc calls, chances are it will still compile and run on just about any platform today with no more than very minor source code tweaks; 40 years ago nobody had even thought of Java. My own experience using Java software is one of frustration with incompatibilities between the different runtime versions. In general I avoid software written in Java.
        Java made sense when compilers were proprietary and expensive. Now that compilers are ubiquitous and free, Java is a solution looking for a problem.
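        To illustrate the portability claim, a sketch of the kind of program meant: nothing below is newer than 1989-vintage ANSI C and libc, so it should build unmodified with essentially any compiler of the past few decades (the program itself is a made-up example):

        ```c
        #include <stdio.h>

        /* Count lines and characters on standard input -- plain ANSI C,
           nothing but libc, no platform assumptions. */
        int main(void)
        {
            long lines = 0, chars = 0;
            int c;

            while ((c = getchar()) != EOF) {
                chars++;
                if (c == '\n')
                    lines++;
            }
            printf("%ld lines, %ld characters\n", lines, chars);
            return 0;
        }
        ```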

        • by Z00L00K ( 682162 )

          I can agree that Java, at least up to 1.5, was changing quite rapidly and it was a challenge to figure out how to handle a new release.

          But at least Java has two things that I came to like:
          1. The ability to use a "deprecated" declaration so that there's at least a period allowed for migration.
          2. That exceptions, aside from runtime exceptions like division by zero and other common mistakes, had to be declared. This is something that C# has completely missed, so there you can get unexpected crashes or just catch "E

        • You're missing the point. If you want all your programs to be basic console-based text applications then sure, portability should be possible, though you did mention the word compile, right? A large portion of what we consider waste these days is nothing more than abstraction. You don't need to compile something like Java, or some scripting language, or a web app, because you're fundamentally abstracted from the underlying OS.

          My experience with making a C program in Windows years ago was that it doesn't ev

      • ... In the modern world is it more necessary to optimise for speed, or optimise for portability?

        The real answer is "Yes"... ;-)

      • by gmack ( 197796 )

        It's not a language issue, it's a resource issue. You can write reasonably efficient code in most modern languages given things like JIT compilers etc.

        A good example is when programmers duplicate objects without thinking about how large those objects are, keeping all data in RAM before dealing with it instead of iterating over smaller amounts, or reading a whole table into RAM rather than using a WHERE clause.

        There is one popular piece of eCommerce software that eats 50-75 MB of RAM per concurrent user. it
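        A minimal sketch of the alternative being pointed at here: stream the data instead of slurping it, so a 100 GB input needs no more memory than a 1 GB one. The file name and per-line work are hypothetical stand-ins.

        ```c
        #include <stdio.h>

        int main(void)
        {
            char line[4096];
            long n = 0;

            /* "huge.log" is a placeholder input of arbitrary size. */
            FILE *f = fopen("huge.log", "r");
            if (!f) {
                perror("huge.log");
                return 1;
            }
            /* One buffer, reused per record: memory use stays constant
               no matter how big the file is. */
            while (fgets(line, sizeof line, f))
                n++;
            fclose(f);
            printf("%ld lines processed\n", n);
            return 0;
        }
        ```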

      • It's about knowing what you're doing, but also about what the computer does with what you tell it.

        For example, Microsoft's cool new web technology stack, ASP.NET, works very well and it's easy to code for... but much of that easiness comes from awful programming practices such as reflection. So rather than coding a controller method to accept a set of parameters and compiling the code to turn the HTTP input into a method call (e.g. like RPC systems used to do) they instead rely on reflection to inspect the code at ru

    • Re: (Score:3, Insightful)

      by lrichardson ( 220639 )

      One of my mantras is '!@#$ing Microsoft'

      Recently had a teammate show me a simple rewrite of one (Microsoft SQL) line: original: a IS NOT NULL OR b IS NOT NULL; new: NOT (a IS NULL AND b IS NULL). Logically equivalent. From two hours to twenty seconds?!?

      Not to mention subqueries frequently leading to missed records (fractions of a percent, but when you're dealing with millions/billions of records) ... seriously, WTF kind of crap code in the back-end allows this? To be fair, I've seen the same insanity in Oracle

      • by sosume ( 680416 )

        Whoooosh .. that attitude is also from 20 years ago.

        • And yet, the principles of Essbase development, learned ~two decades back, still apply to all the current fancy, trendy BI tools, like OBI (the official descendant) and Tableau (the unofficial, but far friendlier offshoot).

          And there are numerous optimization guidelines - both storage and performance - from Assembler that apply to C++.

          Those who do not learn from history ... are idiots.

      • by Bengie ( 1121981 )
        MS-SQL really doesn't like OR statements. It can cause the query plan to create multiple queries, one for each OR, or no longer use indexes. If I really need control, sometimes I just build up the query by inserting into temp tables. This removes the query optimizer's ability to do stupid stuff.
    • by sprins ( 717461 )

      Real programmers don’t eat quiche. https://www.ee.ryerson.ca/~elf... [ryerson.ca]

    • Re:So... (Score:4, Insightful)

      by Marxist Hacker 42 ( 638312 ) * <seebert42@gmail.com> on Monday January 13, 2020 @11:53AM (#59615768) Homepage Journal

      Yep, as opposed to JavaScript kiddies who think pushing a 2 GB framework to the client over the network is a good idea.

    • Problem is, they want filet mignon but pay for a Sloppy Joe.

  • by aardvarkjoe ( 156801 ) on Monday January 13, 2020 @10:54AM (#59615474)

    It's a good thing that we have all those out-of-work coal miners that need jobs, then.

    • Re:New programmers (Score:5, Insightful)

      by BravoZuluM ( 232200 ) on Monday January 13, 2020 @11:34AM (#59615660)

      It's a good thing that we have all those out-of-work coal miners that need jobs, then.

      Whoever marked this down, you missed the point. This isn't flame-bait. It's commentary on the idea that programmers are replaceable widgets. That individuals displaced from jobs can just be trained to code, like a dog learning a trick.

      • Re:New programmers (Score:5, Insightful)

        by burtosis ( 1124179 ) on Monday January 13, 2020 @11:52AM (#59615758)
        Many managers think they can slap the wine bottle out of the hands of the vagrant at the front steps of the building and replace a real programmer. Business schools reinforce this thinking. This is quite often the same management that takes the savings in bonuses for themselves and switches companies before the eventual implosion.
        • Re:New programmers (Score:4, Insightful)

          by JustAnotherOldGuy ( 4145623 ) on Monday January 13, 2020 @12:40PM (#59615948) Journal

          Many managers think they can slap the wine bottle out of the hands of the vagrant at the front steps of the building and replace a real programmer.

          That was basically our hiring method for two years at a Seattle healthcare group I worked at.

          Dev 1: I don't like him. He doesn't know anything.
          Dev 2: I don't like him. He's a vicious clown with an attitude.
          Dev 3: I don't like him. He stabbed Bob in the break room.
          Dev 4: I don't like him. His only experience is with MS Access.
          Manager: Okay, he's hired!

      • There is a saying in German: the knave thinks everyone is the way he is.

        Unfortunately, programming is not management. You can't just take any bum from the street and make him do it.

      • It's a good thing that we have all those out-of-work coal miners that need jobs, then.

        Whoever marked this down, you missed the point. This isn't flame-bait. It's commentary on the idea that programmers are replaceable widgets. That individuals displaced from jobs can just be trained to code, like a dog learning a trick.

        Actually, some of the coal miners -can- be trained to code. But the percentage that can is only a bit higher than in the general population.

  • by bobstreo ( 1320787 ) on Monday January 13, 2020 @10:55AM (#59615478)

    You don't just need "New" programmers.

    You need Good programmers, who aren't completely dependent on google searches to write a line of code.

    • by Mordaximus ( 566304 ) on Monday January 13, 2020 @11:26AM (#59615616)

      Google is equally indispensable for good programmers. Or do you not recall the good ol' days of rifling through a stack of O'Reilly books?

      • LOL@O'Reilly books. All I had was class notes and if I was lucky, a copy of the manual from the manufacturer of whatever I was writing code on.

        I asked a professor at a local technical college why they didn't emphasize assembly programming, where you could really gain understanding about what the computer was actually doing. He didn't seem to understand my question.

      • And 90% of the time, google sends me to stackexchange for answers. The other 10% it finds the actual man pages.
    • Well, there are "Programmers" (spelled with a capital "P", as in consummate professional) and there are "programmers" (spelled with a lowercase "script-kiddie"). There has been a need for Programmers for some time now, as we are drowning in a sea of programmers.

      As the old saying went: "Intel giveth and Microsoft taketh away." This is just not sustainable.

    • by avandesande ( 143899 ) on Monday January 13, 2020 @01:00PM (#59616050) Journal
      Converting PHBanese for you. 'New Programmers' is code for 'More H1B slots'.
    • by Njovich ( 553857 )

      Not true, you can just use 9x as many programmers and they'll make you a baby in a month.

  • Not wrong, but... (Score:4, Insightful)

    by Kokuyo ( 549451 ) on Monday January 13, 2020 @10:56AM (#59615486) Journal

    Of course so much CPU power made programmers lazy but let's not forget here that Intel just didn't do their job due to lack of competition.

    AMD has problem that there is growth potential both in the core department but also when it comes to IPC.

    How much longer that'll allow for Moore's law to drag along is obviously anyone's guess and it would be very interesting to see how much more work could get done if programmers got as creative as they have been in the past...

    But let's also not forget that a lot of wasted power is due to security features. We now know what happened when Intel got creative with speculative execution...

    Sooooo we'll see, basically?

    • by Kokuyo ( 549451 )

      "problem" should be "proven"... sorry, I can't even blame autocorrect... this was written on a PC ;).

    • by fazig ( 2909523 )

      AMD has problem that there is growth potential both in the core department but also when it comes to IPC.

      How many more cores do you need?

      Last year I built an entry-level workstation with an R9 3900X. The 24 threads are nice, but are only really utilized in fringe applications like ray-traced 3D rendering (which can be combined with the far more efficient GPU rendering), compiling of larger projects, video encoding, and/or data compression (although that is heavily bottle-necked by I/O limitations of the used

    • by gmack ( 197796 )

      Of course so much CPU power made programmers lazy but let's not forget here that Intel just didn't do their job due to lack of competition.

      I don't think Intel was the only one to blame here. Intel had a problem where they would make architectural improvements and Microsoft would just never bother taking advantage, leaving clock speed as the only differentiator. The classic case of this biting them was the Pentium Pro, which ran faster on 32-bit systems and slower on 16-bit. It sold poorly since Windows 95/98 were still mostly 16-bit code. The situation has only improved because Intel now goes and adds the advanced features to Linux and that mo

  • by ErichTheRed ( 39327 ) on Monday January 13, 2020 @10:58AM (#59615492)

    Over the last 10 years, the following has happened: (1) The cloud made an "infinite" amount of computing resources available to developers (for a price.) (2) Every single developer has seemingly abandoned platform-native applications for JavaScript monstrosities hosted in web browsers.

    All that abstraction and the shoehorning of every programming problem into what was a page rendering engine costs compute resources. Most complex JS projects require libraries on top of libraries on top of frameworks because web development has become so abstract. Shoehorning every single message interaction into HTTPS has consequences as well...native protocols that are optimized for speed are being replaced with flinging of JSON between thousands of endpoints to do a previously simple task.

    There's a reason why embedded developers are still sought after...you just don't have the luxury of infinite resources in a tiny low-power ASIC with a few kilobytes of RAM. I thought form factors like the Raspberry Pi would keep things status quo for a while, but increasingly you have devices with smaller batteries and a low power budget, so just being physically small isn't enough.
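    To make the JSON overhead concrete, a small hedged comparison in C; the record layout and field names are invented for illustration, not taken from any real protocol:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* One made-up sensor reading, as a fixed binary record vs. JSON text. */
    struct reading {
        uint32_t sensor_id;
        uint32_t timestamp;
        double   value;
    };                        /* 16 bytes, fixed layout, no parsing needed */

    int main(void)
    {
        struct reading r = { 42u, 1578931200u, 21.5 };

        char json[128];
        int jlen = snprintf(json, sizeof json,
            "{\"sensor_id\":%u,\"timestamp\":%u,\"value\":%.1f}",
            r.sensor_id, r.timestamp, r.value);

        printf("binary record: %zu bytes\n", sizeof r);  /* 16 */
        printf("JSON record:   %d bytes\n", jlen);       /* 52, plus the cost
                                                            of parsing it at
                                                            every hop */
        return 0;
    }
    ```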

    • by ljw1004 ( 764174 )

      Most complex JS projects require libraries on top of libraries on top of frameworks because web development has become so abstract... There's a reason why embedded developers are still sought after...you just don't have the luxury of infinite resources in a tiny low-power ASIC with a few kilobytes of RAM.

      I think you've got the right conclusions for the wrong reasons.

      Just for the record, JS itself as a language isn't that slow. There's been enough R&D dollars funneled into JIT and other technologies that a JS program does pretty well. We should expect JS code to run sometimes as fast as C++, sometimes half as fast as C++, usually in the middle. It's rarely an order of magnitude slower. https://benchmarksgame-team.pa... [debian.net]

      If not the language, then it must be the libraries - the "towers of abstraction" you d

    • As an embedded coder, I agree. I'd also add that a lot of *people* who have embraced "new school" scripting languages and high-level programming (not that I see anything wrong with high-level langs, they have their place) seem pretty hostile and insecure when I talk to them about coding on the metal. It's like they have some kind of inferiority complex. However, given that coding on big teams with a "methodology" has been so ineffective (that's been my experience, but I'm sure someone will say it's pur
  • by DontBeAMoran ( 4843879 ) on Monday January 13, 2020 @10:58AM (#59615496)

    What they're asking for is "old" programmers, i.e. those who grew up with extreme hardware limitations.

    New programmers should start their education on something small and simple, like an Atmel ATmega328P, i.e. the first generation of Arduinos.

  • by Anonymous Coward on Monday January 13, 2020 @11:00AM (#59615498)

    Stop writing bloatware. And by that, I mean stop using 24834 layers of software abstraction to compensate for the fact that you don't know how to write code that does things.

    Learn to write code that does things. Don't learn to write code that asks other libraries to do things. Definitely don't write libraries just to ask other libraries to do things.

    All modern OSes, including Linux, are guilty of this. There's no reason Ubuntu 19.10 should be sluggish as all hell and take forever to do anything compared to 10.04 on my old Athlon 965 machine.

    • by gweihir ( 88907 )

      Worst example I have come across so far: pushing about 1 MB (no joke) of JavaScript to the client to essentially render a table that would have been perfectly fine displayed natively in HTML 2.0. It took several seconds to even compile that JavaScript.

      • Some sites even scroll with a lag now because of JavaScript. Once in a while I visit Gizmodo (don't know why anymore, it's half sponsored content and social justice). The page has visible lag when scrolling. Turn off JavaScript and it's smooth as ever.

        • Most of the time, it's idiotic scripts that try to mess around with the scrolling itself (i.e. javascript-powered "smooth scrolling" or some other bullshit).

    • Agreed. Does a modern computer feel any faster than one from 20 years ago? Not one bit. Sure SSDs have made things better but browsing and loading large programs like Photoshop isn't any faster. The hardware is orders of magnitude faster but code is still slow. Go back to the 8088 days and again things are just as responsive.

    • by Ambassador Kosh ( 18352 ) on Monday January 13, 2020 @12:16PM (#59615842)

      I definitely don't agree with having people avoid libraries and write the code themselves. I have dealt with too much scientific code that did that. You won't beat BLAS/LAPACK with your own code. Don't even try, because you will fail, and you will fail by orders of magnitude. The code will be buggier and slower. There are a lot of common high-performance libraries that have often been around for decades that you should just not try to write yourself.
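      In sketch form, that advice looks like this: hand the multiply to a tuned BLAS instead of hand-rolling the triple loop. This assumes some CBLAS implementation is installed (OpenBLAS, ATLAS, MKL, ...; link with e.g. -lopenblas):

      ```c
      #include <stdio.h>
      #include <cblas.h>

      /* C = alpha*A*B + beta*C, delegated to a tuned BLAS. */
      int main(void)
      {
          double A[2 * 3] = { 1, 2, 3,
                              4, 5, 6 };     /* 2x3, row-major */
          double B[3 * 2] = { 7,  8,
                              9, 10,
                             11, 12 };       /* 3x2 */
          double C[2 * 2] = { 0 };

          cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                      2, 2, 3,          /* M, N, K */
                      1.0, A, 3,        /* alpha, A, lda */
                      B, 2,             /* B, ldb */
                      0.0, C, 2);       /* beta, C, ldc */

          printf("%g %g\n%g %g\n", C[0], C[1], C[2], C[3]);
          /* prints: 58 64 / 139 154 */
          return 0;
      }
      ```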

      • I agree, but I think that if you rely on a library, you should at least understand how it works well enough to be able to modify it yourself and potentially submit any useful changes back upstream. People blindly relying on dozens of libraries they don't understand is another mess entirely.
  • by krolden ( 6296556 ) on Monday January 13, 2020 @11:05AM (#59615516)
    Programmers need something more than $15/hr through a parasite "staffing" service. There needs to be some incentive.
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Monday January 13, 2020 @11:13AM (#59615546)

    ... we'll go back to tweaking our algorithms. No big deal. Give a programmer a PC that doesn't get any more powerful within a large enough amount of time, and he'll start optimizing. Until then, screw low-level optimisation. I've got pointy-haired bosses who don't want no low-level optimisations and deadlines to meet.

    I'm actually looking forward to the time when we are back to squeezing the best out of some low-power Raspberry Pi or something. That will once again make programming a challenge. And then I'll dive into Rust.

    For now I'll stick with easy-peasy web languages, thank you.

  • You get what you pay for. Period. If you want lean and concise code, you need to pay for it. If you want optimized code, you need to pay for it.

    The bottom line is businesses do not want to pay for lean and concise code because it costs money. Businesses are driven to control costs. Thus, they are not interested in stopping bloat. As long as clients continue to buy their products/services nothing will change.

    Businesses need to see the benefit of efficient code. Until then nothing will change.
    • The rise of cloud is what they will see - because the computing requirements of sloppy code are high, the cost of running that on a per-CPU-hour cloud platform is even higher.

      Hiring a programmer to resolve parts of the codebase to perform faster would make a direct impact on the bottom line of a business.

      But, unfortunately, much of the sloppy slow code is put there by the same programmers who work for the cloud platforms (hello Microsoft) and I guess they have an incentive to churn code out that performs badl

  • What a truly ridiculous suggestion. We have very fast von Neumann machines, each processing billions of operations per second. And we have more and more of these fast von Neumann machines inside of every CPU -- up to 128 today in a consumer chip [anandtech.com]. Now all we need are programming languages suitable for getting more than one of those cores to do something at a time, but all of the programming languages we're using now were built for single-threaded code running on a single-core CPU.
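    For what it's worth, here is roughly what getting several cores busy looks like from plain C today, via OpenMP (a sketch; assumes a compiler supporting -fopenmp, and the workload is arbitrary):

    ```c
    #include <stdio.h>
    #include <omp.h>   /* build with: cc -fopenmp harmonic.c */

    int main(void)
    {
        const long n = 100000000L;
        double sum = 0.0;

        /* The pragma splits the iterations across all cores and safely
           combines the per-thread partial sums. */
        #pragma omp parallel for reduction(+:sum)
        for (long i = 0; i < n; i++)
            sum += 1.0 / (double)(i + 1);

        printf("harmonic(%ld) = %f, max threads = %d\n",
               n, sum, omp_get_max_threads());
        return 0;
    }
    ```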
  • by sdinfoserv ( 1793266 ) on Monday January 13, 2020 @11:16AM (#59615566)
    "everyone needs to lean to code!!" - Seems like that's the mantra from every political simpleton trying to solve job problems. Many bought off on it, now were stuck with sub-par coders and million of lines of crapware. I worked with a dev who was just awful. They "could" code, but debugging their stuff was just painful. It was like they tried to used the most lines to accomplish the task.
    Everyone who actually can code knows the beauty and elegance in efficient, minimal code. Not everyone's cut out for brain surgery, not everyone should be a coder. ,
    • Don't forget the unrealistic deadlines and scope creep; that's another reason why they cut corners and end up with bloated crap.

  • by FeelGood314 ( 2516288 ) on Monday January 13, 2020 @11:18AM (#59615572)
    We are now down to two companies that can still make processors on the leading edge of Moore's Law. The cost of the fabrication plants is now so high that all the other players have dropped out. Every two-and-a-bit doublings of transistor density has come with a doubling of the cost of the fabrication plant. We are going to hit the limit of what is economically profitable in the next 5 to 10 years.
    • That depends on your definition of processor. You can order processors in a 6 pin SOT package that have 24 bytes of data memory and 512 words of program memory. It takes actual skill to get them to do complex things.

  • Two responses (Score:3, Informative)

    by davidwr ( 791652 ) on Monday January 13, 2020 @11:22AM (#59615596) Homepage Journal

    1) For things that will NOT be run a gazillion times, it's cheaper overall to write it using efficient-to-program-with but terribly-bloated-at-run-time libraries. Personally, I can live with this.

    2) Things that run a gazillion times need to be optimized or you wind up wasting lots of energy - read electricity - as well as a lot of bandwidth. For client-side web code and other code that's going to run in many locations, this almost certainly translates into pollution and other things that hurt everyone.

    For #2, I expect in 5-10 years you will see "AI-driven optimization/refactoring" run on code that's expected to be used "a lot" so that the overall energy footprint is reduced. Granted, this is nowhere near "ideally optimized code" but it is going to be a lot better than what's typically done today when it comes to code that uses run-time libraries that are far too "rich" for the task at hand.

    The pollution issue will gradually take care of itself as the world moves to non-polluting energy sources, but it's going to be a generation or more before the pollution impact of #2 becomes negligible. In the meantime, "because customers who are dead because they couldn't breathe don't buy our products" is a good reason for companies that put out code that will be run many, many times to want it to be optimized as much as practical.

  • I worked for a monolithic old company at one point. The history was interesting. They had computers before I was born. A lot of their mainframe coding practices came because of the price of computing power vs the price of the humans that program for it. Some of their legacy code was very difficult to read. This was sort of by design. They were optimizing the code execution time, instead of focusing on making it more readable, maintainable, etc. As processing power became cheaper, their new system reflected

    • If we do hit a wall, as it comes to processing power, the programming process will be "retooled", supporting execution efficiency over "easy to code" efficiency. It is just a natural economic trade off. It will likely be a gradual process as we possibly hit that wall. As a programmer, it will just mean learning new techniques to increase efficiency of code, even at the expense of a slower coding process.

      It's not as simple as that. Let's say your job is designing and building houses. All your life, you have b

    • I'm not entirely sure this is even the case - are the metrics between old code and new so different in terms of programmer productivity? I think there is a small difference.

      You see, part of the issue is that the old programmers knew their stuff, and could work on an awkward and difficult-to-maintain program just as easily as the modern programmers who throw things together from a Stack Overflow answer and cross their fingers that it works. The old programmers would write complex code, and then document and co

  • I have an idea (Score:5, Insightful)

    by lFaRptHjbZDI ( 6200322 ) on Monday January 13, 2020 @11:29AM (#59615628)

    Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster.

    I've got just the thing: Let's hire programmers based on a census of how many people of various races, ethnicities, sexes, and sexual orientations we have at our company instead of just hiring the best programmers. To make sure this works we need to pass laws that require us to pay all of these programmers the same amount of money regardless of their abilities, experience, and training. As soon as we do that things should clear up quickly.

  • Assembler (Score:5, Insightful)

    by fluffernutter ( 1411889 ) on Monday January 13, 2020 @11:31AM (#59615636)
    The people they want are people who have had experience in assembler. Anyone who has done assembler for a living knows how to get the most out of a machine. Sadly, anyone who has done assembler is probably over 45 and thus will be completely ignored by most tech companies.
  • by RogueWarrior65 ( 678876 ) on Monday January 13, 2020 @11:36AM (#59615678)

    You need the older programmers teaching efficient coding practices.
    But, you also need MUCH better memory management in the operating system. Memory allocation functions in high-level languages are convenient but they're also incredibly slow.
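    One classic answer on the application side is an arena (bump) allocator: one cheap pointer bump per allocation, one free() for the whole batch. A minimal sketch, with alignment handling deliberately omitted:

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    struct arena {
        char  *base;
        size_t used, cap;
    };

    /* Allocate by bumping a pointer -- no free lists, no syscalls. */
    static void *arena_alloc(struct arena *a, size_t n)
    {
        if (a->used + n > a->cap)
            return NULL;              /* arena exhausted */
        void *p = a->base + a->used;
        a->used += n;
        return p;
    }

    int main(void)
    {
        struct arena a = { malloc(1 << 20), 0, 1 << 20 };  /* 1 MB arena */
        if (!a.base)
            return 1;

        for (int i = 0; i < 1000; i++) {
            int *x = arena_alloc(&a, sizeof *x);  /* no malloc per item */
            if (x)
                *x = i;
        }
        printf("used %zu bytes of the arena\n", a.used);

        free(a.base);   /* everything is released at once */
        return 0;
    }
    ```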

  • Wow (Score:5, Informative)

    by Maury Markowitz ( 452832 ) on Monday January 13, 2020 @11:37AM (#59615686) Homepage

    "In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. "

    Literally nothing about that statement is correct.

    1) Bill Gates did not write the BASIC for the TRS-80. Level I was based on Tiny BASIC, which was in the public domain, and Level II was a minor port of Altair BASIC.

    2) Altair BASIC, aka Level II, was about 8 kB in size, not 16. If there were expansions on the TRS-80 that brought it to that, they were hardware-related, not BASIC.

    The entire argument this article makes, that we need highly crafted software for speed *like these BASICs*, is completely the opposite of what actually happened with these BASICs. They were all optimized for size, not performance, and any number of low-hanging-fruit improvements were ignored to save a few bytes. For instance, one can cache variable names, as in Atari BASIC, or use a fixed table of them, as in BBC BASIC, for a massive speed boost. But that takes maybe 127 bytes. One can pre-process numeric literals, so you don't have to do that at runtime, but that expands them from an average of 3 bytes to 6. While optimizing for size is often the same as optimizing for performance today, due to caching, that was absolutely not the case in the 70s, and this line of reasoning simply illustrates the author's utter lack of knowledge of the claims he's making.
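    A toy model of that size-versus-speed tradeoff in C (the literal and loop count are arbitrary): an interpreter that stores "3.14159" as ASCII text stays small but must re-parse the literal on every pass through a loop, while pre-parsing it once costs extra bytes and no repeated work.

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *literal = "3.14159";   /* stored small, parsed often */
        double sum = 0.0;

        /* What an un-preprocessed interpreter effectively does: */
        for (int i = 0; i < 1000000; i++)
            sum += strtod(literal, NULL);  /* re-parse every iteration */

        /* The "fatter" alternative: parse once, reuse the binary value. */
        double parsed = strtod(literal, NULL);
        for (int i = 0; i < 1000000; i++)
            sum += parsed;

        printf("%f\n", sum);
        return 0;
    }
    ```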

    • by methano ( 519830 )
      I'm glad you pointed this out, at least the point about Bill Gates. I'm an old nerd from a different field, but I've kept up with this computer stuff for a number of years and I don't remember Bill being a cutting-edge programmer. I suspect that as we get older and the people who can correct the bull die off, more and more of the invention of computing will be attributed to Bill. It's not unlike how most of the witty sayings about politics are attributed to Winston Churchill.
    • by Pembers ( 250842 )

      One can pre-process numeric literals, so you don't have to do that at runtime, but that expands them from an average of 3 bytes to 6.

      The Sinclair ZX81 (sold as the Timex Sinclair 1000 in the USA) actually did that, and programmers who were trying to cram as much code as possible into the tiny RAM would declare variables to hold commonly-used numeric values such as 0 or 1, thus throwing away any speed improvement...

    • by Megane ( 129182 )
      Wow guys, nice guessing, but the size of TRS-80 Level II BASIC was neither 16K nor 8K; it was in fact 12K, from 0000-37FF. The keyboard was memory-mapped at 3800-3FFF, and the rest was RAM.
  • by 93 Escort Wagon ( 326346 ) on Monday January 13, 2020 @11:37AM (#59615688)

    About Bill Gates writing assembly... there's a story from the early days about how Andy Hertzfeld really wasn't impressed with Bill Gates's abilities as a programmer [folklore.org].

    • Re: (Score:3, Insightful)

      I've read Bill's code - he was a good programmer for someone who was creating software out of whole cloth. My comment would be that some time spent reviewing what other people had coded first for the same problem (before starting his coding) would have improved Mr. Gates' code.

  • Today's software is more complex. Yes, it's possible to develop very optimized complex software, but it's a lot of effort (= money), and who's going to maintain it (= even more money)? Possibly not you, in the gig economy. Yet more money (as ownership of code = faster maintenance).
  • I find it ironic that people largely left assembly language for C for portability, anticipating processor changes that never came.

    Object-oriented languages are the poster boy for wasting memory and CPU cycles, as well they should be, because they hide their use. But even on the Arduino, which is programmed in a C variant, I've seen people code by just writing their variables wherever they are introduced. If the variables aren't created in one general place, you aren't going to be able to devise a system

  • If there is a problem area, people optimize it. How is this news? Actually, just about every experienced programmer knows that needless optimization creates convoluted and hard-to-maintain code.
  • Lotta FUD (Score:5, Insightful)

    by dbrueck ( 1872018 ) on Monday January 13, 2020 @12:28PM (#59615886)

    This is a problem that will naturally take care of itself, the sky is not falling one bit.

    A top (if not THE top) rule of optimization is: "don't optimize what you don't need to optimize". The bloat being complained about is mostly pointing out areas in which stuff isn't optimized and doesn't (yet) need to be optimized.

    Software will naturally grow to the size of available resources in part because it's inefficient to prematurely optimize where you aren't resource constrained. If time to market is a high priority variable and your desktop app weighs in at 10MB, it probably doesn't make sense to take 25% longer dev time to get it down to 5MB.

    Once there's a material benefit/advantage to optimizing away some of this bloat, it'll be taken care of automatically simply because then there is value to doing it. For example, if a mobile app is too hard on the battery, it'll often get poor reviews, causing some number of people to avoid it in favor of another app until the developer fixes the problem - i.e. the app gets optimized for power consumption because there's a good incentive to do it.

  • "Writing software is a craft and some people are better at it than others."

    So in other words, it's just like every other activity any human engages in. So insightful.

  • by Opportunist ( 166417 ) on Monday January 13, 2020 @12:37PM (#59615932)

    If you want to have the "look and feel" of an overbloated desktop environment where papers fly animated from wiggling folders when you copy your stuff, and especially if you want that in a web browser, you're part of the problem.

    Stop with that shit and concentrate on what's required. And suddenly you'll notice that the amount of processing power and ram you have at your disposal is plenty. For nearly anything you could possibly want to do on your desktop.

  • Moore's Law has allowed lazy managers and programmers to cut all corners possible in software development. Managers should get educated about how to run an engineering shop.

  • I've seen people use unsigned chars as loop vars, when that often slows the chip because it needs an extra step to up-promote them. And it doesn't generally save on stack use anyway, as at least 4 bytes are reserved on the stack.
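    Sketched in C, the pitfall above plus its nastier cousin: an unsigned char counter wraps from 255 back to 0, so against a bound above 255 the loop never terminates at all.

    ```c
    #include <stddef.h>
    #include <stdio.h>

    int main(void)
    {
        const int n = 300;

        /* BROKEN if n > 255: i is promoted to int for the comparison,
           wraps 255 -> 0 on increment, and i < n stays true forever.

           for (unsigned char i = 0; i < n; i++) { ... }
        */

        /* A type sized for the job; the "saved" bytes buy nothing. */
        long total = 0;
        for (size_t i = 0; i < (size_t)n; i++)
            total += (long)i;

        printf("%ld\n", total);   /* 44850 */
        return 0;
    }
    ```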

  • Another "tech" nobody making hyperbolic statements, pushing an agenda generated by his own pompous, self-important musings. Give me a break.
  • Truth is, there's a whole lot of sloppy, bloated, inefficient code in use every day because it gets things done that people need to get done. They *tolerate* the performance problems because they don't really have a choice. But really, it needed optimization YEARS ago, in order to work the way it should have really been expected to work.

    I give you, as one example that hits close to home: Microsoft Great Plains ERP! That financial accounting package is absolutely awful when it comes to its performance over

  • Having worked at a Fortune 500 semiconductor firm, I can speak a lot to the process that goes into chip making, but I'll say this. There's always this tension between making chips "smarter" (better RTL, better algorithmic design, smarter processors) and "faster". Historically, most energy is spent on the latter. I was amazed at how ultra-inefficient some basic designs and bus interconnects inside an SOC I worked on were, and wondered, hey, we could easily double performance if we just spent a bit more ene

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...