Nvidia To Cease Producing New Drivers For 32-Bit Systems (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: While most people have probably made the switch by now, yet another reason to drop 32-bit operating systems and move to 64-bits is coming. Version 390 of Nvidia's graphics drivers, likely to arrive in January, will be the last to contain support for 32-bit versions of Windows (7, 8/8.1, and 10), Linux, and FreeBSD. There will be another year of security updates for 32-bit drivers, but all new features, performance enhancements, and support for new hardware will require the use of a 64-bit operating system and 64-bit drivers. Reasons to stick with 32-bit Windows are at this point few and far between. 64-bit Windows has superior security to 32-bit, and while it varies with workload, 64-bit applications can run somewhat faster than 32-bit counterparts; for workloads that won't fit within the constraints of 32-bit software, the difference is of course enormous. Generally, those who continue to use the 32-bit operating system tend to be subject to some kind of legacy constraint. 32-bit drivers won't work in 64-bit Windows, so obscure but mission critical hardware can extend the life of 32-bit systems.
  • by JcMorin ( 930466 ) on Friday December 22, 2017 @05:22PM (#55792153)
    I feel this statement is wrong 99% of the time, except maybe when you need more than 32-bit addressing or are doing computation with very large numbers. I think most applications are faster on 32-bit than 64-bit.
    • by darkain ( 749283 ) on Friday December 22, 2017 @05:33PM (#55792251) Homepage

      The 64-bit instruction set is faster than the 32-bit instruction set because: 1) x86 instructions are variable width to begin with, so instruction size doesn't affect performance, and 2) there are far more optimized instructions. 32-bit software is compiled to target either the original 386 instruction set or an extended 686 instruction set. x64 adds many more registers and instructions, meaning less swapping of data between registers and RAM, so the number of useful instructions per cycle is higher.

      • 64bit instruction set is faster than 32bit instruction set

        Except that 64 bit sucks big time on x86. The amd64 ABI fares better than i386 only because the latter has to keep a ridiculously inefficient calling convention and similar constraints for compat reasons. If you try a modern ABI such as x32, it wins over amd64 by something like 40% in code that benefits from this, or around 7% overall.

        64-bit does have other advantages, such as bigger address space, but speed is not one, at least not on x86.

        Also, pointer size (which is what matters here) doesn't preclude h
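
        A minimal sketch of where those x32 gains tend to come from (assuming GCC on Linux; list.c is a made-up file name, and actually running an x32 binary also needs a kernel built with CONFIG_X86_X32): pointer-heavy data shrinks back to i386 sizes while the code keeps all of the amd64 registers.

            /* Compile the same file three ways and compare:
             *   gcc -m32  -O2 list.c   i386:  4-byte pointers, 8 GP registers
             *   gcc -mx32 -O2 list.c   x32:   4-byte pointers, 16 GP registers
             *   gcc -m64  -O2 list.c   amd64: 8-byte pointers, 16 GP registers
             * Smaller pointers mean more nodes per cache line, which is
             * where the x32 speedup in pointer-chasing code comes from. */
            #include <stdio.h>

            struct node {
                struct node *next;
                struct node *prev;
                int          value;
            };

            int main(void)
            {
                printf("sizeof(struct node) = %zu\n", sizeof(struct node));
                return 0;
            }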

    • by Anonymous Coward

      64-bit has twice as many registers in the register file (and they are twice as wide, of course). This has a profound performance impact. It also guarantees SSE2 rather than just SSE, with twice as many SSE registers.

    • by Burdell ( 228580 ) on Friday December 22, 2017 @05:39PM (#55792295)

      It isn't a general 32-bit vs. 64-bit comparison; it is specifically that the x86_64 ISA is better than the i386 ISA in a major way, due to a larger register set. i386 has very few registers compared to other major architectures, which means more frequent memory accesses. Even with on-die caches, RAM access is slower than the CPU, so the more often code has to hit RAM, the slower it goes. x86_64 added more registers, so it can do more with fewer RAM accesses and finish the same job faster.
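
      As a rough illustration of that register difference (a sketch only, assuming GCC or Clang on Linux; add4.c is a made-up file name), you can compile the same trivial function for both targets and compare the generated assembly:

          /* gcc -m32 -O2 -S add4.c   versus   gcc -m64 -O2 -S add4.c
           * Under the i386 cdecl convention all four arguments are read
           * from the stack; under the System V AMD64 ABI they arrive in
           * registers (edi, esi, edx, ecx), so the function body needs
           * no memory accesses at all. */
          int add4(int a, int b, int c, int d)
          {
              return a + b + c + d;
          }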

  • by zlives ( 2009072 ) on Friday December 22, 2017 @05:24PM (#55792161)

    noted: if you need legacy support don't buy nvidia

    • by Anonymous Coward

      ...as if their existing legacy drivers aren't good enough for 'legacy support'.

    • by jmccue ( 834797 )

      To me it is "do not buy NVIDIA, period". The embedded video I have is NVIDIA and all security updates stop in one week (including updates for new Linux kernels). So I will need to use nouveau, which still has some tearing issues on my system.

      So I will only buy systems that provide free source drivers, which I think ATI will start doing soon.

      • The embedded video I have is NVIDIA and all security updates stop in 1 week

        Which is incredibly scary given all those exploits of graphics card drivers that exist right? .... *crickets*

        Seriously though, security updates for drivers? The fact they even need that is a troubling development.

    • by thegarbz ( 1787294 ) on Friday December 22, 2017 @07:11PM (#55792877)

      noted: if you need legacy support don't buy nvidia

      If you need legacy support you're unlikely to throw a new GPU in a new motherboard and play the latest games anyway. This really is an incredible non-story.

  • by Zo0ok ( 209803 ) on Friday December 22, 2017 @05:33PM (#55792249) Homepage

    When will the hardware stop supporting the 32-bit (and 16-bit) modes?
    I'm talking about AMD and Intel CPUs.
    I mean, there could always be one model that does (support 16/32 natively), but most models could be pure 64-bit. It would be easier for everyone, wouldn't it?

    • If it does DMA, then yes.

    • Re:And the hardware? (Score:5, Interesting)

      by Burdell ( 228580 ) on Friday December 22, 2017 @05:41PM (#55792303)

      This is why Intel is talking about dropping legacy BIOS support and going with pure UEFI firmware. BIOS requires starting in 16-bit real mode, but UEFI can start in native protected mode.

      • by Zo0ok ( 209803 )

        Yes, well, obviously being able to start in 64-bit mode is kind of a prerequisite for even thinking about dropping the older modes.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        "native protected mode" should scare the *shit* out of us. IME/AMT et al prove that what is "protected" is their interests, not our computers memory.

        • Protected mode, in this case, means that virtual memory is activated, so your processes aren't all sharing a single memory space where everyone can see everyone else's data.
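
          As a small illustration of that isolation (a sketch assuming any POSIX system, nothing Windows-specific): after fork(), each process has its own virtual address space, so a write in one is invisible to the other.

              /* Child and parent each get their own copy of the page that
               * holds this variable, so the child's write never reaches
               * the parent; that is the isolation a single real-mode
               * address space cannot give you. */
              #include <stdio.h>
              #include <sys/types.h>
              #include <sys/wait.h>
              #include <unistd.h>

              int value = 1;

              int main(void)
              {
                  pid_t pid = fork();
                  if (pid == 0) {          /* child */
                      value = 42;          /* changes only the child's copy */
                      return 0;
                  }
                  waitpid(pid, NULL, 0);
                  printf("parent still sees %d\n", value);  /* prints 1 */
                  return 0;
              }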

          • by dryeo ( 100693 )

            Isn't there usually only one process before switching to protected mode?

              • Well, I guess they're technically threads rather than processes, due to the shared memory space, but Windows 3.0 and earlier could run fully in real mode and allowed multitasking.

              • by dryeo ( 100693 )

                I'm thinking more of booting a protected mode OS, where the computer is in real mode upon being turned on and a single process (I believe) sets things up and then switches to protected mode or perhaps chains to another process that switches to protected mode.

    • Re:And the hardware? (Score:5, Interesting)

      by gweihir ( 88907 ) on Friday December 22, 2017 @05:46PM (#55792323)

      Not anytime soon. There is no reason for it. AMD64 is a patch on top of the Intel 32-bit instruction set, which itself is a patch on top of the Intel 16-bit instruction set. We are not talking about a well-designed and thought-out instruction set like the 68000 had; we are talking about the mess Intel put into its CPUs. Hence there is no gain in having the hardware stop supporting the older modes.

      • by Zo0ok ( 209803 )

        I would imagine it could save a few percent of silicon (1-3%) and reduce design complexity a bit more than that.
        If not, then obviously you are right.

      • The 376 started up in protected mode

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        There's no reason why you couldn't build a CPU which started in protected mode now. UEFI would run on it with a few minor tweaks and you could boot into an OS without needing real mode. Long Mode needs a page table at the moment, but that could be changed. Long mode already doesn't support V86 mode, which means OSs have already stopped using that.

        I think you'd probably need to keep 32 bit mode because a lot of Windows software is still bui

      • Intel, AMD and Via have patents and cross-license agreements for all those weird modes. Most of the real-mode, 16-bit protected-mode and 32-bit-mode patents have expired by now, but patents on things like MMX, SSE, register renaming, branch prediction and other architectural goodies in 32-bit x86 are still valid, and they provide a barrier to entry for anyone else interested in building an x86 processor.

        Also, even if one forgets about the patents, there is a complexity barrier by having to emulate, certify and validate all that cruft,

      • by anss123 ( 985305 )

        The 68000 definitely had more thought put into its design, but that actually turned out to be a bad thing. The later models, the 040 and 060, weren't fully backwards compatible, but were close enough that you could bridge the gap with emulation. After the 060 Motorola gave up, dropped addressing modes and other bothersome instructions, and gave the architecture a new name... ColdFire.

        x86 is ugly, full of bad design, being a 16-bit CPU pretending to be an 8-bit CPU with 32-bit extensions and a 64-bit patchjob o

        • by Anonymous Coward

          x86 is ugly, full of bad design

          When I looked at the instruction set for ARM CPUs recently, it seemed to be full of complex instructions -- variations of the same core instruction. Do you know how many hacks you need to load a simple immediate into a register on an ARM CPU? Now that's bad design. The RISC moniker is a joke -- the ARM instruction set is quite complex, and in some respects larger than x86's. Isn't it supposed to be a "reduced" instruction set? Instead it is a "bloated" instruction set computer.

          x86, by comparison, is quite simple and elegant

        • by gweihir ( 88907 )

          Might also have had something to do with the PC beginning to dominate the market. Looking back, a real pity in more than one way.

      • by Z00L00K ( 682162 )

        And even the x86 design is a permutation of the 8080 design, which you see when you realize that the x86 uses segmented addressing in 64k chunks.

        Later versions of the x86 got rid of the segmentation, but for those of us who programmed for MS-DOS, that was a reality we had to work with.
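
        For anyone who never had to deal with it, the arithmetic was simple enough to show in a few lines (an illustration only, not tied to any DOS API): a 16-bit segment and a 16-bit offset combine into a 20-bit linear address.

            /* Real-mode address translation: linear = segment * 16 + offset,
             * giving overlapping 64 KiB windows into roughly 1 MiB. */
            #include <stdio.h>
            #include <stdint.h>

            static uint32_t linear(uint16_t segment, uint16_t offset)
            {
                return ((uint32_t)segment << 4) + offset;
            }

            int main(void)
            {
                /* B800:0000, the classic text-mode video buffer */
                printf("B800:0000 -> 0x%05X\n", linear(0xB800, 0x0000));
                return 0;
            }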

        • by gweihir ( 88907 )

          And the 8080 probably built on the 4004. The point is, Intel never had a good CPU designer back when it counted.

  • Legacy (Score:2, Troll)

    by darkain ( 749283 )

    "Generally, those who continue to use the 32-bit operating system tend to be subject to some kind of legacy constraint." This, EXACTLY this. Even on 64bit hardware, switching from 32bit Windows to 64bit Windows is a serious fucking pain in the ass. I just switched a company using 64bit hardware over from Windows 7 to Windows 10. I figured since it was Microsoft's own upgrade tool and it could easily detect hardware, it would simply upgrade to a 64bit version of Windows 10, since it is a full OS replacement

    • Comment removed based on user account deletion
      • by JustNiz ( 692889 )

        >> you can't in-place upgrade a Windows 32bit OS to Windows 64bit OS.

        Admittedly it's a pain in the ass to reinstall Windows, but anybody still running 32-bit Windows on a 64-bit CPU is a freak.

  • I.e., somebody was stupid or cheap, or likely both. Why else would anybody run mission-critical hardware on Windows in the first place? Because developers are cheaper. Stupid.

    The right way to do this is to use a proper embedded OS with long-term support, or to do it with Linux or one of the xBSDs and, most importantly, with a fully documented and open driver. Tell the vendor to **** off if they cannot provide that. If it just costs more, _pay_ it!

  • by xororand ( 860319 ) on Friday December 22, 2017 @06:31PM (#55792607)

    This is a prime example of the necessity of libre drivers.
    The good news is, libre drivers for Nvidia GPUs exist [freedesktop.org], and they continue to work on 32-bit Linux.
    AMD Radeon GPUs have much better open-source support, though. [x.org]

    • by jbn-o ( 555068 )

      Quite right, and free software is objectively better for security as well (you aren't allowed to vet nonfree software no matter how much skill and persistence you apply to that task; you aren't allowed to share your improvements to nonfree software either, so even if you find and patch a problem you can't help your community except to tell them to stop running nonfree software).

      And free software is probably better for the environment as it lets people run usable older computers for longer avoiding the probl

  • I can think of a few:

    1.) You have critical SW which is 16-bit (either the whole SW, or a library referenced through thunking). By AMD's design, once in 64-bit mode you can run 32-bit SW but NOT 16-bit SW.

    2.) Your machine cannot take more than 4GB of RAM (32-bit OSs and SW tend to take less memory than their 64-bit counterparts). Yes, in many cases 64-bit apps run faster than 32-bit ones, but that does you no good if you exhaust your caches, or if you need to swap.

    3.) You have some HW that only has 32-bit dri

  • My old hardware still serves a purpose, but I guess it's time to send it to be recycled.
