AMD Hardware

AMD Launches First 45nm Shanghai CPUs

arcticstoat writes "The wait for AMD's next-gen CPUs is finally over, as the company has now officially launched its first 45nm 'Shanghai' Opteron chips for servers and workstations. 'AMD's move to a 45nm process relies on immersion lithography, where a refractive fluid fills the gap between the lens and the wafer, which AMD says will result in 'dramatic performance and performance-per-watt gains.' It's also enabled AMD to increase the maximum clock speed of the Opterons from 2.3GHz with the Barcelona core to 2.7GHz with the Shanghai core. Shanghai chips also feature more cache than their predecessors, with 6MB of Level 3 cache bumping the total up to 8MB, and the chips share the same cache architecture as Barcelona CPUs, with a shared pool of Level 3 cache and an individual allocation of Level 2 cache for each core.'"
This discussion has been archived. No new comments can be posted.

  • Which to buy now? (Score:3, Interesting)

    by Ed Avis ( 5917 ) <ed@membled.com> on Thursday November 13, 2008 @11:09AM (#25746801) Homepage

    Does this mean that AMD chips are now competitive on price-performance with Intel's? I mean for a fairly high-end desktop or server; obviously different considerations apply in the embedded or netbook market.

    • Re:Which to buy now? (Score:4, Informative)

      by Anonymous Coward on Thursday November 13, 2008 @11:17AM (#25746907)

      This news is about a server/workstation chip, and I don't do any purchasing of those. As far as desktop chips are concerned, AMD was ALWAYS competitive on a price-performance basis. The key word there being price.

      • Re:Which to buy now? (Score:5, Informative)

        by blair1q ( 305137 ) on Thursday November 13, 2008 @01:21PM (#25748729) Journal

        No, they weren't. For the past year Intel has boxed AMD in with chips at the same performance and lower price, or the same price and higher performance, or both.

        And Intel has had performance segments (QX*) stretching well above AMD's, and pricing segments (Atom) well below AMD's.

        AMD's short-lived price/performance superiority in the desktop sweet-spot in 2004 and 2005 has left many people thinking they're still in that position. That hasn't been true since Core 2 came out. HyperTransport gave them a slight edge in very-high-end servers for certain applications, but Intel stayed near them with reliably higher clock speeds, and is coming out with QuickPath in four days, wiping out those few use cases where AMD can make easy sales today.

        What I'm saying is, right now you are likely to choose Intel in almost all situations, if you are objective.

        • Re:Which to buy now? (Score:5, Informative)

          by rgviza ( 1303161 ) on Thursday November 13, 2008 @02:08PM (#25749503)

Clock cycles aren't everything in all cases... AMD still has more system bandwidth, which speeds up everything when you're talking about I/O-bound applications; bus bandwidth affects every aspect of the computer.

The applications where AMD is superior are I/O-bound ones like database servers and music production.

Intel is better for video because you are dealing with a limited number of streams and it's computationally expensive, so it's CPU-bound.

With audio you can have hundreds of streams (often 4-6 per fader on the mixer), which, at 24-bit/96kHz, will quickly overwhelm any Intel-based system. Since a lot of us use DSP cards (think of them as GPUs for sound), the data path capacity, especially to the DSP processors (PCI/PCI-E), is very important, and Intel simply can't touch AMD in this respect.
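For a sense of scale, here is a rough back-of-the-envelope sketch of the data rates involved; the stream count and bus limits below are illustrative assumptions, not figures from the comment above.

```python
# Rough aggregate bandwidth for a large 24-bit/96kHz session feeding DSP cards.
# Stream count and bus limits are illustrative assumptions.

BYTES_PER_SAMPLE = 3           # 24-bit audio
SAMPLE_RATE_HZ = 96_000        # 96 kHz
STREAMS = 200                  # e.g. ~40 faders x ~5 streams each

per_stream = BYTES_PER_SAMPLE * SAMPLE_RATE_HZ   # bytes per second, one mono stream
total = per_stream * STREAMS                     # aggregate bytes per second

PCI_SHARED = 133 * 10**6       # classic 32-bit/33MHz PCI, ~133 MB/s shared bus
PCIE_X1 = 250 * 10**6          # PCIe 1.x, one lane, ~250 MB/s per direction

print(f"per stream:  {per_stream / 1e6:.2f} MB/s")
print(f"{STREAMS} streams: {total / 1e6:.1f} MB/s")
print(f"share of a classic PCI bus: {total / PCI_SHARED:.0%}")
print(f"share of a PCIe x1 lane:    {total / PCIE_X1:.0%}")
```

On a shared PCI bus much of that audio also crosses twice (out to the DSP and back), so the headroom disappears faster than the raw percentages suggest.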

AMD's architecture simply has untouchable plumbing. You'll notice that Apple is looking for a new chip vendor; this probably has a lot to do with it, since most audio professionals use Apple gear.

If Jobs and Co. were smart, they'd offer both Intel and AMD architectures, depending on the job being done. Intel is fantastic for video, and a lot of pro video peeps use Apple gear too. Those are two market segments that couldn't be more different in their requirements. To be the best of the best for multimedia, Apple needs to either build a new architecture or offer both AMD and Intel.

          -Viz

        • by afidel ( 530433 )
AMD ruled on price/performance from the time the original Athlon shipped. It wasn't until Core 2 that Intel was competitive again. For anything memory-bound AMD still had the lead until Intel shipped Nehalem a few weeks ago, and they won't be shipping the MP chips until Q2 '09, so they don't matter to me yet.
        • by Chandon Seldon ( 43083 ) on Thursday November 13, 2008 @02:44PM (#25750157) Homepage

          For the past year Intel has boxed AMD in with chips at the same performance and lower price, or the same price and higher performance, or both.

That's been true in some price ranges, but Intel hasn't trumped AMD across the board any time recently. There have always been a couple of price ranges, usually the relevant ones like $120 to $150, where AMD has a better product.

          pricing segments (Atom) well below AMD's.

          Geode?

I'm not trying to say that Intel hasn't been "the winner" for the past year or so, but it certainly hasn't been as one-sided as you're claiming. AMD has kept selling chips the whole time by being the best choice for some individual consumers.

        • Re: (Score:3, Insightful)

          by 0xABADC0DA ( 867955 )

First, Intel didn't have better price/performance until recently. Certainly well into the Core 2 Duo period, the AMD Athlon 64, X2, etc. were much better on price/performance.

          Second, on the low end:

          AMD Sempron 1150 2ghz: $22
          Intel Celeron 430 1.8ghz: $39

That Sempron is much faster than that Celeron. Atom is cheap for the processor, but the other parts cost more and use a lot of power (50% of total power, 15 watts, goes to the chipset on Intel's Atom mITX board). Why do you think netbooks only get ~3hrs with 4 watt processo

        • by CAIMLAS ( 41445 ) on Thursday November 13, 2008 @03:34PM (#25751001)

          Er, huh?

AMD dominated the price/performance war with Intel from the time they released their K6 chips - that'd be 1997 (hello, remember the "sub-$1000 PC"? that's thanks to AMD). This was the case until just recently when things started to go multi-core - and even then, AMD had a bit of a resurgence while playing leapfrog with Intel.

From about 1999 to 2003 AMD was way, way ahead of Intel; Intel didn't pull ahead of AMD in terms of raw performance (without spending close to a grand on a processor) until the release of their Core-based processors. Intel's performance started to improve quite a bit with the Pentium M-based processors, but common desktop price/performance was still dominated by AMD.

Arguably, AMD's memory management is still better. We'll see how this generation shakes out.

    • Re:Which to buy now? (Score:5, Informative)

      by MrHanky ( 141717 ) on Thursday November 13, 2008 @11:20AM (#25746963) Homepage Journal

      According to Anandtech's review, it's highly competitive for database servers. http://it.anandtech.com/IT/showdoc.aspx?i=3456 [anandtech.com]

      • by default luser ( 529332 ) on Thursday November 13, 2008 @12:01PM (#25747559) Journal

Just an off-the-cuff calculation on my part shows power consumption dropped over 50% compared to Barcelona, clock-for-clock.

This is good news, because when AMD moved from 90nm to 65nm, their leakage was so bad that power consumption only dropped around 10% clock-for-clock. Combine this with a better cache architecture (larger and faster), and AMD may have a winner in the server space.
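A minimal sketch of that kind of clock-for-clock comparison; the wattage figures below are placeholders, not measurements from the Anandtech review.

```python
# Clock-for-clock power comparison sketch. Both wattages are hypothetical
# placeholders for "power measured at the same frequency on each part".

def reduction(old_watts: float, new_watts: float) -> float:
    """Fractional drop in power draw at the same clock speed."""
    return (old_watts - new_watts) / old_watts

barcelona_w = 120.0   # assumed draw for a 65nm Barcelona at some fixed clock
shanghai_w = 55.0     # assumed draw for a 45nm Shanghai at the same clock

print(f"clock-for-clock reduction: {reduction(barcelona_w, shanghai_w):.0%}")
# -> 54%, i.e. "over 50%" in the sense the comment uses
```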

I'm not sure if they're going to take back the desktop anytime soon. Intel doesn't have the FBDIMM downside on desktop systems, and I'm fairly sure that Shanghai didn't add major microarchitecture changes, so a quad-core Core 2, let alone an i7, should continue to dominate the desktop.

        However, it is nice to know that the market once again will have a choice in processors. AMD's 65nm offerings were spanked in terms of performance and power consumption by Intel's lineup, but Shanghai will at least compete on the power front, if not the performance front. We shall see what happens when AMD releases their desktop version.

        • by Penguinoflight ( 517245 ) on Thursday November 13, 2008 @01:37PM (#25749003) Journal

Power consumption is actually one of the areas where Intel has been soundly beaten, year after year.
Even 65nm processors from AMD use less power than Intel's 45nm parts, even though Intel's parts don't include an on-chip memory controller.

Add in the extra power consumption of an Intel northbridge, and Intel's offerings are usually about double the power consumption of a similarly clocked AMD system.

AMD's real problems are in achieving high clock speeds and sorting out their fabrication process. If AMD's 45nm process is as improved as they say it is, then with their fabrication/design company split they should be able to get that side of the business under control.

    • Oh please. (Score:5, Insightful)

      by Moryath ( 553296 ) on Thursday November 13, 2008 @11:24AM (#25747011)

      The two companies take turns one-upping each other for the bleeding edge, but every time (10 years running) I've specced out a mid-range (home gamer, single CPU motherboard) to low-end (grandma's email/photo machine) machine, AMD's been the way to go. It's a lot like trying to decide which company's video boards to pick if you're trying to make a game machine without breaking the bank.

Some people are Intel partisans, some people AMD partisans. Benchmarking them and looking at specs, I've consistently found that AMD's got faster chips (for the same money) up to the "sweet spot" in the curve where the price starts shooting upwards, at least during the times I've been buying; but I also know there were times I was not in the market when Intel had done a price cut and AMD hadn't caught up.

I'm not going to call someone an idiot for their CPU choice, as it's a long-term purchase decision that has to be balanced with other factors (motherboard choice, RAM, video board, power concerns, cooling solution, etc) anyway. In fact, I recommend consumers try to stay OFF the "bleeding edge" because they're basically throwing money away on it; even if you buy the latest, hottest chip right from the factory, it's obsolete by the time you get it home. Your best bet is looking at the curve, because there's always a spot (usually between $150 and $250) where the price starts to jump up exponentially for an only incrementally "faster" product. Buy at the spot beyond which the relationship between price and performance stops being linear and you'll turn out pretty happy.
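One way to make the "buy just below the knee" advice concrete is to look at the marginal cost of each step up a product line, i.e. extra dollars per extra benchmark point. The prices and scores below are invented purely for illustration.

```python
# Locate the "knee" in a price/performance curve: the step where the marginal
# cost per extra benchmark point jumps. All figures are invented examples.

lineup = [
    # (name, price_usd, benchmark_score)
    ("budget",      90, 100),
    ("mainstream", 150, 140),
    ("sweet spot", 230, 175),
    ("enthusiast", 450, 200),
    ("flagship",   950, 220),
]

prev_price, prev_score = lineup[0][1], lineup[0][2]
for name, price, score in lineup[1:]:
    marginal = (price - prev_price) / (score - prev_score)
    print(f"{name:11s}: ${marginal:5.2f} per extra benchmark point")
    prev_price, prev_score = price, score
```

In this made-up lineup the marginal cost roughly quadruples between "sweet spot" and "enthusiast"; that jump is the point the comment suggests buying just below.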

      • Re:Oh please. (Score:5, Insightful)

        by Colonel Korn ( 1258968 ) on Thursday November 13, 2008 @11:43AM (#25747269)

Since the C2D arrived, I've been going with Intel. I usually don't overclock, but the C2D handles it so well with such little effort that I based my purchase of a $200 ~2.2 GHz chip on that alone. With the addition of a $30 heatsink I had it at 3.4 GHz with temperatures under 60 C at load (below the temperature seen at stock speed with the stock cooler, implying good longevity), back when there were no 3.4 GHz Duos and the closest thing cost about $1000. I have several friends who had never OCed before who did the same thing, all ending up with 2.8-3.6 GHz chips that are all still working perfectly and speedily ~1.5 years later.

        • Re: (Score:3, Interesting)

          by Lonewolf666 ( 259450 )

          Starting with some of GP's requirements (game-capable PC but at a reasonable price) and wanting to use ECC RAM for reliability I ended up buying an AMD last year. It is an AMD Athlon64 X2 EE 4600, a dual core with 2x2.4 GHz, not overclocked. In practice, this machine is fast enough, especially considering that I don't run the very latest games.
          The deciding factor in terms of Intel vs. AMD was that ECC capable mainboards for Intel are expensive. The cheapest C2D would have been not much more expensive than t

Err... does ECC RAM actually help with reliability, or does it merely ensure that errors get detected?

Perhaps I'm a bit of a purist, but if ECC RAM is actually self-correcting, I would worry about how/why it got corrupted in the first place. I find it much cheaper and easier to buy good-quality non-ECC RAM instead.

            • Re:Oh please. (Score:4, Informative)

              by Ed Avis ( 5917 ) <ed@membled.com> on Thursday November 13, 2008 @01:44PM (#25749141) Homepage

What happens is that during normal operation of any RAM there is a small chance that a particular bit will get flipped. Cosmic rays are often blamed as the culprit; of course if you overclock and overvolt your memory you increase the chance of errors, but even good quality RAM running within spec will get an incorrect bit every so often. If you use non-ECC memory there is no chance to spot this error; it just returns the wrong data. The old parity memory added one extra check bit for every eight bits, so it could detect (but not correct) a one-bit error. ECC stands for error-correcting code (look it up on Wikipedia), meaning that if one individual bit is corrupted it can recover the correct data. If you are really unlucky and two bits in the same code word get corrupted at the same time then you still have problems, but that is unlikely.

If you are using non-ECC RAM then your data may be getting corrupted from time to time, but you don't notice.
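A toy Hamming(7,4) example of the mechanism described above: three check bits protect four data bits so that any single flipped bit can be located and corrected. Real ECC DIMMs use a wider SECDED code over 64-bit words, but the principle is the same.

```python
# Hamming(7,4): encode 4 data bits with 3 parity bits; any single bit flip
# in the 7-bit codeword can be located by the syndrome and corrected.

def encode(d):                        # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def correct(c):
    """Return (corrected codeword, error position; 0 means no error seen)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # parity over positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3        # syndrome = 1-based position of the flip
    if pos:
        c = c[:]
        c[pos - 1] ^= 1
    return c, pos

word = encode([1, 0, 1, 1])
hit = word[:]
hit[4] ^= 1                           # simulate a cosmic-ray bit flip
fixed, where = correct(hit)
print(where, fixed == word)           # -> 5 True
```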

              • by afidel ( 530433 )
Most modern ECC is multi-word and can correct two-bit errors, so it's considerably better than the old single-bit-correcting ECC. Going up just a bit, there's IBM's Chipkill technology, which layers RAID-style redundancy on top of ECC RAM so an entire chip can be lost. Chipkill is available from multiple vendors, not just IBM.
            • Re: ECC RAM (Score:3, Informative)

              by Lonewolf666 ( 259450 )

According to Wikipedia (http://en.wikipedia.org/wiki/Hamming_code#Hamming_codes_with_additional_parity_.28SECDED.29 [wikipedia.org]), a scheme that can correct 1-bit errors and detect 2-bit errors is typically used. So it can correct single errors. The most common cause is some form of radiation (see http://en.wikipedia.org/wiki/Soft_error#Causes_of_soft_errors [wikipedia.org]).

Against at least one of those (cosmic rays), even quality RAM is not immune. That said, only the vendors of quality RAM seem to be in the business of making the ECC version a

            • Perhaps I'm a bit of a purist, but if ECC Ram is actually self-correcting, I would worry about how/why it got corrupted in the first place. I find it much cheaper and easier to buy good quality non-ECC Ram instead.

ECC isn't really meant just to counter hardware memory faults. It's also meant to correct single-bit-flip events.

Cosmic rays and background radiation can cause single-bit flips. With increasing data densities, the likelihood of a high-energy particle colliding with a memory cell incr

            • Re: (Score:3, Informative)

              by Agripa ( 139780 )

I have to agree with Lonewolf666. I have been agonizing over going with AMD instead of Intel, and the ECC issue is a deal breaker: it is only supported on Intel's more expensive and older motherboards. DDR3 in combination with ECC is not supported at all, ruling out anything recent, and an FB-DIMM solution would be more expensive yet.

All of AMD's recent processors have really nice support (note 1) for non-registered ECC DDR2, with the caveat that not all systems have BIOS support for it. Gigabyte motherboards

        • by Hatta ( 162192 )

I bought an E4500, expecting I could overclock it a bit. But I put the machine together, and honestly I don't know why I'd even bother. It's plenty fast enough for anything I do. At some point you reach diminishing returns, and honestly I don't care if a task finishes in 3 seconds instead of 2. Do you really see a difference between 2GHz and 3GHz for anything other than encoding video?

        • by 0xygen ( 595606 )

I have had a very similar experience. I was always an AMD fan prior to this build (an Athlon XP Barton was the last one) and now have a C2D E6750, which is getting a bit long in the tooth.

In my case, I got fast DDR2-1066 RAM, lowered the FSB:CPU ratio, pushed the core voltage up to 1.4V, and raised the clock speed to about 3.4 GHz from the stock 2.6 GHz.
Everything still runs happily like this, even on the stock cooler.
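For anyone unfamiliar with bus-based overclocking, the arithmetic is simply core clock = bus clock x multiplier; the bus and multiplier values below are assumptions for an E6750-class part, not the poster's actual settings.

```python
# Core clock = front-side bus clock x CPU multiplier. Raising the bus (and
# keeping the multiplier) is how a ~2.6 GHz part ends up near 3.4 GHz.
# These figures are assumptions, not the poster's actual BIOS settings.

def core_clock_ghz(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier / 1000.0

print(f"stock:       {core_clock_ghz(333, 8):.2f} GHz")   # ~2.66 GHz
print(f"overclocked: {core_clock_ghz(425, 8):.2f} GHz")   # ~3.40 GHz

# The memory clock is derived from the same bus, which is why fast DDR2-1066
# and a relaxed bus-to-RAM ratio give headroom when the bus is pushed up.
```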

          I even can feel the difference in Far Cry 2 with all the physics set

      • Re:Oh please. (Score:5, Insightful)

        by LWATCDR ( 28044 ) on Thursday November 13, 2008 @11:50AM (#25747393) Homepage Journal

I tend to agree. The honest truth is that an AMD 780G motherboard and one of the low-power X2s makes a great system for most users. If you want to play games, throw in a 3870 or, if you really need it, a 4850.
I just built a system for my wife with an ASUS 780G motherboard, an X2 and 4GB of RAM. Total cost was under $200 and it runs very well.
If you're not into high-end gaming then AMD seems like a great choice.
I can hardly wait for 45nm AMD desktop CPUs to start showing up. I really want one.

      • Re: (Score:2, Interesting)

        by mfh ( 56 )

        The two companies take turns one-upping each other for the bleeding edge, but every time (10 years running) I've specced out a mid-range (home gamer, single CPU motherboard) to low-end (grandma's email/photo machine) machine, AMD's been the way to go. It's a lot like trying to decide which company's video boards to pick if you're trying to make a game machine without breaking the bank.

        I have to agree. When the Quad cores shipped, I tested them and I compared the speed per dollar. AMD was half price for perf

      • Perhaps the rationale behind buying nearer the bleeding edge than your sweet spot is not having to replace as often?

        • by wisty ( 1335733 ) on Thursday November 13, 2008 @12:36PM (#25747997)
          Or you can put the $500 you saved on the stock market, and by the time you need to upgrade you can use the money you saved, along with any capital gains and dividends to buy, um, a packet of waffles.
        • by Moryath ( 553296 ) on Thursday November 13, 2008 @12:56PM (#25748365)

          If you're buying "bleeding edge" you're not going to be satisfied with your purchase 2-3 years from now (about my replacement cycle for my personal box, and even then a lot of my components like hard drive / sound card / DVD drive tend to last through 3-4 iterations), unless your tastes suddenly radically change and you're no longer interested in the "bleeding edge" games you were trying to run.

          Plus, consider the following two options:

          #1 - "Bleeding edge" rig. Blow $900 on processor, $1200 on dual video boards, $400 on RAM, $800 or so on miscellaneous other components. Total system cost around $3000.

          #2 - "Decent Gaming" rig, single $300 video board, $200 processor, etc. Total cost: $900 if you really push your luck.

I'll take my $2000, buy more games, take my girlfriend to dinner, stick some in a rainy-day fund, etc. One of these years you should run the numbers, and then you'll figure out that the "savings" you claim come from buying at the bleeding edge aren't really there at all. Even if I spend $900 every 2 years upgrading my PC, it takes me 5-6 years to equal the cost of your rig, and I guarantee you're going to turn around and want to rebuild to get back to the bleeding edge because you'll be "disappointed" that your 2-3 year old "bleeding edge" machine is only getting 15 fps in the timedemo mode of CallOfUnrealCrysisDoomQuakeTournament 3: Yet Another Non-Scaling Tech Demo Masquerading As A Game.
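Running the two builds forward over a few years makes the comparison concrete. The replacement cycles below are assumptions for illustration; only the build costs come from the comment above.

```python
# Cumulative spend over a fixed horizon: one ~$3000 "bleeding edge" build that
# gets rebuilt occasionally vs repeated ~$900 "sweet spot" builds.
# The replacement cycles are assumptions, not data from the thread.

YEARS = 6
BLEEDING_EDGE_COST, BLEEDING_EDGE_CYCLE = 3000, 3   # assume a rebuild every ~3 years
SWEET_SPOT_COST, SWEET_SPOT_CYCLE = 900, 2          # assume a rebuild every ~2 years

def total_spend(cost_per_build: int, cycle_years: int, horizon_years: int) -> int:
    builds = 1 + (horizon_years - 1) // cycle_years  # initial build plus rebuilds
    return builds * cost_per_build

print("bleeding edge:", total_spend(BLEEDING_EDGE_COST, BLEEDING_EDGE_CYCLE, YEARS))
print("sweet spot:   ", total_spend(SWEET_SPOT_COST, SWEET_SPOT_CYCLE, YEARS))
# -> 6000 vs 2700 over six years under these assumptions
```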

          • by leandrod ( 17766 )

            If you're buying "bleeding edge" you're not going to be satisfied with your purchase 2-3 years from now

            Speak for yourself. I tend to stay with stuff until it breaks.

            Besides, I never said I am buying bleeding edge. I merely suggested buying nearer it than the original poster's sweet spot.

            unless your tastes suddenly radically change and you're no longer interested in the "bleeding edge" games you were trying to run.

            I run no games.

          • by 0xygen ( 595606 )

            Agreed - even if you buy "sweet spot" rigs twice as often as the bleeding edge guy, on average over the time involved you will have a better gaming experience.

            "Bleeding edge" guy is essentially paying for the R&D which makes our 2nd generation cards, motherboards and processors cheaper and less power hungry.

        • It depends on where the sweet spot is and what else is going on.

I have found that if the sweet spot is capable of providing solutions at the time of the build, then it is still capable 5 years down the road, Vista being the exception of course. Even if you're gaming, you will find that it is still fast enough, although you'll want something faster. I just retired some Athlon XP 2100 machines last month which had nothing wrong with them; they were just 5 or 6 years old, and when parts started failing, we decided to

        • And what's so bad about replacing often? If, at the end of ten years, you end up having had more total cycles available AND having spent less money, AND have the additional reliability factor of not holding on to equipment for very long after it's no longer under warranty, what's so bad about that?

          • And what's so bad about replacing often?

Convenience. I tend to use the highest quality equipment I can afford [apple.com], and to run free software [debian.org], so I don't need to reinstall or worry much about machines and the OS. Having to reinstall because the machine is obsolete is something I want to postpone as much as possible.

          • Re: (Score:3, Insightful)

            by Sj0 ( 472011 )

            You subscribe to the bathtub curve of reliability. For many components, this isn't an accurate model.

            For many components, after you get past infant mortality, the devices remain consistently reliable. I've seen 386s and 486s that are still running, day in day out, today. PDP11s simply don't die, and there are some that are just sitting in a corner quietly doing mission critical tasks in industry.

            All you have to do is identify common failure modes and do maintenance to mitigate them. For example, the dominan

      • Re: (Score:3, Informative)

        by Lord Apathy ( 584315 )

I think you pretty much nailed it. A few years ago I bought an AMD 3000+ for 595 bucks because I wanted bleeding edge. A few days later a friend bought an AMD 2800+ for 250 bucks, less than half what I paid.

My extra almost-400 bucks got me nothing that mattered. Sure, I was faster than he was, but he could play all the same games I could just as well. Surf the web, read email, view porn, just as well as I could.

        The only place you could tell the difference was when encoding video or audio. I was just

    • Re: (Score:2, Interesting)

      by Anonymous Coward

Ed... a fairly high-end desktop isn't anything close to comparable to a high-end server. AMD chips have been superior for building high-end servers for some time now, thanks to bandwidth considerations.

    • Does this mean that AMD chips are now competitive on price-performance with Intel's? I mean for a fairly high-end desktop or server; obviously different considerations apply in the embedded or netbook market.

What apps are you running? Do your apps take advantage of multiple cores (regardless of clock speed) or come anywhere near pushing 17.6GB/sec of memory bandwidth? If so, go for the new AMD or Intel CPU, as they are both stupid fast. If not, then you might be better served with a previous-gen CPU at a much lower price.

      I mean, really, we are talking about very small differences in speed at this point, right? Would the average person actually be able to detect the difference? I am all for ever faster processors, as it allo

    • by nizo ( 81281 ) * on Thursday November 13, 2008 @11:54AM (#25747451) Homepage Journal

      It is always best to do your research when buying a new chip so you don't get shanghaied.

  • by FlyByPC ( 841016 ) on Thursday November 13, 2008 @11:12AM (#25746841) Homepage
    "AMD Shanghai [wikipedia.org] -- the perfect CPU for your newly-acquired botnet!"
  • Making Me Feel Old (Score:5, Interesting)

    by withoutfeathers ( 743004 ) on Thursday November 13, 2008 @11:39AM (#25747219)
    The first computer I ever worked on (as a data entry operator in the mid '70s) was an IBM S/360 mainframe with 64KB of "main" (physical) memory.

The first computer that I was a primary operator on, an S/360-135 plug-compatible 2Pi, had 768KB when it was delivered and was eventually bumped to 1.25MB shortly before I moved on to programming.

    The computer upon which I wrote my first professional (COBOL) program was an IBM 3033 with a (for then) eye-popping 4MB of physical memory.

    The first computer I ever owned was an RCA COSMAC with 4KB of memory.

The first DIY computer I ever assembled completely from parts (about 15 years ago) had 4MB of interleaved DRAM and a 256KB SRAM cache, and was considered somewhat amazing by everyone who saw how fast it ran OS/2. I eventually boosted it to 16MB.

Now you get 8MB of on-die cache with your four cores... And I still can't get a decent flying car.
  • About Time (Score:3, Interesting)

    by ShakaUVM ( 157947 ) on Thursday November 13, 2008 @11:40AM (#25747239) Homepage Journal

It's about time... I mean, seriously. The CPUs coming out of AMD have stagnated in the last few years. The Phenoms are decent enough, I guess, if you have apps that can take advantage of the three or four cores, but they clock slower than comparable X2s, and two cores is still the optimal point on the diminishing-returns curve of adding more cores.

I remember the 90s and early 00s when you were basically required to upgrade your processor every year or two or be hopelessly behind when the latest game came out. Now, I'm running the same machine I was back in '04, except with a new video card and an upgrade from a 3800+ (2.4GHz) to a 4800+ X2 (2.6GHz) a year and a half ago.

I got curious how far behind I am these days, and found that a 4800+ X2 is still not far off anything AMD produces, only about 30% below the top chips AMD makes right now.

    By contrast, Intel has the E8500 which is not only significantly faster, but is heavily, heavily OCable as well. I think Moore's Law has finally broken down for AMD.

    • by Ecuador ( 740021 )

Think of it this way: the current AMDs are not much behind at similar clock speeds, but are a bit behind at top clock speeds (well, OK, more than a bit if you OC). Even though I buy C2Ds when I want fast workstations, I would definitely say AMD is within the same performance "generation", and they are very competitive in the mid-to-low segment.
So, your perception that Intel is improving very fast is based on the fact that they were running on a laughable (PR-department-driven) architecture for years. From P3 -> P4 th

  • Will there be a Hourai as well? ^_^

  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
    • Re: (Score:3, Informative)

      by Chris Burke ( 6130 )

Despite any advances AMD makes in CPUs, they still have such a sub-par selection of chipset vendors.

      What's wrong with their PCI-E bridge chip? It converts PCI-E to HT and back pretty well afaik. Or maybe you meant the southbridge? Yeah, that USB and SATA logic is really cramping my gaming rig.

      The performance-interesting parts of the northbridge are on the CPU in AMD architectures (and now intel ones too), and they're great. I'm not sure what you're complaining about.

What are you saying? nVidia chipsets rock!

We've had someone stick up for AMD, and now for nVidia. Anyone want to stick up for VIA chipsets? Anyone? Oh...

So, this seems as good a place as any to ask: what's the fastest AMD CPU available? I have a Socket AM2 6400+ and I'm looking for an upgrade without changing the motherboard. I'm talking single-core operation; that is, I don't care if a good threaded app runs faster on a quad-core Phenom than on my dual-core 6400+, I just need it to run one application that doesn't thread, on one core, really fast.

  • This is hardly exciting. AMD should have released these a year ago. Now they are irrelevant.
