Hardware

At the Windows Hardware Engineering Conference 248

downix writes "At Tom's Hardware they're running an article where they discuss the next-generation Windows graphics system. The big part of the scoop: it's being done via DirectX. Have to validate those 2GHz CPUs and GPUs that need their own nuclear power plant to run somehow." Some other interesting things there: quiet PCs, more about the OQO, etc.
This discussion has been archived. No new comments can be posted.

  • by toupsie ( 88295 ) on Thursday April 18, 2002 @09:49AM (#3364867) Homepage
    ...how the PC industry is going to take Apple's styling, innovations and designs and incorporate them into Windows hardware. I guess it's better late than never...
    • I would rather have a computer that I can actually upgrade than one that looks nice (mine happens to be both, and no one will mistake it for a lamp. Yes, iMac, I was looking at you when I said that). Moreover, the trend to replace every monitor with an LCD annoys the hell out of me. LCDs suck: they cost too much, their pixel response time blows ass, there are invariably dead pixels, and most have a limited viewing angle. The only thing they have going for them is size, and if IBM would get off their ass and ship the inch-thick CRTs they made, LCDs wouldn't even have that.

      • That's why you buy a PowerMac instead of the iMac. You can stick any kind of monitor you want on it. My work PowerMac G4 has three CRTs and one LCD connected to it; I have four Radeon cards in it to run them. Macs have always had great multi-monitor support.
    • ...since, at the Windows Hardware Engineering Conference, the only guy able to easily get connected to a WiFi access point and use the public wireless network that had been set up was using.... gasp... a PowerBook!

      So says Jerry Pournelle [jerrypournelle.com], anyway:

      "I have tried to get an Orinoco Wireless WiFi (Allchin pronounced it "Wiffy" at least seven times in his market department written presentation) and I can't get it to work with Windows 2000. Alex hasn't managed with Windows XP. No one else in the press section has connected to the Internet with their 802.11 cloud. Allchin couldn't connect to Wiffy. But Peter has connected to the Internet with the same card with his PowerBook == as Peter says, with Apple everything is either easy or impossible. Using the Orinoco card with his PowerBook was easy. With Windows 200o so far it has been impossible... (But that eventually worked see below.)"

      "I have managed to get on the Internet. The local network is WINHWC2002. Yesterday it was WinHEC2002. It is case sensitive. Except that Peter's Apple didn't have that problem. He got on yesterday and he's still on today, in a hall that no one else can get on because of very weak signals. Astonishing."


      ~Philly
    • by piznut ( 553799 )
      Wow. So you mean MacOS is leveraging the GPU in your video card to draw the windows on your screen as 3D surfaces? And here I thought it was just alpha transparencies. Get a clue, jackass. The real world does not revolve around Apple. What MS is going to be delivering in Longhorn will be leaps and bounds beyond what you cockjockeys are using.
      • Wow. So you mean MacOS is leveraging the GPU in your video card to draw the windows on your screen as 3D surfaces? And here I thought it was just alpha transparencies. Get a clue, jackass. The real world does not revolve around Apple. What MS is going to be delivering in Longhorn will be leaps and bounds beyond what you cockjockeys are using.

        Actually, yes, and long before Microsoft decided to do it. Once again, Microsoft is playing catch-up. Apple innovates, Microsoft imitates. Apple is and has been working closely with nVidia and ATI on a new 3D graphics card, utilizing technology it acquired in the last two years from purchasing high-end graphics workstation companies.

        And when is Microsoft going to deliver "Longhorn"? 2003? 2004? 2005? Maybe much longer because they can't even figure out how to get something as simple as WiFi to work like Apple can.

        P.S. Do you kiss your boyfriend with that rude mouth?

  • I'd rather see the load being taken off the power supply. I mean, graphics are nice, but as Michael alludes to, it's gonna take a friggin' nuclear power plant to supply the juice. I'd rather see the hardware focusing on lower power consumption; you know, perfect what they've got before moving to the next step. Now that I live off campus, I see how much juice my machines use, and 300-watt power supplies suck for electric bills.
    • 300-watt power supplies suck for electric bills.

      I agree, and not only that, but when you have three or four boxes running in a single room in the summer, the heat gets to be an issue as well. When you're poor and hot, choosing between running the A/C (and using a lot of electricity) and running the computers is a hard choice to make. Basically, the more power-efficient, the better.

    • Yeah, because at less than .10 cents per *kilo*watt-hour and, umm, 24*30, uhh, 720 hours per month, let's see, err, that's 216000 watts used, max, AKA 216 kilowatts, well, uhh. Hmph. That's a whopping 21.6 cents per month. Good gracious, time to get that second job working nights, or maybe just recycle a few aluminum cans to finance such an astronomical power bill. :)

      average cost of electricity in US as of 1999 [uoregon.edu]

      That said, I've got about 5 computers and matching monitors (there's where the power's eaten up) running 24x7, and totally understand the desire to keep power use down... :)
      • by OwnedByTwoCats ( 124103 ) on Thursday April 18, 2002 @10:52AM (#3365157)
        You slipped a couple of decimal places. It's 10 cents per kWh, not .10 cents. So burning 300 watts of electricity costs $21.60 per month, not $0.22.
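
        For anyone who wants to redo the arithmetic themselves, here's a quick back-of-the-envelope sketch in Python; the 300 W draw and the $0.10/kWh rate are just the numbers from this thread, so plug in your own:

        watts = 300      # continuous draw, per the thread (a worst case)
        rate = 0.10      # dollars per kilowatt-hour; use your local rate
        hours = 24 * 30  # one month, roughly

        kwh = watts / 1000 * hours             # 216 kWh per month
        print(f"${kwh * rate:.2f} per month")  # $21.60, not $0.22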
        • Yes, absolutely. Running computers really is expensive over a long period of time.

          300 watts is more than the typical computer really uses. 60 to 100 watts continuous is more realistic judging from my UPS data output. Even then, $84/year is not trivial (this is the cost of a good component upgrade, these days).

          There are reasons why initiatives like Energy Star exist. Worldwide, I would bet the equivalent of an entire power plant's output is devoted just to keeping our computers idle. It is easily argued that this is lots of money and other resources going straight down the commode.

          What portion of California's recent energy crisis was due to tens of thousands of computers running unused?
          • What portion of California's recent energy crisis was due to tens of thousands of computers running unused?

            Zero. California's power crisis was due to market manipulation by out of state power companies.

            -jon

        • Check the link. Unless you live in California, it's .10 cents/kWh. One tenth of a cent. "+4, Informative" my eye. :)
      • Not only is your math wrong, but some people have higher electricity rates. I did the math in a recent post (damn, can't find it now). 300 watts costs $440 per year on Long Island, New York.

        -
    • Electricity is expensive when you start adding it up. I stopped taking home the giveaway server equipment from work because it was so damn expensive to operate versus buying stuff.

      I had a free disk array cabinet+card. The array formatted out at RAID5 at only 20 gigs -- usable, but not phenomenal. The killer was it was going to cost me $20 per month to power it! The new IDE HD I bought was $100 and gave me double the disk storage.
    • And this is why I use Macs...

      My iMac draws 170W max, less than 90W in standby, less than 35W asleep. My iBook draws 45W max, 18W in standby, less than 5W asleep. The iBook is actually faster than the iMac, too...

      Remember, these numbers include the monitor.

      -jon

  • 3d vs. 2d (Score:4, Insightful)

    by room101 ( 236520 ) on Thursday April 18, 2002 @09:49AM (#3364875) Homepage
    Graphics hardware gets to power the Windows shell, and compositing is going to be the big deal. Windows will be treated like surfaces, as opposed to rectangular blocks of bits, as they are now. Everything, in effect, is a texture. GPUs certainly know how to move textures around, and manipulate them, and work with them. Longhorn puts the pressure on the 3D engines of GPUs, and Microsoft is exploring minimum hardware requirements and standards for OEMs to aim for.

    If windows are textures, it seems like it will be pretty difficult to get a perfect 1-to-1 mapping of pixels via a GPU. Right now, the only thing that is a big deal is "jaggies", but no one expects a perfect image of textures. I know part of this is the game itself, but it is very hard to make textures fit exactly how you want them to.

    Sounds neat tho, if they can pull it off. Middle of the next decade indeed.
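
    To make the "everything is a texture" idea concrete, here's a minimal alpha-compositing sketch in Python/NumPy. The window contents are made-up arrays, but the back-to-front "over" blend is the standard operation a compositor would run each frame (on the GPU rather than the CPU):

    import numpy as np

    # Each "window" is an RGBA texture; the compositor blends them
    # back-to-front over the desktop with the standard "over" operator.
    def over(dst_rgb, src_rgba):
        src_rgb, a = src_rgba[..., :3], src_rgba[..., 3:4]
        return src_rgb * a + dst_rgb * (1.0 - a)

    desktop = np.zeros((480, 640, 3))                      # background
    win_a = np.full((480, 640, 4), 0.8)                    # mostly opaque window
    win_b = np.ones((480, 640, 4)) * [0.2, 0.4, 0.9, 0.5]  # translucent window

    frame = over(over(desktop, win_a), win_b)              # final composed frame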

    • It's not hard at all, you just need to do the math(s). If you're not stretching the texture, you just need to offset the quad by minus half a pixel in x and y. If you're stretching, well, it won't be 1-to-1, but it'll be a lot quicker than you can do in software...
    • Graphics accelerated desktop - isn't this one of the features of Enlightenment .17, or is this something else?

    • It's not difficult to get this correct. It used to be a problem with the first couple of generations of graphics cards because they didn't all do things the same way, but nowadays it's pretty straightforward, as anything TNT1 level or later will do it correctly. You just need to offset the coordinates by half a pixel, ensuring that when the sample is taken, no filtering is required.
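
      For the curious, the half-pixel fix both posters describe looks something like this: a Python sketch of the (pre-DX10) Direct3D convention, where a screen-aligned quad is shifted by -0.5 in x and y so that each pixel sample lands exactly on a texel center. The numbers are illustrative:

      # Screen-space corners for drawing a w x h texture exactly 1:1.
      def quad_vertices(x, y, w, h, offset=-0.5):
          x0, y0 = x + offset, y + offset
          return [(x0, y0), (x0 + w, y0), (x0 + w, y0 + h), (x0, y0 + h)]

      print(quad_vertices(0, 0, 256, 256))
      # [(-0.5, -0.5), (255.5, -0.5), (255.5, 255.5), (-0.5, 255.5)]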
  • DirectX (Score:2, Insightful)

    Let's give DirectX a break. I know that computers running apps for it are completely decked out, but look at the graphics, for Christ's sake! With all of the enhancements that have been made by the hardware companies (ATI, NVIDIA) in conjunction with M$ (DX 8.1 enhancements), we are seeing some kick-ass games, delivered relatively fast thanks to a universal API. I think it is a good thing.

    -Mod me up, I need the karma!!
  • by dryueh ( 531302 ) on Thursday April 18, 2002 @09:51AM (#3364884)
    The sad thing is that with Microsoft's recent antitrust woes, company execs just don't have that same pep and arrogance of the past. They've become almost too nice and friendly.

    Yeah...my thoughts exactly.

    .....too nice and friendly; poor guys.

  • Cooling towers (Score:3, Insightful)

    by DickPhallus ( 472621 ) on Thursday April 18, 2002 @09:53AM (#3364893)
    Have to validate those 2GHz CPUs and GPUs that need their own nuclear power plant to run somehow."

    Ya, and they can use the cooling towers to cool those bad boys too!
  • Killer App? (Score:5, Insightful)

    by DickPhallus ( 472621 ) on Thursday April 18, 2002 @09:57AM (#3364910)
    Personally, I see little driving the next generation of Windows boxes. I mean, seriously, most computers that are 3 years old will do most things the average person could ever want. They'll burn CDs, play DVDs, read email, do word processing, blah blah blah...

    What's next to drive people to upgrading? Will the game market be enough to drive the market?
    • Re:Killer App? (Score:3, Interesting)

      by gclef ( 96311 )
      "What's next to drive people to upgrading?"

      Two words: interactive porn.

      That alone will justify the graphics, sound and bandwidth growth we've seen. c'mon, you know it's coming.

      (ooh, sorry, didn't mean the pun.)
    • What will drive people to upgrade? The same thing that drives people to upgrade their cars now.

      Newer cars tend to be slightly more fuel-efficient, quieter, and faster. And of course cars wear out more quickly than silicon (although keyboards and mice wear out more quickly than cars).

      It won't just be the game market; there will be new apps. For example, if people want to do good-quality video editing, which is becoming a reality, then they will need better and faster computers with DVD writers that work well.

      • Re:Killer App? (Score:3, Insightful)

        by Stiletto ( 12066 )
        There's no way video editing will drive anything. How many people do you know _that are not geeks_ who want any kind of video editing, let alone "quality" video editing?

        Video editing will always be a niche app, because the raw output from cameras is good enough for 99% of the people out there, who only want to film weddings and their kid's birthday parties.
        • Re:Killer App? (Score:2, Insightful)

          by catseye ( 96076 )
          Boy, I hope you're not in charge of forecasting at your place of employment.

          As a Mac user (although not a zealot -- I'll use anything that helps me get my work done... Linux, Win, etc.) I'm always interested in what encourages people to switch platforms, especially those people who have been entrenched in their current selection for many years.

          Friends and co-workers who I would have never predicted would buy a Mac are asking my advice on iMacs and the like (and buying them) specifically due to Apple's push into consumer-class DV editing. iMovie, iDVD and DVD burners *are* selling computers, hilariously enough. I never realized how many people own little DV camcorders, even among my friends.

          Ironically, as a geek, I really don't see the appeal. But especially for families with small children, video editing really may be the killer app of the next 10 years.

          -A.
        • How can you say video is just a niche?

          You've seen all the 'old' home videos in popular culture?

          The concept of filming someone's birthday, setting up the projector, and boring the grandma with an hour of dull footage?

          It's even easier today with digital camcorders, iMacs, and DVD-Rs

          I mean, who's buying half a million iMacs if not people who want to make DVDs?
        • My roommate, for instance. He is not a geek, yet he is an excellent artist and works a lot on videos. He is producing a series of videos throughout this year; 1 and 2 are done. 1 looks excellent, and 2 shows that he learned a lot in the process and looks great.
        • Au contraire.

          Most video cameras nowadays record to little itty-bitty proprietary format tapes, players for which cost a lot of money.

          For me, it's a no-brainer to want to take footage from my Sony MiniDV and get it onto a DVD to send to the grandparents or whatever. You can't stream video over the Internet, not even over DSL, so the next best thing is to snail-mail a DVD. Much smaller and more durable than VHS.

          Granted, not everyone out there wants to invest in this kind of equipment. Granted, my Sony Mini DV camera was like $1k, and upgrades to my computer to do DVD production, another $1k, and software, another $1k. That's a lot of money to spend just for the convenience and cost savings of not having to dump raw footage down to VHS via the VCR, but there are other intangibles, like, DVD media lasts longer, takes less physical storage space, etc.

          Microsoft, of course, has demonstrated that they totally don't "get it" when it comes to DVD, by adopting DVD+R as their "standard" instead of DVD-R. Apparently just to spite Apple. Possibly to suck up to the content industry (MPAA).
    • Marketing strategies (Score:3, Interesting)

      by dryueh ( 531302 )
      What's next to drive people to upgrading?

      Nothing, and that's the beauty of MS's strategy. Windows releases are always endorsed by celebrities, big promo events, etc etc (didn't 'The Rock' help plug Windows XP?). When Microsoft, the OS company, releases a new version or updates their old products, everyone has to have it...regardless of how well their old systems (whether that's hardware or software) work to fit their needs.

      Effective marketing, goddamn them all.

    • Re:Killer App? (Score:3, Insightful)

      by mpsmps ( 178373 )
      If Microsoft accomplishes its goal (according to the article) and manages to move gaming off the PC, then there will be much less incentive to upgrade PCs. I'll bet the PC manufacturers are going nuts about this behind the scenes. Perhaps Microsoft is taking revenge on the PC manufacturers for not supporting MS in the antitrust trial.
    • Re:Killer App? (Score:3, Insightful)

      by burts_here ( 529713 )
      The same killer app that has been driving PC upgrades for the last ten years: bloatware.

    • Dust

      That crap can kill any PC. Eventually it will die, and die hard.
    • Re:Killer App? (Score:4, Informative)

      by 4of12 ( 97621 ) on Thursday April 18, 2002 @11:44AM (#3365496) Homepage Journal

      Well, apart from the underlying sentiment against commoditization mentioned in Tom's review of WinHEC, which will impede the rollout of the next killer app, there are a few things that come to mind.

      • Smaller, quieter PCs that don't make a SOHO look like a machine room. I've got a 60 lb monitor sitting on my desk next to a noisy midtower case. If I could replace it with an LCD at equivalent resolution and a Transmeta-like OQO driving it, I'd pay for such an upgrade.
      • Greater telephony integration: being able to plug phones and fax machines into RJ11 jacks on the computer and use cheap, easy software for voicemail, for automatically calling out, for forwarding messages to my work number, etc.
      • Wireless networking to conventional Consumer Electronic devices such as TVs, PVRs, FM stereo receivers, CD players, portable MP3 players, etc.
      I know that with enough money and with specialized Knerdly Knowledge it is possible to build systems to do some of these things even today, but what's needed is for it to be cheap and convenient for the average Joe.

      It could be that way if all the major players weren't so worried about protecting their existing revenue streams - I suspect it will be necessary for new companies to provide these innovations. From the gist of the conference, you can tell that MS and the other attendees are not entirely unaware of what people would like to have.

      • Could it be that you're talking along the same lines as Steve Jobs with the "digital hub" idea? Hmm... maybe Steve-O isn't so crazy after all...
    • What this means is that the PC market is clearly maturing. Humans really are capable of doing only so much at any given time, and PCs have been capable of satisfying us for many years, now.

      This is also apparent in the maturation of office productivity applications. It has gotten to the point where added features, such as the automatic-MS-Office-knows-better-than-I-do crap, really detract from a product.

      There will always be science, engineering, and games wanting more CPU power and bandwidth, but, in general, the industry has reached a critical mass for most of us.

      Honestly, for my work, any computer made since 1994 is just fine. Pentium 200 PC--just fine for OpenBSD. 75MHz SPARCstation--perfect without any frustration.
    • Maybe people want to PRODUCE DVDs. Oops, sorry, Microsoft missed the boat on that one (compared to Apple).
  • Ensuring that my 7GHz machine with 40 gigs of RAM and 520 TB of HD storage will still choke on an MP12 while scrolling with my MS MindMouse.
  • Add this [tomshardware.com] to the long list of Microsoft presentation blunders [cnn.com]. A too hot for TV MS bloopers tape is due out soon.
    • That clip is awesome; it's one that will make me laugh every time I watch it.

      I'm still young, but when I'm 72 that will still give me a chuckle...
    • wincrash.jpg? Since when has a 'page cannot be displayed' error been a 'crash'?

      Cheers for the insightful name, Tom, it really gives me confidence in your tech reports.

  • Transparency (Score:3, Insightful)

    by fraggleyid ( 134125 ) on Thursday April 18, 2002 @10:07AM (#3364955)
    As Pratchett said (in The Truth): do they mean transparent as in you can see through to their motives, or transparent as in you can't see their motives at all?
  • by cygnusx ( 193092 ) on Thursday April 18, 2002 @10:09AM (#3364963)
    Microsoft staffers spent a long time hand-carving this imposing statue [tomshardware.com] of BillG at the entrance to WinHEC. Based on Native American folklore from the Northwest, it apparently wards off government lawyers. :)
  • Well, I think what's happening is companies are tired of seeing office machines being given 3-4 year old graphics cards! :) Most machines for the office don't need (or so someone says) a nice graphics card, so office workers put up with slower graphics because they have a Riva or something to that effect in their machine. Used to be there was not much difference between an office machine and a home machine. Not anymore.

    I think Nvidia would like to lower the price of their high end but can't, because there are many (I am one of them!) who don't see the point in buying their firebreather when a GeForce 2 MX works just fine for about 95 percent of people; even some games can be played just fine on a 2 MX. Now Microsoft is feeling the pressure from the hardware folks, because for some reason they can't convince OEMs to put their firebreathing GeForce and P4 chips in machines that are sold to grandmas (many more grandmas than hardcore gamers). If they could sell more of those, they wouldn't have to charge $300+ for one of those nice cards. If the OS used the graphics card more, people would be forced to go get that new card. Demand would be up and the price would take a plunge.

    It's ALL about eye candy. Users dig it! (I don't need it all of the time, but I dig it too if it can look good and be fast!)
    • by Rogerborg ( 306625 ) on Thursday April 18, 2002 @11:21AM (#3365323) Homepage
      • some games can be played just fine on a 2 MX

      "Some"? Holy heck, welcome to the problem. I've just built a machine for my brother. An XP 1700+, 256Mb of DDR 2100 and a 64Mb GeForce 2 MX 400 with TV out. We debated hardest on the card. He wanted to go for a GeForce 3 TI to future proof himself. Here's how my reasoning went:

      • The 3 TI costs 2.5 x the price of the 2 MX.
      • Either card will push images to his (expensive) TV or (cheap) monitor as fast as it can take them for any current game.
      • When games come out that overstretch the 2 MX, what's the price on the 3 TI going to be? Probably about the same as the 2 MX today. By waiting a year, buying the 3 TI and binning (or donating to a needy brother, ahem) the 2 MX, he actually saves himself money. At no point will he be running a clunky game.

      Logic prevailed. Oh, he still wanted the 3 TI, because game mags say it can run at a squillion fps @ 1600x1200x32, but we did manage to establish that the noticeable benefit would be zero, because he doesn't have a monitor that can handle that.

      I'd advise anyone else thinking of buying a high end graphics card to do this calculation. Unless you've got a 1600x1200 @ 80fps monitor, what the heck do you need a GeForce 3 or 4 TI for? Don't spend money "future proofing": all you're doing is paying a premium on hardware that will be a lot cheaper when you do find yourself needing it.
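
      (If you want to sanity-check that reasoning, here it is in Python with made-up round numbers; actual prices will differ:)

      mx_now = 100                  # mid-range card today (hypothetical price)
      ti_now = 2.5 * mx_now         # top-end card today, 2.5x the mid-range
      ti_later = mx_now             # assume it falls to today's mid-range price

      buy_top_now = ti_now                  # 250, and no spare card
      buy_mid_then_top = mx_now + ti_later  # 200, plus a usable spare 2 MX

      print(buy_top_now, buy_mid_then_top)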

      • Let's say he pays $150 for the 2mx now, and $150 for the 4ti later. He could also pay $300 for the 4ti now (these are hypothetical numbers, they don't make 4ti for iMacs :( ), spend the same amount of money, *and* have the go-jillion FPS now. Just a thought.
          • Let's say he pays $150 for the 2mx now, and $150 for the 4ti later. He could also pay $300 for the 4ti now

          We could say that, or we could say what I actually said, which was that the 3 (not 4) TI costs 2.5x the cost of the 2MX now, so if he buys it when it's dropped to the price of the 2MX, he saves money. We could also look at the fact that if he does it my way, he gets a spare and very usable 2MX to re-use. Further, we could understand the proposition that he can't see the jillion fps now. It's utterly irrelevant.

          Just a thought.

      • It's quite a nice coincidence that tomshardware just had this article [tomshardware.com]

        I think it'll show you that if you're buying a new computer, and want to play the latest games at a decent resolution and framerate, a 2 MX just isn't sufficient. Of course my definitions of decent may differ from yours, but I don't think 1024x768 is unreasonable.
          • [Toms will] show you that if you're buying a new computer, and want to play the latest games at a decent resolution and framerate, a 2 MX just isn't sufficient

          Ah, fair point. However, a couple of things I should have been clearer on:

          • The primary use of this machine is for TV-out, which means 640x480. And at this resolution, there's little point running at maximum detail either.
          • Sure, if you want to run the latest games at full details, you need the latest hardware. But I'm of the mind that the best bang-per-buck comes from games that have been discounted to the 2/3 price level. A great game will be just as great (and more stable with more content) in six months time.

          Basically, I'm saying that it's prohibitively expensive to try and stay at the bleeding edge of the performance curve, or to buy hardware to play any particular title. If we accept Tom's proposition that you need premium hardware to play new games at full detail, then that necessitates buying premium hardware every six months or so!

          If you're prepared to lag six months behind in both hardware and games (or detail levels), then you get a lot more bang per buck. And let's never forget that most reviewers aren't paying for their hardware; I'd far rather see Tom's pick a price point and then put together the best system for that price.

    • My work PC has a TNT2 M64. It runs Unreal Tournament just fine, thank you very much. My home machine has a GeForce2 MX; now that's what I call a fire-breather.
  • by sfraggle ( 212671 ) on Thursday April 18, 2002 @10:16AM (#3364995)
    Quote from article:

    24-bit True Color, or 8 bits per pixel, is not enough. Microsoft is pushing graphics board vendors to implement greater than 8 bpp in order.

    This is great! It's so awful being stuck with only 256 colours to choose from! Think of all the different shades of blue they'll have in the next version of Windows!
    • "Professional" level is, as I recall, 48 bits. It's not the colours, it's the math. John Carmack explains it much better than I; perhaps he will. :-)
      • I think the original poster was just pointing out (in a sarcastic fashion) a rather stupid mistake in the article. 24-bit color is 24 bits per pixel (24 bpp), not 8 bits per pixel (it's 8 bits per color component).

        I'm no John Carmack, but the reason higher than 24/32-bit color is important is that most 3D graphics these days use multiple texture passes per polygon. So for one car model, say, you may have a base texture, a "damage" texture, a bump map texture, an environment mapping (ohhhh, shiny!!!) texture, etc. When you composite all of those textures together using multiple passes or multi-texturing, colorspace errors that would normally be imperceptible tend to accumulate, and you wind up with ugly artifacts like color banding.
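
        You can watch that happen with a toy Python experiment (a deliberate simplification -- real blending hardware is more involved, but the rounding story is the same): sixteen dim additive passes, each under half an 8-bit step, vanish entirely when rounded per pass, while a higher-precision buffer keeps them.

        def quantize8(x):
            return round(x * 255) / 255   # snap to the nearest 8-bit level

        exact = q = 0.0
        for _ in range(16):               # sixteen dim additive passes
            exact += 0.0018               # accumulates fine in float
            q = quantize8(q + 0.0018)     # rounds back down to 0 every pass

        print(exact * 255, q * 255)       # ~7.3 grey levels vs. 0.0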

    • Think of all the different shades of blue they'll have in the next version of windows!

      "Oh, look! It's the New, Improved Robin's-Egg Screen of Death! Stay calm, stay calm!"
    • 24 bit true colour is 24 bits per pixel, viz 8 bits per channel (RGB). While extra depth on top of this can be used for doing things like alpha channels, IMHO, there's not much need for even more colour depth. I wish I had the notes from my first year remote sensing, but I recall that 24bpp is pretty close to what the human eye can discriminate anyway. We haven't received an office PC at work these last 3-4 years that *hasn't* had a pretty good 24bpp 2D card (certainly not an issue when we buy the most crummy monitors we can get away with).

      The only reason I can think of having more channels is so that the windowing can be done on the video card, complete with lots of translucent overlays. Sheesh... as if tasteful textured pastel email stationery in Outhouse wasn't bad enough... roll on the floral decoupage desktop.

      Xix.
  • oooh oQo (Score:2, Insightful)

    by moonbender ( 547943 )
    Not to repeat what was said in the Slashdot thread about it, but man, does the oQo look sweet. I really hope they can pull this off; this looks like the perfect eBook reader, to start with. Too bad games won't run well on it, though I'm sure older ones will work great -- GBA emulation on an oQo sounds like another sweet idea. I pray it's not vaporware.
  • Berlin (Score:4, Informative)

    by Anonymous Coward on Thursday April 18, 2002 @10:36AM (#3365088)
    The article mentions that the entire desktop will be made in vector graphics. That was mentioned several years ago, and there is a (slowly developing) project for Linux named Berlin, which is also a vector-based desktop.

    http://www.berlin-consortium.org/

    Hopefully that will pick up the pace!
  • Color Fidelity (Score:2, Insightful)

    by Reverberant ( 303566 )
    We all know that the colors you see on your monitor don't exactly end up being the same as the colors you get on your inkjet printer, or on your LCD, or in real life.

    Why is it gonna take MS 3 more years to implement what Apple did 10 years ago?

    (Yeah, I know it's not quite the same thing, but MS still hasn't given us a simple OS-level color matching system!)
  • As usual, it's amusing to see MS following the lead of others -- in this case, OS X is using a 3D API (OpenGL) as the implementation base for its GUI and other 2D graphics on the desktop today.

    For me, a more interesting question is whether this move indicates the slowdown of the evolution of D3D. D3D has been free to evolve without much concern for release-to-release compatibility largely because game developers change their codebase so much more rapidly than other application developers. But if the mainstream app developers begin to use D3D, the API will gain a lot more inertia.
  • From the text under the picture:

    Microsoft staffers spent a long time hand-carving this imposing statue of BillG at the entrance to WinHEC. Based on Native American folklore from the Northwest, it apparently wards off government lawyers.

    *grin* Those guys are quite funny, methinks.

  • This is hardly anything new. IRIX has been using OpenGL and/or IrisGL for everything since... a long time ago. OpenGL isn't just for 3D fancy pants games, you know. Also, DirectFB harnesses 3D acceleration of several video cards through the Linux framebuffer to draw its 2D interface. Alas, Microsoft is going to once again claim that they're the first ones ever to use a real graphics library to draw the user interface.
  • Thank God! People in the Beowulf community were worried for a second there that there might be no reason to release even faster CPUs with even better pipelining and faster FPUs! Microsoft is what makes Linux clusters possible -- they're the insurance behind Moore's Law.

    Sorry, I make (part of) my living off of the Wintel conspiracy fallout, building Linux & FreeBSD clusters. Just think: you can be DivX-ing two live TV streams at once and watching another on a regular Linux box, thanks to the relatively cheap mid-range CPUs being sold these days! WOOT!

    -- Math.
    "Package tours are God's way of teaching Japanese tourists about current events." -- me paraphrasing Ambrose Bierce after JP tourists arrive in Bethlehem recently, completely unaware.
  • Some really definitive industry standards for 3D graphics hardware and software. Standards that are not controlled by any one company and that are not bogged down with patents and cross-licensing. I nominate OpenGL 2.0 (-:
  • It kinda sucks. Apple goes and revives DPS as DPDF, and drastically changes the underlying nature of the display engine of a consumer PC. They have OpenGL and accelerated graphics as part of the core, available to desktop apps and the window manager alike. No tedious driver install. No weird compatibility issues.

    Microsoft does it, and everyone goes bonkers, like it's something new. It is new, in a sense, because Apple is just far off everyone's radar.

    Now if Apple can just get all the bugs worked out, needed features added, and documentation brought up to date by the time Microsoft rolls out the 1.0... here's hoping.

"If it ain't broke, don't fix it." - Bert Lantz

Working...