At the Windows Hardware Engineering Conference
downix writes "At Tom's Hardware they're running an article where they discuss the next-generation Windows graphics system. The big part of the scoop: it's being done via DirectX. Have to validate those 2GHz CPUs and GPUs that need their own nuclear power plant to run somehow." Some other interesting things there - quiet PCs, more about the Oqo, etc.
Can't wait for... (Score:4, Funny)
Re:Can't wait for... (Score:2, Flamebait)
Re:Can't wait for... (Score:2)
They've got their work cut out for them... (Score:2)
So says Jerry Pournelle [jerrypournelle.com], anyway:
"I have tried to get an Orinoco Wireless WiFi (Allchin pronounced it "Wiffy" at least seven times in his market department written presentation) and I can't get it to work with Windows 2000. Alex hasn't managed with Windows XP. No one else in the press section has connected to the Internet with their 802.11 cloud. Allchin couldn't connect to Wiffy. But Peter has connected to the Internet with the same card with his PowerBook == as Peter says, with Apple everything is either easy or impossible. Using the Orinoco card with his PowerBook was easy. With Windows 200o so far it has been impossible... (But that eventually worked see below.)"
"I have managed to get on the Internet. The local network is WINHWC2002. Yesterday it was WinHEC2002. It is case sensitive. Except that Peter's Apple didn't have that problem. He got on yesterday and he's still on today, in a hall that no one else can get on because of very weak signals. Astonishing."
~Philly
Re:Can't wait for... (Score:2, Insightful)
Re:Can't wait for... (Score:2)
Actually yes, and long before Microsoft decided to do it. Once again, Microsoft is playing catch-up. Apple innovates, Microsoft imitates. Apple is and has been working closely with nVidia and ATI on a new 3D graphics card utilizing technology they acquired in the last two years from purchasing high-end graphics workstation companies.
And when is Microsoft going to deliver "Longhorn"? 2003? 2004? 2005? Maybe much longer because they can't even figure out how to get something as simple as WiFi to work like Apple can.
P.S. Do you kiss your boyfriend with that rude mouth?
Re:Can't wait for... (Score:2)
You are talking about today, not tomorrow. Apple has been working on 3d interfaces with ATI and nVidia utilizing the tech from their recent purchases. I don't hate MS. I use Word, Excel, PowerPoint and Entourage (Mail client) more than any other product besides BBEdit. I just think they are bumbling fools when it comes to their own OS.
what' I'd rather see... (Score:2, Insightful)
Re:what' I'd rather see... (Score:2, Informative)
I agree, and not only that, but when you have three or four boxes running in a single room in the summer, the heat gets to be an issue as well. When you're poor and hot, choosing between running A/C (and using a lot of electricity) and running the computers is a hard choice to make. Basically, the more power efficient, the better.
Re:what' I'd rather see... (Score:3, Interesting)
average cost of electricity in US as of 1999 [uoregon.edu]
That said, I've got about 5 computers and matching monitors (that's where the power's eaten up) running 24x7, and totally understand the desire to keep power use down...
Re:what' I'd rather see... (Score:4, Insightful)
Re:what' I'd rather see... (Score:3, Insightful)
300 watts is more than the typical computer really uses. 60 to 100 watts continuous is more realistic judging from my UPS data output. Even then, $84/year is not trivial (this is the cost of a good component upgrade, these days).
There are reasons why initiatives like Energy Star exist. World-wide, I would bet the equivalent of an entire power plant's output is devoted just to keeping our computers idle. It is easily argued that this is lots of money and other resources going straight down the commode.
What portion of California's recent energy crisis was due to tens of thousands of computers running unused?
Re:what' I'd rather see... (Score:2)
Zero. California's power crisis was due to market manipulation by out of state power companies.
-jon
Re:what' I'd rather see... (Score:2)
Re:what' I'd rather see... (Score:2)
Re:what' I'd rather see... (Score:2)
I had a free disk array cabinet+card. The array formatted out at RAID5 at only 20 gigs -- usable, but not phenomenal. The killer was it was going to cost me $20 per month to power it! The new IDE HD I bought was $100 and gave me double the disk storage.
Re:what' I'd rather see... (Score:2)
My iMac draws 170W max, less than 90W in standby, less than 35W asleep. My iBook draws 45W max, 18W in standby, less than 5W asleep. The iBook is actually faster than the iMac, too...
Remember, these numbers include the monitor.
-jon
Re:what' I'd rather see... (Score:3, Informative)
When your AthlonXP 1800 eats 85W by itself, I wouldn't be in a hurry to test this. Insufficient voltage can be bad for chips and expansion cards. However, I do agree with the high-quality PSU sentiment.
Heh, a PowerMac uses a 125W PSU. That's for TWO processors, an optical drive, Zip drive, up to 4 HDs, two fans, tumbler digital audio amplifier, AND a flat panel display. If there's one thing they've got down at Apple, it's low power consumption. I wish they'd look into rackspace applications, since in that market their hardware wouldn't cost any more than its PC counterparts.
Re:what' I'd rather see... (Score:2)
I'd love to believe that, but I seriously doubt it's true. Each G4 uses (IIRC) between 15 and 25 W by itself. So you're bumping 50 W alone there. Figure in power for a monitor (including the Apple CRTs, not just LCDs), hard drives, optical drives (where the SuperDrive is a big power consumer), the motherboard itself, bus power for FireWire and USB, power for PCI cards... you're definitely using a power supply that's more than 125 W. I'd guess modern G4s have either a 250 or 300 W PSU.
The draw may be lower at times, but I bet a G4 at peak can use as much power as a PIII.
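For what it's worth, here's the kind of back-of-envelope sum I mean, as a quick Python sketch. Every wattage below is a guess for illustration, not a measured Apple spec:

# Rough power budget for a dual-G4 tower; all numbers are illustrative guesses.
power_budget_watts = {
    "G4 CPUs (2 x ~20W)": 40,
    "motherboard + chipset": 25,
    "hard drives (2 x ~10W)": 20,
    "optical drive while burning": 15,
    "PCI/AGP cards": 20,
    "FireWire/USB bus power": 10,
    "fans, audio, misc.": 10,
}
total = sum(power_budget_watts.values())
print("Estimated peak draw: %d W" % total)  # already well past a 125 W supply

And that's before you add the monitor on top.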
Re:what' I'd rather see... (Score:3, Informative)
My computer has a 300 watt power supply and draws less than 40 watts (OK, it's a 486, but...)
Re:what' I'd rather see... (Score:3, Informative)
This is what you do: turn off all the PCs for just a few minutes. Using a stopwatch, count the number of revolutions of your electric meter's disk in a minute (or ten seconds, or whatever). Do the math and you will be able to get your baseline power consumption. It is best to do this with as much as possible turned off. Now turn just the PCs on. Count the number of revolutions, do the math, and you have your total power. Subtract your baseline power consumption and you have just the PC power consumption.
I have done this myself and compared the results with a decent power meter. I was only off by 10%.
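In case anyone wants to try it, the arithmetic looks like this (a rough sketch; Kh is the meter constant in watt-hours per disk revolution, printed on the faceplate of most electromechanical meters, and the revolution counts below are made-up examples):

# Power from counted meter-disk revolutions. Kh (watt-hours per revolution)
# is printed on the meter faceplate; 7.2 is a common value, but check yours.
def watts_from_disk(revolutions, seconds, kh=7.2):
    return revolutions * kh * 3600.0 / seconds

baseline = watts_from_disk(revolutions=2, seconds=60)  # PCs switched off
total = watts_from_disk(revolutions=3, seconds=60)     # PCs switched on
print("PCs alone: %.0f W" % (total - baseline))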
Re:what' I'd rather see... (Score:2)
Nice try, Dr. Uptime! But I'm on to you!
~jeff
Re:what' I'd rather see... (Score:2)
Re:what' I'd rather see... (Score:2)
8.27 * 300 = 2,481
Yes, 300 watts is 0.3 kilowatts.
What is a kilowatt-hour?
It's a unit of energy. Energy is power multiplied by time. Watts are power. A kilowatt-hour is the amount of energy consumed by a 1000-watt device running for one hour.
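To turn that into the annual-cost figures being thrown around above, a quick sketch (the 8.5 cents/kWh rate is just an example; plug in the rate from the 1999 table linked earlier):

# Annual electricity cost of a device left on around the clock.
def annual_cost_dollars(watts, cents_per_kwh=8.5, hours_per_day=24):
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100.0

print("100 W box, 24x7: $%.0f/year" % annual_cost_dollars(100))  # roughly $74
print("300 W box, 24x7: $%.0f/year" % annual_cost_dollars(300))  # roughly $223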
3d vs. 2d (Score:4, Insightful)
If windows are textures, it seems like it will be pretty difficult to get perfect 1-to-1 mapping of pixels via a GPU. Right now, the only thing that is a big deal is "jaggies", but no one expects a perfect image of textures. I know part of this is the game itself, but it is very hard to make textures fit exactly how you want them to.
Sounds neat tho, if they can pull it off. Middle of the next decade indeed.
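To illustrate why 1-to-1 mapping is fiddly rather than impossible: older Direct3D versions expect screen-space coordinates to be nudged by half a pixel so pixel centers line up with texel centers; miss that and every window texture comes out slightly blurred or shifted. A rough Python sketch of the idea (the half-pixel convention is Direct3D's; other APIs differ):

# Quad for drawing a W x H texture onto a W x H block of pixels, 1:1.
# The -0.5 offset aligns pixel centers with texel centers under the
# older Direct3D convention; without it the texture gets filtered and
# smeared by half a pixel.
def pixel_perfect_quad(x, y, w, h):
    corners = [(x, y, 0.0, 0.0), (x + w, y, 1.0, 0.0),
               (x + w, y + h, 1.0, 1.0), (x, y + h, 0.0, 1.0)]
    return [((px - 0.5, py - 0.5), (u, v)) for px, py, u, v in corners]

for pos, uv in pixel_perfect_quad(100, 100, 256, 256):
    print(pos, uv)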
Re:3d vs. 2d (Score:2)
Re:3d vs. 2d (Score:2)
Not difficult (Score:2)
DirectX (Score:2, Insightful)
-Mod me up, I need the karma!!
A single calcified tear... (Score:4, Funny)
Yeah...my thoughts exactly.
Re:A single calcified tear... (Score:2)
I'm gonna donate some cash to them right now, because I don't want to see Microsoft die.
Cooling towers (Score:3, Insightful)
Ya, and they can use the cooling towers to cool those bad boys too!
Killer App? (Score:5, Insightful)
What's next to drive people to upgrading? Will the game market be enough to drive the market?
Re:Killer App? (Score:3, Interesting)
Two words: interactive porn.
That alone will justify the graphics, sound and bandwidth growth we've seen. c'mon, you know it's coming.
(ooh, sorry, didn't mean the pun.)
Re:Killer App? (Score:3, Interesting)
Now I know there are already lots of projects to try and tap unused computing power, but it doesn't seem to have gone as far as it could. Imagine something like MOSIX distributed worldwide over the net - so when you run 'make' all sorts of random people you've never met will execute part of the job on their PCs. The protections against sabotage would be quite difficult to work out, but I'm sure it's possible.
What I'm saying is that in the past, there was always a need for faster CPUs in the individual PC. But if networks get faster and more widespread, it might turn out that individual PCs are fast enough and more effort should go into harnessing them together.
Of course, if an efficient global market did develop in computing power, then it might be worth developing faster processors just for that reason, to 'farm' them.
Re:Killer App? (Score:2)
Isn't that the goal? I mean, do you really want to be waiting for your computer, ever?
Re:Killer App? (Score:2)
What I meant was, people are saying that power users and specialist applications will drive the development and adoption of faster processors, as happened in the past. But does this pattern still hold if every machine is networked? Do you need a faster CPU, or just a faster broadband connection?
Re:Killer App? (Score:2)
This is very misleading. While the CPU utilization over a period of 10 minutes might be 1% for someone browsing the web, it doesn't mean that the user doesn't wait an agonizing several seconds for a complex page to load. With my DSL connection on my 1.5GHz machine, using Konqueror is a much more pleasant experience than using Mozilla because Konq is so much faster. Of course, neither program taxes my hardware much overall, but what counts for the user experience is maximum latency, not total throughput (so to speak). With more and more complex content coming out (in particular SVG, which is pretty slow to render, and is even slower if complex animations are used), CPUs will need to keep getting faster just to keep up with the internet.
Re:Killer App? (Score:2)
Newer cars tend to be slightly more fuel efficient, quieter and faster. And of course cars wear out more quickly than silicon (although keyboards and mice wear out more quickly than cars).
It won't just be the game market, there will be new apps. For example if people want to do good quality video editing, which is becoming a reality, then they will need better and faster computers with DVD writers that work well.
Re:Killer App? (Score:3, Insightful)
Video editing will always be a niche app, because the raw output from cameras is good enough for 99% of the people out there, who only want to film weddings and their kid's birthday parties.
Re:Killer App? (Score:2, Insightful)
As a Mac user (although not a zealot -- I'll use anything that helps me get my work done... Linux, Win, etc.) I'm always interested in what encourages people to switch platforms, especially those people who have been entrenched in their current selection for many years.
Friends and co-workers who I would have never predicted would buy a Mac are asking my advice on iMacs and the like (and buying them) specifically due to Apple's push into consumer-class DV editing. iMovie, iDVD and DVD burners *are* selling computers, hilariously enough. I never realized how many people own little DV camcorders, even among my friends.
Ironically, as a geek, I really don't see the appeal. But especially for families with small children, video editing really may be the killer app of the next 10 years.
-A.
Wha? (Score:2)
You've seen all the 'old' home videos in popular culture?
The concept of filming someone's birthday, setting up the projector, and boring the grandma with an hour of dull footage?
It's even easier today with digital camcorders, iMacs, and DVD-Rs
I mean, who's buying half a million iMacs if not people who want to make DVDs?
Re:Killer App? (Score:2)
Re:Killer App? (Score:2)
Re:Killer App? (Score:2)
Most video cameras nowadays record to little itty-bitty proprietary format tapes, players for which cost a lot of money.
For me, it's a no-brainer to want to take footage from my Sony MiniDV and get it onto a DVD to send to the grandparents or whatever. You can't stream video over the internet, not even over DSL, so the next best thing is to snail-mail a DVD. Much smaller and more durable than VHS.
Granted, not everyone out there wants to invest in this kind of equipment. Granted, my Sony Mini DV camera was like $1k, and upgrades to my computer to do DVD production, another $1k, and software, another $1k. That's a lot of money to spend just for the convenience and cost savings of not having to dump raw footage down to VHS via the VCR, but there are other intangibles, like, DVD media lasts longer, takes less physical storage space, etc.
Microsoft, of course, has demonstrated that they totally don't "get it" when it comes to DVD, by adopting DVD+R as their "standard" instead of DVD-R. Apparently just to spite Apple. Possibly to suck-up to the content industry (MPAA).
Marketing strategies (Score:3, Interesting)
Nothing, and that's the beauty of MS's strategy. Windows releases are always endorsed by celebrities, big promo events, etc etc (didn't 'The Rock' help plug Windows XP?). When Microsoft, the OS company, releases a new version or updates their old products, everyone has to have it...regardless of how well their old systems (whether that's hardware or software) work to fit their needs.
Effective marketing, goddamn them all.
Re:Marketing strategies (Score:2)
Re:Marketing strategies (Score:2)
Re:Killer App? (Score:3, Insightful)
Re:Killer App? (Score:2)
Connectix Virtual XBox.
Coming to a PC near you.
Re:Killer App? (Score:2)
Microsoft VS Connectix.
Coming to a DMCA-friendly court near you.
Re:Killer App? (Score:3, Insightful)
One word... (Score:2)
That crap can kill any PC. Eventually it will die, and die hard.
Re:Killer App? (Score:4, Informative)
Well, apart from the underlying sentiment against commoditization mentioned in Tom's review of WinHEC (which will impede the rollout of the next killer app), there are a few things that come to mind.
It could be that way if all the major players weren't so worried about protecting their existing revenue streams - I suspect it will be necessary for new companies to provide these innovations. From the gist of the conference, you can tell that MS and the other attendees are not entirely unaware of what people would like to have.
Re:Killer App? (Score:2)
Re:Killer App? (Score:2)
You mean like my iMac?
-jon
Re:Killer App? (Score:2)
This is also apparent in the maturation of office productivity applications. It has gotten to the point where added features, such as the automatic-MS-Office-knows-better-than-I-do crap, really detract from a product.
There will always be science, engineering, and games wanting more CPU power and bandwidth, but, in general, the industry has reached a critical mass for most of us.
Honestly, for my work, any computer made since 1994 is just fine. Pentium 200 PC--just fine for OpenBSD. 75MHz SPARCstation--perfect without any frustration.
Re:Killer App? (Score:2)
It's nice to see Microsoft is leading the way... (Score:3, Funny)
MS Presentations (Score:2)
Re:MS Presentations (Score:2)
I'm still young but when I'm 72 that will give me a chuckle..
Re:MS Presentations (Score:2)
Cheers for the insightful name, Tom, it really gives me confidence in your tech reports.
Transparency (Score:3, Insightful)
I liked the article photo... (Score:3, Funny)
Re:I liked the article photo... (Score:2)
3d being used more on the non-gamer desktop? Why? (Score:3, Interesting)
Re:3d being used more on the non-gamer desktop? Wh (Score:5, Interesting)
"Some"? Holy heck, welcome to the problem. I've just built a machine for my brother. An XP 1700+, 256Mb of DDR 2100 and a 64Mb GeForce 2 MX 400 with TV out. We debated hardest on the card. He wanted to go for a GeForce 3 TI to future proof himself. Here's how my reasoning went:
Logic prevailed. Oh, he still wanted the 3 TI, because game mags say it can run at a squillion fps @ 1600x1200x32, but we did manage to establish that the noticeable benefit would be zero, because he doesn't have a monitor that can handle that.
I'd advise anyone else thinking of buying a high end graphics card to do this calculation. Unless you've got a 1600x1200 @ 80fps monitor, what the heck do you need a GeForce 3 or 4 TI for? Don't spend money "future proofing": all you're doing is paying a premium on hardware that will be a lot cheaper when you do find yourself needing it.
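If it helps, this is the flavor of calculation I mean, as a rough sketch; the fill-rate and overdraw figures are ballpark guesses, not benchmark numbers:

# Can a card's fill rate alone sustain a given resolution and frame rate?
# All figures are ballpark guesses for illustration.
def max_fps(fill_rate_mpixels, width, height, overdraw=3):
    pixels_per_frame = width * height * overdraw
    return fill_rate_mpixels * 1e6 / pixels_per_frame

print("GeForce 2 MX-class (~350 Mpixel/s) at 1024x768: %.0f fps" % max_fps(350, 1024, 768))
print("Same card at 1600x1200:                         %.0f fps" % max_fps(350, 1600, 1200))

Unless the monitor can actually display 1600x1200 at a high refresh rate, the more expensive card is paying for frames nobody ever sees.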
Re:3d being used more on the non-gamer desktop? Wh (Score:2)
Re:3d being used more on the non-gamer desktop? Wh (Score:2)
We could say that, or we could say what I actually said, which was that the 3 (not 4) TI costs 2.5x the cost of the 2MX now, so if he buys it when it's dropped to the price of the 2MX, he saves money. We could also look at the fact that if he does it my way, he gets a spare and very usable 2MX to re-use. Further, we could understand the proposition that he can't see the jillion fps now. It's utterly irrelevant.
Just a thought.
Just not true at all. (Score:2)
I think it'll show you that if you're buying a new computer, and want to play the latest games at a decent resolution and framerate, a 2 MX just isn't sufficient. Of course my definitions of decent may differ from yours, but I don't think 1024x768 is unreasonable.
Re:Just not true at all. (Score:2)
Ah, fair point. However, a couple of things I should have been clearer on:
Basically, I'm saying that it's prohibitively expensive to try and stay at the bleeding edge of the performance curve, or to buy hardware to play any particular title. If we accept Tom's proposition that you need premium hardware to play new games at full detail, then that necessitates buying premium hardware every six months or so!
If you're prepared to lag six months behind in both hardware and games (or detail levels), then you get a lot more bang per buck. And let's never forget that most reviewers aren't paying for their hardware; I'd far rather see Tom's pick a price point and then put together the best system for that price.
Re:3d being used more on the non-gamer desktop? Wh (Score:3, Interesting)
Something I think game developers have forgotten lately is how to make a game fun. Now it seems all they and the hardcore gamers care about is eye candy. Sure, looking at these things will make your jaw drop, but who cares how pretty it is... is it fun? To me, no. The games that center on deathmatching are no fun for me, since I only play occasionally and there are so many players who have more time than I do and thus are much better than I ever will be. I am not saying that they should make it easy to play. I want to be challenged, but to depend on lightning-quick reflexes is too much. I respect those who are really good at deathmatches as much as I respect athletes. I also believe there are some people, like myself, who will never be good enough to do well at the game. Just like I will never be as good as Michael Jordan. But to have fun in these games you have to be that good, and it feels terrible to get killed every 2 minutes. If I want to feel like that I can just go and try to play basketball. Then I would get the same feeling.
I enjoy games that help you use your brain. Games like Roller Coaster Tycoon and The Sims challenge you to use your brain to be good at them. Quake, Wolfenstein and the upcoming Doom 3, while they would be fun enough for me in single-player mode, just would not be fun at all in deathmatching. Sure, they do challenge your brain in some ways, but after that, it's mostly quick reflexes and how quick you can move yer stick. Some say games like Starcraft are like this, and they are, to a point, but one can also win with strategy. That's where they differ.
Re:3d being used more on the non-gamer desktop? Wh (Score:2)
Re:3d being used more on the non-gamer desktop? Wh (Score:3, Informative)
And in all likelihood this is just because of crappy coding. Look at games like Grand Theft Auto 3 on the PlayStation 2. They're pushing more polys than the average PC game, with what's already an outdated graphics system and a 300MHz processor with 8K--yes, EIGHT kilobytes--of data cache. On the PC the developers get the latest graphics cards and high end machines, then grudgingly give a little thought at the end of the project toward making it run on something sane.
Odds are that you'll see Return to Castle Wolfenstein ported to a console like the PS2 or Game Cube and it will run faster than it does on the PC and require a factor of four less memory.
A GeForce 2 MX is still a real beast, BTW. It's better than what's in a PS2 in many ways. But while the PS2 coders are going nuts with that hardware, people are sneering down their noses at the GeForce 2 MX. That's a laughable situation. 3D has gotten so fast in recent years that no one knows what to do with it. In all honesty, even the power of Voodoo 2 era cards is rarely, rarely maxed out. Developers just write some half-assed OpenGL or Direct3D renderer and then blame the graphics card, not even looking at their code and realizing that it takes hundreds or thousands of cycles to process a single triangle--or even a vertex--on the CPU side.
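A quick bit of arithmetic shows why that per-vertex CPU cost matters so much (the cycle counts here are illustrative, not profiled from any real engine):

# How much geometry a CPU can feed the card if vertex submission is sloppy.
cpu_hz = 1_000_000_000      # a 1 GHz CPU
cycles_per_vertex = 1000    # sloppy submission path, cache misses, etc.
target_fps = 60

vertices_per_frame = cpu_hz / cycles_per_vertex / target_fps
print("%.0f vertices per frame" % vertices_per_frame)  # ~16,667 -- a few thousand triangles

# Tighten that to ~100 cycles per vertex and the same CPU pushes ten times
# the geometry before the graphics card ever becomes the bottleneck.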
Oh, I should have warned fanboys up front to cover their eyes before reading this, so their little worlds aren't shattered.
Re:3d being used more on the non-gamer desktop? Wh (Score:2)
You're missing the point, Mr. Sarcastic. The Quake 3 engine is just the core rendering (and networking) engine. You can make it fast or slow depending on what you do with it. And, as no one outside of the game industry ever seems to realize, 90% of the code in a game has nothing to do with rendering.
Re:3d being used more on the non-gamer desktop? Wh (Score:2)
more than 8bpp! (Score:3, Funny)
24-bit True Color, or 8 bits per color channel, is not enough. Microsoft is pushing graphics board vendors to implement greater than 8 bits per channel.
This is great! It's so awful being stuck with only 256 colours to choose from! Think of all the different shades of blue they'll have in the next version of Windows!
Re:more than 8bpp! (Score:3, Insightful)
Re:more than 8bpp! (Score:2, Informative)
I'm no John Carmack, but the reason higher than 24/32 bit color is important is that most 3D graphics these days use multiple texture passes per polygon. So for one car model, say, you may have a base texture, a 'damage' texture, a bump map texture, an environmental mapping (ohhhh shiny!!!) texture, etc. When you composite all of those textures together using multiple passes or multi-texturing, colorspace errors that would normally be imperceptible tend to accumulate, and you wind up with ugly artifacts like color banding.
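Here's a tiny sketch of that accumulation: run a smooth gradient through a handful of multiplicative passes, once in floating point and once snapped back to 8 bits after every pass (the blend factors, pass count, and the assumption that intermediate results get truncated are all made up for illustration):

# Composite one colour channel through several blend passes, in float vs. 8-bit.
# Blend factors and the per-pass truncation are illustrative assumptions.
blends = [0.9, 0.7, 0.95, 0.6, 0.85]   # base, damage, bump, env-map, lighting...

def composite_8bit(value):
    for b in blends:
        value = int(value * b)          # snapped to an 8-bit integer each pass
    return value

def composite_float(value):
    v = value / 255.0
    for b in blends:
        v *= b
    return v * 255.0

gradient = range(100, 108)               # a smooth 8-step ramp going in
print([composite_8bit(v) for v in gradient])             # distinct steps collapse into a few bands
print([round(composite_float(v), 2) for v in gradient])  # float keeps every step distinct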
Re:more than 8bpp! (Score:2)
"Oh, look! It's the New, Improved Robin's-Egg Screen of Death! Stay calm, stay calm!"
Eh??? (and how much colour depth is enough?) (Score:2)
The only reason I can think of for having more channels is so that the windowing can be done on the video card, complete with lots of translucent overlays. Sheesh... as if tasteful textured pastel email stationery in Outhouse wasn't bad enough... roll on the floral decoupage desktop.
Xix.
oooh oQo (Score:2, Insightful)
Berlin (Score:4, Informative)
http://www.berlin-consortium.org/
Hopefully that will pick up the pace!
Colors Fidelity (Score:2, Insightful)
Why is it gonna take MS 3 more years to implement what Apple did 10 years ago?
(Yeah, I know it's not quite the same thing, but MS still hasn't given us a simple OS-level color matching system!)
More "innovation" -- and less (Score:2)
For me, a more interesting question is whether this move indicates the slowdown of the evolution of D3D. D3D has been free to evolve without much concern for release-to-release compatibility largely because game developers change their codebase so much more rapidly than other application developers. But if the mainstream app developers begin to use D3D, the API will gain a lot more inertia.
Re:More "innovation" -- and less (Score:2)
Re:More "innovation" -- and less (Score:2)
The problem is, they rarely "do it better." They usually do it the same, but then add DRM or other "features" to it and mess everything up in the process.
Re:More "innovation" -- and less (Score:2)
Actually, I thought OS X was Display PDF, a descendant of Display PostScript. It's vector driven and superior to bitmapping in every way except raw speed, but it's not 3D.
Re:More "innovation" -- and less (Score:2)
Re:More "innovation" -- and less (Score:2)
Native American folklore (Score:2)
Microsoft staffers spent a long time hand-carving this imposing statue of BillG at the entrance to WinHEC. Based on Native American folklore from the Northwest, it apparently wards off government lawyers.
*grin* Those guys are quite funny, methinks.
Hardly anything new... (Score:2)
Beowulfers *HEART* Microsoft's CPU requirements (Score:2)
Sorry, I make (part of) my living off of the Wintel conspiracy fallout, building Linux & FreeBSD clusters. Just think, you can be DivX-ing two live TV streams at once and watching another on a regular Linux box thanks to the relatively cheap mid-range CPUs being sold these days! WOOT!
-- Math.
"Package tours are God's way of teaching Japanese tourists about current events." -- me paraphrasing Ambrose Bierce after JP tourists arrive in Bethlehem recently, completely unaware.
First we need.. (Score:2)
Making a splash trailing Apple (Score:2)
Microsoft does it, and everyone goes bonkers, like it's something new. It is new, in a sense, because Apple is just far off everyone's radar.
Now if Apple can just get all the bugs worked out, needed features added, and documentation brought up to date by the time Microsoft rolls out the 1.0... here's hoping.
Re:pc meets media (Score:2)
Re:pc meets media (Score:2)
The only thing that separates me from true techie nirvana is a TiVo that, out of the box, will let me connect it via a Cat-5 cable to my LAN at home so I'd have the option of programming it/managing it with a web interface. I love my TiVo, but I hate how tedious it is to use the remote to do that stuff when I could be using a mouse and keyboard.
Being able to archive shows to a computer via Ethernet would be nice as well, but I'm really hurting for a more efficient way to bend the TiVo to my will.
~Philly
Re:pc meets media (Score:2)
ok... fair enough.
Re:Is it that surprising? (Score:2)
If Microsoft would just quit trying to stuff everything THEY think is great down developers' throats and focus on OS and API design (a STANDARD API, not whatever they can convolute), I'm betting that 90% of the ms-slammers would no longer care.
Re:Is it that surprising? (Score:3, Interesting)