
Blu-ray In Laptops Could Be Hard On Batteries

damienhunter notes a Wired story on the power-hungry ways of the first generation of Blu-ray players coming soon to a laptop near you. "With the Sony-backed HD format emerging victorious from a two-year showdown with Toshiba's HD DVD, many laptop manufacturers are now scrambling to add Blu-ray drives to their desktop and notebook lineups. Next month, Dell will even introduce a sub-$1,000 Blu-ray notebook... But the promise of viewing an increasing variety of HD movies on your laptop may be overshadowed by ongoing concerns over the technology's vampiric effect on battery life. Indeed, if the first generation of Blu-ray-equipped laptops is any indication, you might not get more than halfway through that movie before running out of juice completely, analysts say."
  • by Joce640k ( 829181 ) on Friday February 29, 2008 @09:37AM (#22599620) Homepage
    I wonder....

    • by jonnythan ( 79727 ) on Friday February 29, 2008 @09:56AM (#22599810)
      No, the DRM has little to nothing to do with it.

      Decoding 20+ Mbps of MPEG-2 or VC-1 video along with lossless, compressed audio on the fly is extremely taxing and uses a lot of power.
      • by Xest ( 935314 ) on Friday February 29, 2008 @10:17AM (#22600056)
        I'd be surprised if BD+ comes completely free in terms of additional processing load. But even the AACS layer has to be costly.

        I'm not sure how the interactivity features compare in terms of additional processor loads, but this could cause differences between the formats also.

        Whilst I understand the power required to render HD content, I think we must also bear in mind that we're looking at 20 GB - 30 GB of data that needs to be decrypted; that can't be easy on the hardware either, surely?

        I don't know if there's anything fancy they can do to lower the load, but even if there is dedicated hardware in the drive to offload this from the processor the dedicated hardware is still going to need some power.

        It'd be nice to see what proportion of resources is required for AACS, BD+, Java for Blu-ray discs, and the data decoding and rendering itself. Anyone have any ideas on this?
        • by ChoppedBroccoli ( 988942 ) on Friday February 29, 2008 @11:05AM (#22600682)
          Well certainly having hardware assisted decode with the new Intel chipsets will be a great improvement.

          From a recent anandtech review (http://www.anandtech.com/mac/showdoc.aspx?i=3246&p=2):
          "The Mobile GM45/47 chipsets are an integral part of Montevina and will feature the new GMA X4500HD graphics core. The X4500HD will add full hardware H.264 decode acceleration, so Apple could begin shipping MacBook Pros with Blu-ray drives after the Montevina upgrade without them being a futile addition. With full hardware H.264 decode acceleration your CPU would be somewhere in the 0 - 10% range of utilization while watching a high definition movie, allowing you to watch a 1080p movie while on battery power . The new graphics core will also add integrated HDMI and DisplayPort support."

          However, there is going to have to be some sacrifice in the user experience. I mean, you can't really expect to watch 30-40 GB of data in 2 hours and expect battery life not to take a hit. What would be ideal is if a single Blu-ray disc had both an H.264 version and a lower-quality MPEG-2/MPEG-4 version of the video. If I am watching on a laptop screen (hooking the laptop to an HDTV would be another story), I don't really need to see 1080p resolution.
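          As a rough sanity check (purely illustrative arithmetic, not figures from the article), here's the sustained rate that 30-40 GB over a two-hour movie implies:

          ```python
          # Rough, illustrative arithmetic only: the sustained data rate implied by
          # a 30-40 GB movie played over about 2 hours. Assumed figures, not
          # measurements from the article.

          def sustained_rate_mbps(disc_gb, runtime_hours):
              """Average bitrate in megabits per second."""
              bits = disc_gb * 1e9 * 8          # decimal gigabytes -> bits
              seconds = runtime_hours * 3600.0
              return bits / seconds / 1e6

          for size_gb in (30, 40):
              rate = sustained_rate_mbps(size_gb, 2)
              print(f"{size_gb} GB over 2 h -> about {rate:.0f} Mbit/s to read, decrypt and decode")
          # Roughly 33-44 Mbit/s sustained, versus about 5-10 Mbit/s for a typical DVD.
          ```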
          • What would be ideal is if a single Blu-ray disc had both an H.264 version and a lower-quality MPEG-2/MPEG-4 version of the video. If I am watching on a laptop screen (hooking the laptop to an HDTV would be another story), I don't really need to see 1080p resolution.

            Maybe someone can invent a disc with less capacity that stores lower quality video?

            Sarcasm aside, I agree there really is no need to see 1080p on a laptop screen. I really question why anyone would want to do that, but I suppose it beats the heck out of buying two different formats of the same movie.

        • Re: (Score:3, Insightful)

          by gravis777 ( 123605 )
          Yeah, but even standalone players seem to be power hungry, and a lot of the stuff is done in software, isn't it? I mean, Blu-ray discs use Java, for instance, for things such as menus. That in itself is hitting a processor of some kind, and if you update Java - you just cannot do that in hardware unless you update firmware - then aren't we back to handling it in software once again, kind of?

          Okay, I am sure that did not make a lot of sense. Sorry.

          Now, we probably could put in hardware the a
      • It makes it quite a bit easier if you have a graphics chip built to decode MPEG2 and VC-1, like the newer Intel GMA series chips
        • Re: (Score:3, Informative)

          It makes it easier on the CPU, but you're still consuming the power to decode. I'm sure it helps to a degree, but "quite a bit easier" on power consumption is still an overstatement.
      • by Kirth ( 183 )
        Of course it is also needed for the DRM. Those 20+ Mbps also need to be decrypted.

        In the early days of DVDs, I had some 450 MHz machine which was unable to play DVDs without stuttering, but was perfectly able to play the same MPEG-2 file without encryption with no problems. And not everything on the DVD is encrypted, precisely because of the (at the time very high) demand on the CPU.
  • Captain Obvious (Score:5, Insightful)

    by imsabbel ( 611519 ) on Friday February 29, 2008 @09:39AM (#22599642)
    Because _nobody_ would have known in advance that decoding 25+ Mbit/s of 1920x1080 H.264 (a task that redlines even dual-core desktop CPUs) could be a battery-consuming activity.
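    For what it's worth, a quick sketch of why this is so much heavier than DVD playback (typical frame rates assumed; nothing here is from the article):

    ```python
    # Illustrative comparison of raw pixel throughput for DVD vs. 1080p Blu-ray.
    # Frame rates are typical values, not figures from the article.

    def pixels_per_second(width, height, fps):
        return width * height * fps

    dvd = pixels_per_second(720, 480, 30)        # NTSC DVD
    bluray = pixels_per_second(1920, 1080, 24)   # 1080p24 feature film

    print(f"DVD:     {dvd / 1e6:5.1f} Mpixel/s")
    print(f"Blu-ray: {bluray / 1e6:5.1f} Mpixel/s ({bluray / dvd:.1f}x the pixels)")
    # About 5x the pixels per second, decoded with a more complex codec
    # (H.264/VC-1 vs. MPEG-2) - no surprise the CPU, and the battery, notice.
    ```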
  • o rly? (Score:5, Funny)

    by rarel ( 697734 ) on Friday February 29, 2008 @09:41AM (#22599656) Homepage
    I don't know, my new computer here looks fi
    • Re: (Score:3, Insightful)

      by rudeboy1 ( 516023 )
      Just one question... Why the hell do you need to watch a movie in HD on a 15 inch screen?
      • Re:o rly? (Score:4, Insightful)

        by pipatron ( 966506 ) <pipatron@gmail.com> on Friday February 29, 2008 @10:11AM (#22599982) Homepage
        Why the hell do you need to watch a movie in HD on your 42" screen? Your laptop probably has a higher resolution, and you can still see the pixels.
        • Re: (Score:3, Insightful)

          by rudeboy1 ( 516023 )
          It might be a higher resolution, but most laptops I use (I don't really consider the "desktop replacement", 15 lb monster as a "LAPtop" here) have integrated graphics and a processor that is designed for low power consumption. So, it's not just the resolution that is a factor here. There is also the software/video translation in realtime, not to mention HD sound coming through 2 tinny speakers or a pair of earbuds. I'm sure you might be able to tell the difference, but for all the negative factors, I thi
        • Re:o rly? (Score:4, Interesting)

          by MightyYar ( 622222 ) on Friday February 29, 2008 @10:55AM (#22600566)
          Isn't a laptop with 1080 lines of resolution pretty rare? Will a 1050-line laptop scale the image or just crop it? I'd hope that there is a crop option, since the scaling would probably use even more CPU and degrade the image.
          • Isn't a laptop with 1080 lines of resolution pretty rare? Will a 1050-line laptop scale the image or just crop it? I'd hope that there is a crop option, since the scaling would probably use even more CPU and degrade the image.

            1080i is really 1920 x 1080 - while it's not quite common yet, laptops shipping with higher-resolution screens have been supporting that resolution (or a bit more).

            Scaling though is a pretty lightweight effort for the system, especially compared to the decoding. You'd always have the i
            • Re: (Score:3, Interesting)

              by MightyYar ( 622222 )
              I realize cropping sometimes sucks, but if you had a 1680x1050 screen (as I do on my Desktop), I contend that the 15 pixels missing on the top and bottom will not be missed - nor will the 120 pixels on each side. You're talking about 13% of the width going away and less than 3% of the height. I'd much rather get the crisp "raw" pixels and lose a teeny bit of the edge than have the "smeared" look of scaling along with black bars on the top and bottom of an already tiny laptop screen.

              The HDTV spec calls for a
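              For what it's worth, the crop arithmetic above checks out; a quick sketch using the 1680x1050 panel as the example:

              ```python
              # Quick check of how much of a 1920x1080 frame is lost by cropping
              # (rather than scaling) to a 1680x1050 laptop panel.
              src_w, src_h = 1920, 1080
              panel_w, panel_h = 1680, 1050

              lost_w, lost_h = src_w - panel_w, src_h - panel_h
              print(f"{lost_w} columns cropped ({lost_w / src_w:.1%} of width, {lost_w // 2} per side)")
              print(f"{lost_h} rows cropped    ({lost_h / src_h:.1%} of height, {lost_h // 2} top/bottom)")
              # 240 columns (12.5%) and 30 rows (2.8%) - close to the 13% / 3% figures above.
              ```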
        • 90% of the movies I rent aren't available yet in HD, so "meh".

          Actually 1080p on a 42" TV is pretty amazing - you can definitely see the difference between 720 and 1080 at 42". Right now about half of the 42" TVs I've been shopping for are 1080p. Go see it at a TV store where the staff are sufficiently not-dim-witted and actually connect all the HD TVs to HD sources.

          1024-line laptops are common enough, but to display a wider picture than 4:3 aspect ratio you'll need to downsize the picture to 700 or even les
          • "90% of the movies I rent aren't available yet in HD, so "meh"."

            Don't worry dude. Panty Party 29 and Girls Gone Wild in Lower Mongolia 14 are coming to Blu-ray soon.
      • Re: (Score:3, Interesting)

        by teh kurisu ( 701097 )

        My laptop screen resolution is 1280x800. 720p resolution is 1280x720.

      • Re: (Score:3, Insightful)

        by Calinous ( 985536 )
        "Why the hell do you need to watch a movie in HD on a 15 inch screen?"
              Because you bought the BluRay edition of the movie to be able to watch it at home on your 42" plasma TV?
        • by tlhIngan ( 30335 )

          "Why the hell do you need to watch a movie in HD on a 15 inch screen?"

          Because you bought the BluRay edition of the movie to be able to watch it at home on your 42" plasma TV?

          Well, the better solution would be to do it the HD-DVD way. Put both the Blu-Ray and the DVD versions on one disc! The technology has been demonstrated, so it's doable.

          Of course, there is managed copy, but I don't see how that's supposed to work until Blu-Ray 2.0 players come out later this year able to do key negotiations and license

        • by hitmark ( 640295 )
          time to do a recode then. oh wait...
      • "Just one question... Why the hell do you need to watch a movie in HD on a 15 inch screen?"

        This is the same 15 inch screen people read emails off of. I don't have the best vision in the world, yet I still find your question baffling.
  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Friday February 29, 2008 @09:45AM (#22599704)
    Comment removed based on user account deletion
    • It's the decoding. H.264 in particular (which is getting to be rather favoured) but all the codecs on Blu-Ray take a ton of computation to decode. We are talking like 90% of both cores on a dual core CPU. That is what hits the battery really hard. Copying to a HD won't fix that.

      What will probably start happening is some hardware acceleration of the process. The newer models of the nVidia 8800 series support hardware acceleration of the HD codecs, and it apparently takes a bunch of load off the CPU. Something
      • Decoding still isn't free, no matter whether the CPU or the GPU does it. It still eats up power.

        The thing that would help most is to rip the movie to a resolution more compatible with your laptop... I know of very few laptops that can display 1920x1080 video at its proper resolution... convert it to 720p, something with fewer pixels. It'll make the decoding easier, the file smaller, and it will look better on your smaller screen anyway.
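        A rough sense of how much lighter a 720p re-encode is, pixel-wise (illustrative arithmetic only; real decode cost also depends on bitrate and codec settings):

        ```python
        # Pixel-count comparison between 1080p and 720p - a rough proxy for how
        # much decoding and scaling work the laptop is asked to do.
        full_hd = 1920 * 1080   # 2,073,600 pixels per frame
        hd_720 = 1280 * 720     #   921,600 pixels per frame

        print(f"1080p has {full_hd / hd_720:.2f}x the pixels of 720p per frame")
        # 2.25x fewer pixels per frame at 720p, and the file shrinks along with it -
        # which is the point about re-encoding for a laptop screen.
        ```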
        • The reason it takes less power to have dedicated hardware do it is that you can optimise for it. You don't think that standalone Blu-ray players come with Core 2 Duos, do you? When you have hardware performing a dedicated task, you can make it do more with less. That is the whole principle behind a GPU in the first place. You can get yet more optimised if you are talking about a very specific thing, such as having a chip that accelerates the codecs and maybe AES decryption. End result is you use a lot
          • You may recall the same kind of thing happening back with DVDs. CPUs just weren't powerful enough to decode DVDs in realtime, at least most weren't. So you'd get an MPEG-2 decoder card in the system to do it. If you looked at the card, you discovered it had a pretty simple ASIC that did it all. No heat sink, low clock speed, etc. However because it was designed for that one specific task, it was able to do it at full speed, despite being over all much less powerful than the CPU.

            Further, the next step of the
      • Comment removed based on user account deletion
    • by CODiNE ( 27417 )
      You know what, man, "managed copy" is gonna suck. I was at Walmart yesterday and I noticed among the $5 DVDs a small row of $13 DVDs in a reddish box. You can find great hits like "The Dark Crystal" and "Neverending Story" that usually go for $7.50 combined with another. In fact you can walk one aisle over and do just that. So why are these few lame old movies being offered at a drastically increased price? Because THOSE awesome DVDs are advertised to have a special DIGITAL copy already included that y
  • by TripMaster Monkey ( 862126 ) on Friday February 29, 2008 @09:46AM (#22599714)
    Before we start bitching about Blu-Ray, it's worthwhile to note that HD-DVD has (had, anyway) similar power requirements. From an Engadget article [engadget.com] (emphasis mine):

    For all the back and forth "we're better than you" rhetoric exchanged between the parties, the two really aren't that different. Both offer the same array of codecs and are driven by very similar power requirements. Essentially (and without intending any slight towards the HD DVD camp), anything an HD DVD player can do, a Blu-ray can do also.
  • by Adam Zweimiller ( 710977 ) on Friday February 29, 2008 @09:47AM (#22599724) Homepage
    Perhaps this is finally the sort of problem that will stir the average Joe consumer to be dissatisfied with the state of current battery technology, spurring innovation? Personally, I can't wait for Mr. Fusion in my laptop. My kids would all be the next WWE superstars with those kinds of irradiated swimmers in my loins.
    • Re: (Score:3, Informative)

      You mean innovations like this [gizmag.com]?
      • Re: (Score:2, Insightful)

        by cob666 ( 656740 )
        I can't imagine trying to get that thing onto a plane.
      • I suppose, but from a quick perusal of the article it looks as though A) it's not rechargeable. You have to buy new cells after 48 hours of use. Personally, I use my laptop that much within a week, so I'd be buying these pretty often. You could argue that it's no different than having to stop and fuel your car, but it really isn't as convenient as charging it from an outlet at home, work, school, or the local Starbucks. B) That article is two years old, and it talks about a commercial version possibly be
    • by mgblst ( 80109 )
      Yes, because that is all it takes for technological innovation to happen: consumer dissatisfaction. Like when people were so pissed off that the Sun was rotating around the Earth that they innovated it to stop, and go the other way round. Genius!
    • by Atraxen ( 790188 )
      We need to learn more in order to 'innovate' the next battery technology. And I seriously doubt any scientists are having an epiphany and saying, "Oh, wow! People need to watch HD movies on their laptops! I better move my cot into the lab to better supply consumer demand!" Especially since there are already substantial funding opportunities for research in areas leading to these developments - http://www.nsti.org/press/PRshow.html?id=1342 [nsti.org] for example (and there're probably better examples, but that's what
  • Even when my battery was new, I still wouldn't get more than 3/4 of the way through a DVD before having to plug in.

    Low end laptops never could play through a complete movie, regardless of whether it was on DVD or Blu Ray.

    It doesn't matter how much power Blu Ray consumes - there will always be a laptop manufacturer who skimps on the battery to cut costs. If you want to watch movies on a portable device, you have to buy a personal media player. Sad, but true.

    • by QuantumG ( 50515 )
      That certainly was the case... back when Linus started working for Transmeta. Things have progressed a bit since then; you might wanna buy a new laptop sometime.

    • you must have bought the cheapo laptop with the cheapo battery.

      I bought a mid-level laptop with the big battery because I'm not stupid.

      I didn't buy the blu-ray drive because it was $360.

      I can run my lappy on full brightness and wifi for over 3 hours.
    • by v1 ( 525388 )
      I was just thinking the same thing. The drive itself spinning up the disc made my previous laptop NOT a LAPtop for playback of movies. It also vibrated pretty badly, which you could not hear but could certainly feel. That's changed in new laptops but is still an issue. Playing back DVDs always heats up the laptop.

      I carry a couple DVDs around with me to watch, but I have a folder on my HD with several dozen video clips for entertainment on the road. Considering my HD is 200gb, (about par by today's standards for
  • by Chrisq ( 894406 ) on Friday February 29, 2008 @09:47AM (#22599730)
    Since HD DVD used the same lasers and the same compression codecs, I believe this would have applied to HD DVD also. This is not a case of "if only HD-DVD had won" but a basic technology problem.
  • by evilviper ( 135110 ) on Friday February 29, 2008 @09:48AM (#22599734) Journal
    From TFA:

    "The laser that runs the show [in Blu-ray players] is a very high-power laser," notes Mercury Research analyst Dean McCarron. That laser is one of the main things that conspire to raise power consumption.

    If the laser in a Blu-ray drive uses remotely as much power as your CPU or LCD backlight, you're going to be burning a hole through your laptop in just a few minutes... Where does the media go to always find these moronic analysts?

    • Does anyone know? (Score:4, Interesting)

      by Chrisq ( 894406 ) on Friday February 29, 2008 @09:51AM (#22599754)
      Is there any reason a high-power laser is needed for reading? Writing may have a power requirement, but I would have thought that to read a disc you could make up for a lower-power laser with a higher-sensitivity detector.
      • Re: (Score:3, Interesting)

        by cube135 ( 1231528 )
        I'm not sure, but it's probably the blue laser that the high-def formats use. It's a shorter wavelength laser than normal DVD or CD lasers, so it would take more power to create the beam. I can't see that taking more power than the 100% CPU needed to actually display the movie, or the LCD's power drain, though...
      • Re: (Score:3, Informative)

        by pnewhook ( 788591 )

        Reading does require less power than writing, but the power requirements are also related to read speed. So the laser on a 12x DVD reader needs to be higher power than one on a 1x DVD reader. Similar for Blu-ray.

    • Re: (Score:3, Informative)

      by pnewhook ( 788591 )

      If the laser in a Blu-ray drive uses remotely as much power as your CPU or LCD backlight, you're going to be burning a hole through your laptop in just a few minutes... Where does the media go to always find these moronic analysts?

      I agree - that's a misleading and idiotic quote from the analyst.

      Older 1GHz laptop CPUs use about 10W, while newer CPUs that you would probably want for higher end graphics capability are 30W or more (that's only the CPU not counting the GPU). A laser diode is about 5mW for reading a
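      Putting those numbers side by side (the wattages are the rough figures from this post; the 50 Wh battery is an assumed, typical capacity):

      ```python
      # Rough power budget using the figures quoted above plus an assumed,
      # typical 50 Wh battery. Illustrative only - real laptops vary a lot.
      cpu_watts_low, cpu_watts_high = 10.0, 30.0   # older vs. newer laptop CPU
      laser_watts = 0.005                          # ~5 mW read laser diode
      battery_wh = 50.0                            # assumed battery capacity

      print(f"Read laser vs. 10 W CPU: {laser_watts / cpu_watts_low:.3%} of the draw")
      print(f"Runtime at a {cpu_watts_high:.0f} W total draw: {battery_wh / cpu_watts_high:.1f} h")
      # The read laser is thousands of times smaller than the CPU/GPU figures;
      # decoding (and the backlight) is what actually drains the battery.
      ```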

  • Say it isn't so... (Score:4, Insightful)

    by binaryspiral ( 784263 ) on Friday February 29, 2008 @09:54AM (#22599776)
    New higher capacity optical storage medium takes more power to use?

    CD-ROM, then CD-RW, then DVD, then DVD-RW/RAM, and now Blu-ray... each step started with high power requirements and wasn't suited for mobile use. And almost every one of them was met with this kind of FUD. After some evolution of the technology, we seem to be surviving just fine with our current optical media.

    It's just going to take a few revs. of hardware improvements.

    As for HD video playback... well, that's another problem - just the sheer size of the data that needs to be decrypted and decoded... ouch.
  • Interesting... I wonder how HD-DVD and Blu-Ray compare in this regard? Anybody know?
  • by dalmiroy2k ( 768278 ) on Friday February 29, 2008 @09:57AM (#22599822)
    Last night I played a Transformers 1080p Blu-ray rip (10 GB MKV file) on my Vaio VGN-FZ340E.
    I used "Media Player Classic" with the latest K-Lite codecs, using just the stock battery and a medium power-saving mode, and everything went fine for the entire movie.
    Yes, playing these files may not be legal, but I just don't see a better or legal way to do HD with my current hardware.
    Same thing happens if you try to play a Blu-ray movie (assuming you have a drive) with Linux.
    • Transformers was one of the pivotal movies NOT available on Blu-ray -
      only DVD and HD DVD.
      The director was in the news quite a bit, heavily opposed to this decision by the studio.
      Transformers is still unavailable on Blu-ray - although that is expected to change.
      • Sorry, the right name was Transformers.2007.1080p.HDDVD.x264-snoopy.mkv, so I guess it was ripped straight from HD-DVD.
        300 was the Blu-ray-ripped one.
        Frankly, until I get a player I will just call them HD movies or MKVs, since they are all just data on my HDD.
    • The article was about Blu-ray discs and Blu-ray drives. You were using neither.
  • Usual story (Score:5, Insightful)

    by ledow ( 319597 ) on Friday February 29, 2008 @09:59AM (#22599842) Homepage
    It's the same old story, to a point. The performance required to do a relatively simple job (play fullscreen video) in a new way (using HD content and a new storage medium) means that it becomes impractical without upgrades. I can remember having to tweak computers to be powerful enough to play MP3's without skipping, but there at least you had the advantage that the storage space saved compared to even the best-compressed formats of the time was phenomenal.

    I freely admit that I absolutely do not "get" the HD fuss. It's the same thing we've had for years, with more pixels, that you can't reasonably see on a fair test past a certain distance (although I would say that on a high-res laptop you are more likely to spot the difference because of the unusually close eye-screen distance), with new storage formats, new compression, new software, new DRM and new performance characteristics... which are killing battery life. And, yes, eventually they'll start making "blu-ray acceleration cards" just like MPEG-acceleration, 3D-acceleration, etc., although in this day and age they're called "software on the GPU". But at the end of the day, you've gained little (a higher res that you might not be able to distinguish) for enormous performance increases.

    Where's the advantage in it when a "Blu-ray" PC can still play the DVD's of previous years but at much, much less expense... if you can play a blu-ray for two hours or you can play MPEG-2 for six (while compiling stuff in the background without jerkiness) on the same machine, what are you going to end up using if you watch a lot of video on your laptop?

    When I go away and know that I might want to view movies on my laptop (e.g. a long trip staying in cheap hotels, a stay over at a friend's house, etc.), I take either DVDs, or I have a bunch of MPGs/AVIs/VOBs etc. on the laptop itself or on DVD-Rs ahead of time. Quality isn't really the factor there; the advantage of having everything in a simple format that everyone can read easily and which doesn't tax the laptop is key.

    It's another case of "laptop = general purpose computer, so let's turn it into a media centre and make it do everything". It's nice that it's CAPABLE of everything but you can't expect a portable device to do it all AND give you good performance at everything. Laptops are not even desktop-substitutes for most work (the times I have to explain this to people... it costs pounds to repair a broken desktop, hundreds to repair a broken laptop).

    Let the early adopters waste their money. Even if Blu-ray becomes the de facto standard, I'd much rather just decrypt to disk and convert to a format that's easily readable, with extremely cheap media, that plays the video "good enough" for most things if I'm intending to carry it around with me. Much better to have 1 x DVD-R with a couple of full movies on it, which I can watch one after the other and back up for pennies, than 1 x Blu-ray with only a single movie on it, which I can't give to my friends and which kills my batteries just watching it.

    There was a time when I did exactly the same with DVD vs VCD - it's actually trivial to just copy several DVDs' worth of movies/TV shows to a DVD-R or even a CD-R and not worry about the quality. You're travelling - who cares whether it's HD or VCD quality so long as you can tell what's going on without eyestrain?
    • but there at least you had the advantage that the storage space saved compared to even the best-compressed formats of the time was phenomenal.

      Nonsense. MP2 was far less CPU-intensive, while compressing about 33% worse than MP3. Not a huge difference.

      I freely admit that I absolutely do not "get" the HD fuss. It's the same thing we've had for years, with more pixels, that you can't reasonably see on a fair test past a certain distance

      An argument that would have been twice as appropriate to make when DVDs

      • Which is why he started his post saying, "It's the same old story, to a point."
      • Nonsense. MP2 was far less CPU intensive, while compressing about 33% poorer than MP3. Not a huge difference.

        Many people can't hear the difference between an MP3 and a CD, but anyone can hear the difference between an MP2 and an MP3. A much lower-bitrate MP3 is listenable when compared to an MP2.

        I had a lot of basis for comparison back in the day: I spent a day at IUMA mangling files for them (if I knew then what I know now, I could have done it with a shell script in minutes) and browsing around their site - when they were just making the MP2-to-MP3 transition.

        • A much lower-bitrate mp3 is listenable when compared to an mp2.

          Not true at all. MP2 does need a higher bitrate, but not by too much (as I said, an MP3 can be about 33% smaller). And this is true today with GOOD MP3 encoders, while the difference was even less significant at the time.

          I can't possibly guess the specific difference you heard, but it's certainly not typical, and it does not reflect on the format.
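          As a purely illustrative reading of that "about 33% smaller" figure (the bitrates below are example values, not a definitive equivalence):

          ```python
          # Illustrative bitrates behind the "about 33% smaller" claim; the exact
          # equivalence point is debatable, these are just example numbers.
          mp2_kbps = 192   # a common "good quality" MP2 bitrate
          mp3_kbps = 128   # an MP3 bitrate often treated as comparable

          saving = 1 - mp3_kbps / mp2_kbps
          print(f"{mp3_kbps} kbps MP3 vs {mp2_kbps} kbps MP2: about {saving:.0%} smaller")
          # ~33% smaller for similar quality - a real saving, but not an
          # order-of-magnitude one.
          ```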
    • Re: (Score:3, Informative)

      by qoncept ( 599709 )
      Have you ever WATCHED anything in HD? I ask, but I'm already assuming you haven't. The difference is night and day. I won't argue that it makes watching a movie any more enjoyable, especially while travelling, but the difference in the picture is huge. That was a pretty long winded comment to have such a glaring hole.
    • Re: (Score:2, Insightful)

      by gkai ( 1220896 )
      My god, if you cannot see the difference between SD and HD, I hope you did not forget your white stick and black glasses, and do not own a car: you are THAT visually impaired. I am far from 20/20 vision, but at my normal viewing distance the difference is striking, on screens of any size... (a small screen just means I sit closer; I never really understood the common assertion "HD is only useful on huge screens"). Yes, an HD picture can be so-so, if the transfer was not good, if there is too much noise, a
  • by abaddononion ( 1004472 ) on Friday February 29, 2008 @10:00AM (#22599854)
    This might be a good time for me to try to sit through Star Trek IV or Highlander 2 again.

    Blu-Ray: making crappy old movies only half as crappy.
  • Huh? (Score:3, Funny)

    by sm62704 ( 957197 ) on Friday February 29, 2008 @10:03AM (#22599888) Journal
    Blu-ray In Laptops Could Be Hard On Batteries?

    Blu-ray is like Viagra?
  • And here's me, with no CD-ROM, CD-RW, DVD-ROM or even Blu-ray. True, it's an ultraportable laptop, so those things are neither needed nor desired. I could understand wanting Blu-ray in a multimedia laptop, but those things rape their batteries anyway. You want battery life away from the mains? Get an ultraportable. Simple.

    (Oh, and I have a good music and video collection stored locally on the laptop)
  • what's the point? (Score:2, Interesting)

    by night_flyer ( 453866 )
    HD Blu-ray on a 19" screen?!? I can't see the difference on my 32" screen... talk about overkill
    • HD Blu-ray on a 19" screen?!? I can't see the difference on my 32" screen... talk about overkill

      This comment only makes sense if you sit as far away from your laptop as you would from the 32" screen, or at least far enough away that the 19" screen occupies less of your visual field than the 32" screen would at the distance you typically view it.

      That seems rather unlikely, unless you typically sit unusually close to your 32" screen or use your laptop from an unusually great distance.
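      One way to make that concrete (the screen sizes and viewing distances below are assumed, typical values, not anything from the thread):

      ```python
      # Visual angle subtended by a screen diagonal at a given viewing distance -
      # a rough way to compare a laptop up close with a TV across the room.
      # Screen sizes and distances are assumed typical values.
      import math

      def visual_angle_deg(diagonal_inches, distance_inches):
          return math.degrees(2 * math.atan((diagonal_inches / 2) / distance_inches))

      laptop = visual_angle_deg(19, 24)   # 19" screen about 2 ft away
      tv = visual_angle_deg(32, 96)       # 32" screen about 8 ft away

      print(f'19" laptop at 2 ft: about {laptop:.0f} degrees of your view')
      print(f'32" TV at 8 ft:     about {tv:.0f} degrees of your view')
      # Up close, the smaller screen actually fills more of your visual field,
      # so per-pixel detail is easier to see - the parent's point exactly.
      ```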

    • I presume you sit closer to your laptop than to your TV...
  • Laptops with Blu-Ray have been available for a year now. What's with the "could be"? It should be pretty damn easy to test.
  • by dtjohnson ( 102237 ) on Friday February 29, 2008 @02:11PM (#22603402)
    Blu-ray uses a 405nm laser while DVD uses a 650nm laser. Photons emitted by the Blu-ray laser therefore contain about 60 percent more energy than the DVD photons. Bottom line is that one would expect a shorter-wavelength laser to use more power, all other things being equal. Maybe Blu-ray is the wrong format for laptops, though I don't know why anyone would want to watch a high-res movie on a little laptop screen anyway.
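    The per-photon arithmetic is easy to check (constants only; this says nothing about total drive power, which depends mostly on the diode and its electronics):

    ```python
    # Photon energy E = h*c / wavelength, comparing a 405 nm blue-violet laser
    # (Blu-ray) with a 650 nm red laser (DVD). Per-photon energy only; total
    # drive power also depends on diode efficiency and the drive electronics.
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s

    def photon_energy_joules(wavelength_nm):
        return h * c / (wavelength_nm * 1e-9)

    e_blue, e_red = photon_energy_joules(405), photon_energy_joules(650)
    print(f"405 nm photon: {e_blue:.3e} J")
    print(f"650 nm photon: {e_red:.3e} J")
    print(f"Ratio: {e_blue / e_red:.2f}x (~60% more energy per photon)")
    ```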
