When is 720p Not 720p? 399
Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays around the resolution of 720p (most DLP, LCD, and LCOS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all! "
Reminds me of Sound Blaster (Score:4, Insightful)
It doesn't matter whether you are sampling up or down; resampling is bad. Your best bet is to find a device without it, or, if it is necessary as in this case, one that does the best conversion.
If I bought one of these displays I would be pretty pissed, but I doubt there is much that can be done about it; if you COULD do something, then companies like Creative Labs would be out of business.
Re:Reminds me of Sound Blaster (Score:5, Informative)
The first incorrect thing in the
Anyone who is serious about getting the absolute most out of their display will have an external scaler and a device to delay the audio. Frankly as digital display technologies take more of a foothold in the market I'm hoping these interlaced resolutions will become far less common.
When I first read the headlines I thought they would perhaps talk about 1024x768 plasmas with rectangular pixels being marketed as 720p. That kind of thing is far more blasphemous in my opinion.
So in summary of TFA: 720p is not 720p when it's 1080i.
Re:Reminds me of Sound Blaster (Score:4, Informative)
Shameless Atari / Amiga Plug (Score:5, Interesting)
Re:Shameless Atari / Amiga Plug (Score:3, Interesting)
Re:Shameless Atari / Amiga Plug (Score:5, Informative)
Re:Shameless Atari / Amiga Plug (Score:3, Interesting)
Re:Reminds me of Sound Blaster (Score:5, Informative)
Lines 243-262 of each frame (off the bottom of the TV) start with 0.3V for 4.7us, and the rest is 0V. This tells the TV to prepare for a new frame.
This leaves just 242*2=484 lines of effective display.
http://eyetap.org/ece385/lab5.htm [eyetap.org]
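A quick sketch of the line-count arithmetic in the post above (figures follow the parent post; exact NTSC numbers vary slightly between references):

```python
# Sketch of the NTSC line-count arithmetic from the comment above.
# Line numbers and blanking counts follow the parent post.
LINES_PER_FIELD = 262          # approximate lines scanned per field
BLANKING_LINES = 262 - 242     # lines 243-262 carry sync, not picture

visible_per_field = LINES_PER_FIELD - BLANKING_LINES   # 242
visible_per_frame = visible_per_field * 2              # two interlaced fields

print(visible_per_field)   # 242 lines per field
print(visible_per_frame)   # 484 effective display lines
```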
Re:Reminds me of Sound Blaster (Score:3, Insightful)
But getting back to the subject, this article is goofy from the start. It supposes that the "proper" way to down-convert the 1080i signal is to first convert it to 1080p, then down-convert it; mostly because that's the way the "geniuses" at HQV have done it. But if you think about it, that's a brain-dead solution too. Why?
The 1080i signal is two 540-line fields, shot 1/60th of a second apart, each with half the picture data. If anything on the screen is moving, i
No, that's the standard (Score:4, Informative)
Re:Reminds me of Sound Blaster (Score:4, Informative)
Re:Reminds me of Sound Blaster (Score:3, Insightful)
This is one place where ONE standard would be fine. Remember good ol' NTSC? Single friggin' standard, and all TVs sold here support it with little to no problem.
The government (FCC) had a job to do, and it failed its citizens
Re:Reminds me of Sound Blaster (Score:4, Insightful)
It's there (Score:5, Funny)
The HD signal's still there... you just have to learn how to read between the lines.
For the inevitable /.ing (Score:5, Informative)
When is 720p not 720p?
Tom Norton, in his coverage of the Home Entertainment expo, brought something up that I was unaware of.
720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field. Reason being, it takes a lot less processing power to do this than to convert the image to 1080p and scale that, which would use all the information in the original signal to derive the 720p signal. If you have a display like this, it means that you're watching 540 lines of resolution upconverted to 720p. This is not HD, just like watching a DVD upconverted to 720p is not HD. Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height. While this is better than DVD, it's not HD in my mind. (Aside: Tom Norton mentions this in his review of the Screenplay 777 projector.)
If this is indeed the case, most people with 720p (or similar) projectors (and most DLP, LCD, and LCOS home theater projectors are exactly that) are not seeing what their displays are capable of. They're not, technically, even watching HD. This is crazy! How can this be? Why haven't we heard of this before? How are manufacturers getting away with it?
Over-reacting? Well, if you're an owner of a 720p (or any similar resolution) projector you're either gonna be really upset by this or you're just gonna be laissez-faire about it because there's nothing you can do and you're enjoying your projector just fine, thank you. But me, I don't even own any such projector and I'm a little ticked. But I guess I should really wait for evidence of how properly-done conversion looks in comparison before making any snap judgments. I'm sure that the people selling HQV (a processor chip that does it the RIGHT way) will set something up.
To me, this is a serious issue. Comments are welcome.
from: http://www.hdblog.net/ [hdblog.net]
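The arithmetic behind the quoted post can be sketched in a few lines (a toy model counting only vertical lines; real scalers also resample horizontally):

```python
# Minimal sketch of the two 1080i -> 720p paths the post describes.
# A 1080i frame arrives as two interlaced 540-line fields.
frame_lines = 1080
field_lines = frame_lines // 2        # 540 lines per field
target_lines = 720

# Cheap path: discard one field, scale the survivor 540 -> 720.
cheap_source_lines = field_lines      # only 540 real lines feed the output

# Proper path: deinterlace both fields to 1080p, then scale 1080 -> 720.
proper_source_lines = frame_lines     # all 1080 lines contribute

print(cheap_source_lines / target_lines)    # 0.75: upscaling from less detail
print(proper_source_lines / target_lines)   # 1.5: downscaling, detail preserved
```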
Re:For the inevitable /.ing (Score:5, Informative)
He's leaving one step out. 1080i is 540 lines scanned 60 times per second, offset by half a vertical pitch. 720p is 720 lines scanned 30 times per second.
To try to take two frames which are not occurring at the same instant, stitch them together, remove the motion artifacts, resample, and then display is just plain silly. And fraught with errors, as you are expecting a computer to determine which parts of the motion (over 1/60 of a second) to keep and which to throw away.
If you wanted high fidelity, you'd spend the money for a 1080p60 system. Then it wouldn't matter. Except that you would complain about the quality, because each frame you see was upsampled from only 540 lines of resolution.
It all comes back to the fact that the FCC let the industry choose this "18 formats is good" spec.
Personally, I'm in favor of an olympic standard mayonnaise, but...no...wait...awww hell, I give up.
Re:For the inevitable /.ing (Score:5, Informative)
720p is 720 lines scanned 30 times per second.
Mostly incorrect.
There are 18 recognized MPEG stream formats for HDTV.
In presenting these on a monitor, your receiver/settop box/whatever is supposed to turn them to a format that your monitor can handle. This will typically be one of these four:
It is noteworthy, though, that some videophile monitors can handle, and set-top boxes deliver, 1080 row 60Hz progressive.
As for the presence/absence of interlacing, I agree that it is very bad to use interlacing at the stream level. This should be eliminated. I would make an exception for the 480 modes, because the material may have been originally captured on NTSC videotape, in which case some sort of conversion would have to take place to get a progressive image, and I feel very strongly that conversions should never be done for broadcast unless absolutely necessary (as when showing PAL/SECAM native material).
On the other hand, at the monitor level, if you have an interlaced monitor, I don't think that is a major issue. In 1080 mode, the best picture that can be sent is the 30fps progressive stream. This can be interlaced for presentation on a CRT.
Now, someone commented that CRTs are dead. Not if you have a budget, they're not! I've owned an HD set for over three years now, and it only ran me $700. It is a CRT. It has a beautiful picture.
Further, I would put forth that CRTs, in addition to being significantly cheaper than the alternatives, also put out a better picture than LCD (view from any angle; accurate color rendition; no lag), are less susceptible to burn-in than plasma (which will be killed by network bugs) and do not exhibit the rainbow effect of DLP (which, in fairness, is not really all that bad). Their major failings are their physical size and power consumption.
Re:For the inevitable /.ing (Score:3, Interesting)
Re:For the inevitable /.ing (Score:3, Informative)
As an example, if you own the above box, or the PVR box, (both are silver and provided by Comcast), do the following:
Turn off your cable box, then press 'setup' on the remote. You'll get a different config screen, one which allows you, among other things, tell the box what '
Resampling (Score:5, Insightful)
The clever algorithm is a "Fourier transform" (Score:5, Informative)
It's not a distortion-free transform, since high frequency signals (e.g. sharp edges) in the original image get interpreted as smooth changes and can get blurred between multiple pixels in an upsampled signal. But then again, that's exactly the sort of thing that happens when you digitize a picture in the first place - if you have a sharp black/white edge that passes through the middle of a pixel, the most accurate thing you can do is make that pixel gray.
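As a rough illustration of the Fourier-domain view, here is a toy sinc (spectrum zero-padding) upsampler applied to a sharp edge; the interpolated samples land between black and white, showing the "gray pixel" smoothing described above. This is a from-scratch sketch, not how any display actually implements it:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for tiny toy signals)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def fourier_upsample(x, factor):
    """Upsample by zero-padding the spectrum (ideal band-limited interpolation)."""
    N = len(x)
    X = dft(x)
    M = N * factor
    half = N // 2
    # Insert zeros between the low and high halves of the spectrum.
    Xpad = X[:half] + [0] * (M - N) + X[half:]
    return [factor * v.real for v in idft(Xpad)]

# A sharp black/white edge...
edge = [0.0] * 4 + [1.0] * 4
up = fourier_upsample(edge, 2)
# The interpolated samples near the edge are neither 0 nor 1: that is
# the "gray pixel" blurring (plus some Gibbs ringing) described above.
print([round(v, 3) for v in up])
```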
Re:The clever algorithm is a "Fourier transform" (Score:2, Informative)
Disagree with the second paragraph. Upsampling should have more Gibbs ringing around the discontinuities in the image. This is why one introduces a Hanning or Hamming window during sinc interpolation. Whether that manifests itself as a blurring depends on the context of the discontinuity within the image. Upsampling via FT is not the same as linear (or even simple non-linear) weighting during digitization.
Re:Resampling: Imagine a 1-pixel-wide line (Score:5, Informative)
It's a bit messy. Imagine a 1080i image with a 1-pixel-wide sloping black line that is nearly horizontal on a white background. If you throw out half the data, you create an image with a dashed line. Gaps in the line occur where the slanting line cut across the rows that were discarded. If you upsample from 540 to 720, you will find that the remaining dashes become fattened non-uniformly. In places where a row in the 720-row image falls directly on top of a row in the 540-row image, the line will be thin and dark. In places where the 720-row image falls midway between rows in the 540-row image, the line will be wide and less dark. The end result is that the thin, uniform black line is converted to a dashed line of varying thickness and darkness -- not pretty.
Even if you resample directly from 1080 to 720, you still run into problems where the 720-row image pixels fall between the 1080-row pixels. At best, you can use higher-order interpolation (e.g. cubic) to try to fit a curve through the original data and estimate what was in the middle of the pixels so they can be shifted halfway over. But the result will never look like an image that was taken with a 720-row camera in the first place.
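A toy version of the dashed-line thought experiment (much smaller than 1080 lines, and using a 45-degree line for simplicity rather than a nearly-horizontal one):

```python
# Toy demonstration of the dashed-line artifact described above:
# a 1-pixel-wide slanted line drawn into an image, then every other
# row discarded, as a field-discarding scaler does.
HEIGHT, WIDTH = 12, 12

def draw_slanted_line():
    """Black line on white background: 1 = white, 0 = black."""
    img = [[1] * WIDTH for _ in range(HEIGHT)]
    for x in range(WIDTH):
        y = x                      # 45-degree slope for the toy case
        img[y][x] = 0
    return img

img = draw_slanted_line()
one_field = img[0::2]              # keep even rows only (one field)

# Count black pixels before and after: half the line simply vanishes,
# leaving dashes where the line crossed the surviving rows.
before = sum(row.count(0) for row in img)
after = sum(row.count(0) for row in one_field)
print(before, after)               # 12 6
```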
Re:Resampling (Score:3, Interesting)
However, even this is not a problem in practice since in real-world pictures nearby pixels are not independent. By using an appropriate encoding dictionary such as wavelets, which zoom in on sharp edges and economize on flat surfaces, you can shrink a typical picture by something like 90% without visible quality loss.
Now since
Which Models? (Score:5, Interesting)
Re:Which Models? (Score:5, Interesting)
If there is a difference you can't see but could learn to see, don't learn; it will not bring you joy, it will only make you miserable or annoying. Long ago I learned to see the FFT distortion in JPEG and MPEG images. Has it made me happy? No. I end up making the JPEGs on my website bigger than everyone else's so I won't see wrinkles on people's faces that are apparently invisible to everyone else. And I can't stand to watch satellite television on a big screen TV because of the annoying compression artifacts.
Re:Which Models? (Score:5, Funny)
This is not a solution.
At Best Buy and Circuit City I've seen lots of SD signals on HD displays. How on earth am I going to know if it's the set or the signal that's producing all those jaggies? Ask? At Best Buy? I might as well ask them to build a moon rocket while they're at it.
Knowing the stats won't necessarily guarantee you a better picture, but it is a better place to start.
TW
Re:Which Models? (Score:3, Insightful)
So don't go to Best Buy or Circuit City to evaluate your monitors! Go to a high-end video shop to evaluate your monitors and then go to Best Buy or Circuit City to buy them if the prices are really that much better.
Seriously, do you want to solve the
Re:Here's how to tell. (Score:3, Insightful)
First off, DVHS and HDTV PVRs exist in tiny numbers. I am not one of the guys who owns one. Neither are the vast majority of the people reading your post. It's a great idea, but a bit of a stretch to ask someone to purchase equipment that costs half a grand minimum before even going out to purchase their first HDTV. Once again, great idea, just not something very many people are going to be able to take advantage
Re:Which Models? (Score:4, Insightful)
That's not necessarily good advice. At the showroom, everything you see is optimized for selling the display. You might not notice any problems until you start to view content outside of their controlled environment.
Re:Which Models? (Score:3, Informative)
Re:Which Models? (Score:3, Insightful)
The problem though is that it's BS that the cost of the processing affects the price of a multi-thousand-dollar system. I can't imagine that a simple $100 PC video card couldn't be stripped to its bare chips and used as the filter between whatever proprietary system they use and the input signal processor. It can't possibly cost more than $500 to push an off-
Workaround is to use an HTPC... (Score:5, Informative)
As a reference, my Athlon XP running at 2.4 GHz (approximately equivalent to an Athlon XP 3400+) with a GeForce 6800GT and TheaterTek 2.1 software will have (little) trouble achieving this, assuming the 1080i source isn't glitchy itself.
Alternative is to use the NVIDIA DVD Decoder version 1.0.0.67 ($20 US after 30 day trial) and ZoomPlayer 4.5 beta ($20 beta or nagware) for similar results.
TheaterTek is roughly $70 US and includes updated NVIDIA DVD decoders - too bad NVIDIA hasn't updated their official DVD decoders with the bugfixes that are present in the TheaterTek package.
Re:Workaround is to use an HTPC... (Score:2)
OK, then let's look at your DVD signal path. 480i converted to 1080i, then sent to your display that converts it to 720p?? Two resolution conversions - and the article states that the second one may
1080i streams... (Score:2)
If you have Microsoft's Media Center Edition 2005, you can specify the NVIDIA DVD Decoder (or other competent mpeg2 decoder, such as Elecard's or WinDVD's) for all mpeg2 content including HDTV.
Re:Workaround is to use an HTPC... (Score:2)
DVDs actually often contain ED (480p), not SD (480i) material. That's why there's a benefit to hooking up your DVD player to your TV's progressive inputs (if it has them).
Re:Workaround is to use an HTPC... (Score:3, Informative)
But does a $200 19" CRT have enough "dots" to display all the pixels in a 1920x1080p picture? (I'm not sure. I really want to know.) My knowledge of display technologies is limited, but I think 19" CRTs in this price range don't have enough "dots" (calculated from dot pitch [howstuffworks.com]) to display all of the pixels and will not give a "true" 1920x1080p picture.
Example: I've been thinking about getting a Samsung 997DF [samsung.com], which
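The dot-pitch arithmetic the parent is asking about can be sketched roughly like this. The viewable size and pitch below are illustrative assumptions, not the measured specs of the 997DF or any particular monitor:

```python
# Back-of-the-envelope check: how many phosphor triads fit across a
# CRT of a given size and dot pitch, and whether that covers 1920
# horizontal pixels. All figures are illustrative assumptions.
VIEWABLE_DIAGONAL_IN = 18.0      # typical viewable area of a "19-inch" CRT
HORIZ_DOT_PITCH_MM = 0.20        # assumed horizontal dot pitch

# 4:3 aspect ratio: width = diagonal * 4/5 (3-4-5 triangle).
width_mm = VIEWABLE_DIAGONAL_IN * (4 / 5) * 25.4
triads_across = int(width_mm / HORIZ_DOT_PITCH_MM)

print(triads_across)             # roughly 1828 triads across
print(triads_across >= 1920)     # just short of 1920 "dots"
```

Under these assumptions the tube falls a little short of resolving 1920 distinct columns, which is consistent with the parent's suspicion.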
Re:Workaround is to use an HTPC... (Score:3, Insightful)
This will, of course, suck.
There are two different ways to get 1080i material. You can either shoot some other format at 24 frames per second and convert it to 1080i, or you can shoot 1080i.
If you shoot film or 1080/24 at 24 frames per second and convert to 1080i, there are a set of well-defined tricks you can use. These tricks are collectively referred to as "pulldown." It's possible to remove pulldown, which is great
When is 720p Not 720p? (Score:5, Insightful)
Somewhat offtopic, but still.. (Score:2)
Which can also be the only explanation for why anyone would try to encode HD content on today's DVDs.
I can't understand anything other than that the DCT artifacts would be even more dominant than on today's DVDs (or DVD rips).
Re:When is 720p Not 720p? (Score:2)
Re:When is 720p Not 720p? (Score:3, Interesting)
If you can't see the problem, is there a problem? (Score:4, Informative)
Re:If you can't see the problem, is there a proble (Score:4, Informative)
If you've ever seen high-def on a 480p EDTV plasma, you'll understand just how superior the picture STILL is compared to 480i NTSC.
Nevertheless, true 1080p deinterlacing is coming down the pike right now. Faroudja, SiliconOptix, and Gennum have all created solutions, and we should begin seeing them in external video processors and displays soon.
Well, a little worse, actually... (Score:4, Informative)
Except your 720p display will hopefully have a horizontal resolution of 1280. 1080i video has a horizontal resolution of 1920. So, you're keeping half of the vertical (1080 lines down to 540) and you're keeping 2/3rds of the horizontal (1920 down to 1280).
Ouch.
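The fractions in the comment above work out as follows (a straightforward bookkeeping sketch):

```python
# Pixel bookkeeping for a field-discarding scaler: starting from a
# 1920x1080i source, keeping one 540-line field and showing it on a
# 1280x720 panel retains only a fraction of the original samples.
src_w, src_h = 1920, 1080
panel_w, panel_h = 1280, 720

kept_h = src_h // 2                     # one 540-line field survives
vert_fraction = kept_h / src_h          # 0.5
horiz_fraction = panel_w / src_w        # 2/3

print(vert_fraction)                              # 0.5
print(round(horiz_fraction, 3))                   # 0.667
print(round(vert_fraction * horiz_fraction, 3))   # 0.333 of the source pixels
```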
Consider the source too! (Score:3, Informative)
And in other instances, the broadcaster will not use the full resolution - what looks like 1920x1080i may actually be an upconvert of 1280x1080i, 1440x1080i, or 1280x720! And then there is the overcompression - taking a 20 Mb/sec MPEG-2 stream and cutting the bitrate in half - compression artifacts galore.
It is sad when HDTV programming available in North America
Re:Consider the source too! (Score:2)
So most broadcast SD material upconverted to HD resolution still looks better than had the material been broadcast in analog NTSC.
Re:Well, a little worse, actually... (Score:2)
Re:Well, a little worse, actually... (Score:4, Informative)
Hi, I just got back from the NAB show in Las Vegas last week. The vendors had HD cams that would film and record 1920x1080i. That "some point" is today.
Good Ol' CRT (Score:5, Insightful)
Re:Good Ol' CRT (Score:2)
The trendier offerings sell to first-adopters and very rich people: those in the first group get their kicks inviting friends home to hear them go "ooh..ahh..wow", not really out of the better quality, and the second group just doesn't care about the price.
When the early adopters are done early-adopting, then it gets affordable for people with regular lives, like you and me.
Re:Good Ol' CRT (Score:3, Insightful)
Re:Good Ol' CRT (Score:2)
The reason Plasma/LCD are on the market wasn't because they were selected for image quality. Quite a ways from it. Instead, they were able to scale to sizes greater than 40".
Tube based HDTVs are only manufactured to about 36" today. And they are an awesome value for what you get... they've got an excellent image quality. The problem is that to fully resolve a 1080i image with your eye, you've got to be sitting pretty close t
Re:Good Ol' CRT (Score:4, Insightful)
its really too bad (Score:2, Insightful)
Re:its really too bad (Score:2)
Then I guess it's a little late to consider yourself an early adopter ;)
This is what you get..... (Score:3, Informative)
That said, I'm sure there are a lot of people out there who "don't care". It works for them, and that is all they care about.
Re:This is what you get..... (Score:3, Insightful)
Re:This is what you get..... (Score:5, Insightful)
Re:This is what you get..... (Score:2)
Yeah, that's what I thought too, except that my research is usually about 2x better than the QA of most companies today. I've found that it's best to pay a premium or do without, and when paying a premium, have some integrator or store stand behind the shitty QA of the products. Even then, it's a pain in the ass, but it certainly beats buying something from a company like Newegg and paying a 15% restocking fee for them
Should have bought a 1080i screen then! (Score:5, Insightful)
When I get around to buying an HD television (not any time soon, I do all my televisioning on my computer), it will be a true 1080i (are there 1080p televisions?) display so I'll know I'm getting the full potential of HD.
Unless I'm strapped for cash, of course, in which case I'll just suck it up and know my 720p won't be the best thing for watching 1080i content on.
On the plus side, it's important to get the facts out there for the consumer, who will likely (although not logically) assume they're getting more than they really are.
Re:Should have bought a 1080i screen then! (Score:3, Informative)
Personally I'm getting a Samsung 6168 model.
Re: (Score:3, Informative)
Re:Should have bought a 1080i screen then! (Score:3, Informative)
Comment removed (Score:4, Informative)
Hey, Bloggers... (Score:5, Insightful)
Otherwise, people might assume this is a shameless attempt to draw traffic to your site.
Exacerbate or mitigate? (Score:2)
Anyone with any real regard for picture quality, and whose equipment leaves them the choice, has probably evaluated it under both configurations anyway.
Which is the reason... (Score:2)
This is the reason I bought a 52" rear-projection screen rather than LCD/plasma or whatever. That, and it was 3000 bucks cheaper and had a better picture.
spatial vs temporal resolution (Score:5, Informative)
Re:spatial vs temporal resolution (Score:4, Insightful)
The "digital feature box" in the TV is supposed to combine the two fields into one single frame. This is usually referred to as "motion compensation" or some other nifty marketing term.
This is what separates the cheap from the expensive TVs.
Comment removed (Score:5, Insightful)
ED displays (Score:3, Interesting)
Re:ED displays (Score:2)
Re:ED displays (Score:3, Informative)
Many current, popular 42" plasma sets are either "HD" resolution (typically around 720 lines) or "ED" resolution (480p). No one argues that the HD doesn't provide a slightly superior picture for HD content, but many argue that ED is preferable for non-HD (SD) and DVD sources. And the price difference between the two can be dramatic ($2500 vs. $5000).
For that $ difference, I was willing to compromise. Some purists will make
only HD is HD (Score:2, Interesting)
The bigger trick is finding a 1080i broadcast (Score:2)
Comment removed (Score:4, Insightful)
Depends on the conversion (Score:2)
If they convert each individual 1080i field (1920x540) to 720p (1280x720) then they are not tossing any fields (which seems to be the problem). So, if they are converting 1080i@60 fields to 720p@60 frames, then there is no problem here. If, however, they are converting to 720p@30 frames, then they are tossing half the fields from 1080i and we have a problem. All depends on how the conversion
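The distinction this comment draws can be written out as simple bookkeeping (a sketch of the two hypothetical conversion paths, not any particular display's behavior):

```python
# Field/frame bookkeeping for the two conversions distinguished above.
# 1080i delivers 60 half-height (540-line) fields per second.
fields_per_second = 60

# Path A: scale every field to a full 720p frame -> 720p at 60 fps.
# No field is discarded; all the motion samples survive.
path_a_frames = fields_per_second            # 60 frames/s

# Path B: keep one field per frame pair -> 720p at 30 fps.
# Half the fields (and their motion samples) are thrown away.
path_b_frames = fields_per_second // 2       # 30 frames/s
discarded_fields = fields_per_second - path_b_frames

print(path_a_frames, path_b_frames, discarded_fields)   # 60 30 30
```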
Solution: use a real 720p signal! (Score:2)
It's a big enough shame that this crap has found its way into the HDTV specs, but WTF, why does anyone use it?!?
Not news - Buy a scaler. (Score:5, Informative)
Anyway, it's fairly well known that the internal scalers in many devices suck. That is why there is a market for good external scalers. If you are paranoid about watching a lot of 1080i on your 720p projector or LCD TV or Plasma, go buy a scaler. They cost about $1000 but will improve scaled display a lot.
At least if you have an external scaler you will have some options about how you convert 1080i to 720p. The article makes it sound like splitting the fields is a huge sin -- and it is if you discard one field per frame (half-field deinterlacing), but it's perfectly acceptable to scale EACH 540-line field to a separate 720-line frame and double the framerate. This is called bob deinterlacing and is often the best for converting 1080i video to lower resolutions. If you are watching a 1080i upconvert of a film or something, though, you can have the scaler do full-field deinterlacing and inverse telecine for you and see a nice 720p/24fps picture. Scalers also generally have internal audio delays for various types of audio feeds, so you won't have to worry about AV sync issues either.
If you have any questions about how your device does this, you should try to find out before you buy it. Most devices don't publish how they do it, though, so your only option may be to derive it -- and that will not be an easy job.
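For the curious, bob deinterlacing as described above can be sketched in a few lines (a toy implementation on tiny lists of rows; real scalers use far better interpolation than simple averaging):

```python
# Minimal sketch of bob deinterlacing: each field becomes its own
# full-height frame by interpolating the missing lines, so 60 fields/s
# in gives 60 frames/s out. Frames are lists of rows of pixel values.

def split_fields(frame):
    """Split an interlaced frame into its even-line and odd-line fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Expand one field to full height by averaging adjacent field lines."""
    out = []
    for i, row in enumerate(field):
        out.append(row)
        if i + 1 < len(field):
            nxt = field[i + 1]
            out.append([(a + b) / 2 for a, b in zip(row, nxt)])
        else:
            out.append(row[:])   # last line: just repeat it
    return out

# Tiny 4-line "interlaced frame" of 2-pixel rows:
frame = [[0, 0], [10, 10], [20, 20], [30, 30]]
top, bottom = split_fields(frame)    # [[0,0],[20,20]] and [[10,10],[30,30]]
print(bob(top))                      # [[0, 0], [10.0, 10.0], [20, 20], [20, 20]]
```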
gee, am I surprised (Score:2, Insightful)
Buy crap equipment and you will get crap.
More Like When is HD Not HD (Score:4, Informative)
If you're looking to get into HD there are a *lot* of little quirks to take into account, such as:
- Officially there are two HD resolutions, 720p and 1080i
- Most HD TVs are only capable of *one* of these resolutions. So you have to choose: 720p OR 1080i in most cases. If you want one that can do both, check very carefully.. forget DLP or LCD based devices (fixed set of pixels, so fixed resolution); CRT only.
- Many HDTV's will *not* convert from one format to another. They accept only their native resolution.
- Different networks broadcast using one standard or the other. For example CBS uses 1080i and ABC 720p IIRC. Fox is way behind in HDTV support.
- Most HDTV receivers can handle either a 720p or 1080i signal and will convert as required for your TV's native resolution.
- Some TV providers only support one format, regardless of the source material. E.g., in Canada Starchoice only broadcasts in 1080i. Any 720p content they have they upconvert to 1080i before broadcasting. It's impossible to receive a native 720p signal from them.
- The Xbox supports both HDTV modes... but very few HD games actually use 1080i (Dragons Lair being one). Most are 720p. So if this is important to you, you'll possibly want a 720p native TV: most receivers do not have HD inputs that would let you upconvert a 720p game to a 1080i signal for the TV. (the new Xbox will have more HD content than the current one, but it's a good bet that they'll be mostly 720p titles)
- Most projectors and plasmas are *not* HDTV. They are EDTV (enhanced definition) or some such. Check the specs carefully.
- Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled horizontally! Few projectors have a true HD native resolution.
So there you go... lots of fun things to take into account!
Re:More Like When is HD Not HD (Score:3, Informative)
I have a Mitsubishi X390U linked up at home - it's your typical 1024x768 resolution. I've got the comcast HD box linked up to a TiVO (SD only) and (HD feed) directly into the amp's component inputs. The result is that I can switch between HD & SD at the flick of a button. It all gets projected onto an 8' screen.
The difference betw
there are not many 720p displays to begin with. (Score:3, Informative)
Only cheap projectors or displays have a maximum res of 720p.
I don't see many of those anyhow.
But yes, on those displays the signal is downconverted by chopping out 1/2 of it.
However, these displays are not popular anyhow.
Some of the most popular displays still can't display native resolution, but they can still display either XGA or SXGA with no problem (we're getting pretty close to HD at this point).
Don't buy a cheap projector, and you won't get a cheap display. You get what you pay for.
How to prove/disprove it easily. (Score:5, Interesting)
The results may be one of the following:
Much ado about nothing. (Score:4, Informative)
The whiners in TFA mistakenly assume that 2 fields of 1080i = 1 frame of 1080p. This is WRONG, WRONG, WRONG.
It cannot be assumed that the following field has anything to do with the current one. See the "not resized or deinterlaced" picture here:
http://www.100fps.com/ [100fps.com]
When the television takes the 540 lines in a given field, interpolates the missing lines, and scales to 720 lines, it is DOING THE RIGHT THING. Otherwise your TV would look like the first two example pictures at the above site.
Nathan
Look at the Source (Score:3, Informative)
I'm almost sure their scaler will help with most sources you feed your 720p HDTV; what it can do with 480i DVDs is impressive enough that you would believe that. However, I doubt the problem is as bad as they say it is. Also, 1080p DLP sets are going to start hitting the market soon, and in a couple of years 720p will probably have been pushed out of the market mostly. Given what a scaler costs, I'd probably save my money to get the 1080p set in a couple of years, since the 720p sets still look great.
I have a 1080i set, but I considered a 720p DLP set since they looked amazing and only didn't because of cost.
Re:Look at the Source (Score:3, Insightful)
Easy Fix (Score:3, Informative)
If you're a real quality nut like me then get a tube based HDTV, they can actually get close to doing 1080i.
Silicon Optix ASTROTURF? Naw, it couldn't be... (Score:3, Informative)
Gennum, Pixelworks, Genesis, Oplus (now Intel), and several others make their own scaler/deinterlacer chips. Most of these have already found their way into displays and have proper deinterlacing strategies in them. Nobody scales without deinterlacing first anyway in a modern image processor.
Silicon Optix's technology is based on offline film processors by Teranex. While they can certainly be high quality, they aren't the top of the heap either by volume or by prestige. Genesis/Faroudja had a name for a long time with their "line doublers", which are over 20 years old, and their more advanced but cheap gm1601 is one of the more popular solutions for HDTVs. Gennum's GF9350 with VXP technology is currently in the largest plasma TV in the world (Samsung 80"). These and other scaler/deinterlacer chips have none of the problems that Silicon Optix claims exist. If you look at the debates that rage over at the usual enthusiast sites, you'll see that Silicon Optix's own technology has issues, like latency and cost, that aren't present in the other solutions I mentioned.
Just like Silicon Optix's "odd film cadence technology", which requires nothing different than what everyone else has today, this reeks of a cheap PR vehicle. While the choice of scaler and deinterlacer is important, it is not the utter tragedy that SO would like to make it out to be, nor are they the saviors of the HDTV world. If they know who the culprits are, then let them name whose image processor it is that creates these problems.
CD Audio all over again. (Score:4, Interesting)
Oh, this was going to be great. Fidelity like you never had it before. No scratches. No groove wear. Dynamic range you won't believe. Crystal clear highs. Thunderous lows, with no rumbling feedback even if you sat your player on your speaker.
Remember the little logos? AAD? ADD? DDD was the best you could have (digital recording, digital mastering, and (obviously) digital media in your hand). And a lot of hard work on the part of the engineers operating the mixing boards. It's that last part that costs time and money. Now, all the equipment is digital. So, it's all great, right? Sorry--the technology is not the limiting factor in sound quality anymore.
The limiting factor is apathy. Most people can not really hear the difference. And fewer people care.
Exactly the same thing is now happening in video.
Since we can't improve the functionality (well, we could, but you'd never notice), it's pure hype from here on out.
Now, where'd I leave that case of speaker spikes and green markers? Gotta get 'em up on ebay; David Hannum was right.
Decide for yourself (Score:3, Interesting)
I have seen pros and cons on how these sets do their sampling. Here is my advice -- go look at the picture on a set with a good HDTV source. Use the specs as a guide but don't trust them. Get what looks good to you. My father would never have been able to see anything more than the quality of a good DVD. He couldn't see the difference between crappy digital cable and DVD. Some people like me are so manic about visual quality we will devote huge amounts of time tweaking our systems. While my system is probably limited to about 1500 lines of resolution due to the lenses, I find its image much warmer, more uniform, and more pleasing to the eye than the pixelated look of some very high-end flat-screen solutions that go for 10-20k.
About the only thing that really shows how good HDTV can be is material that is shot originally with HDTV video cameras. Upconverting film inevitably introduces a softness that is exaggerated by systems like mine. For now you can only see a few things on the Discovery Channel and a few musical events in true HD (meaning not upconverted from film). I mention this because while I advise you to go see for yourself (if possible), most stores don't really offer a good enough HD signal to display the difference. If you can hold out a little longer, I would wait until either HD-DVD or Blu-ray players hit the shelves and then demand a demo with these HD sources before making a decision.
One final note: I have noticed that 1080i doesn't show as much combing artifact during motion as I would have expected, but there is still a noticeable blurring during camera pans (maybe this is just combing in disguise). I'm sure I will get a little boost in quality when I can play off of a true 1080p source. If I were to design the next generation of video recorders, I would introduce variable frame rates in playback: the picture would be refreshed at up to 120 fps, but actual picture updates would happen only as often as needed to eliminate motion blur. Storing a full 120 fps would be inefficient and overkill. The system I propose would record true frames at around 30-60 fps, but as the camera moves, the edges would be scrolled in as needed to keep motion smooth and fluid. A really intelligent system might also track one or two moving objects across the field and give them higher frame rates as well. At 2 megapixels, I think we need to retrench and try to slay motion blur before moving on to higher pixel counts.
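The storage argument in the comment above can be made concrete with rough arithmetic. All the numbers here are illustrative assumptions (uncompressed 4:2:0 frames, a 64-column edge strip per refresh), not a real codec design:

```python
# Illustrative bandwidth comparison for the adaptive-frame-rate idea.

WIDTH, HEIGHT = 1920, 1080     # ~2 megapixels, as in the comment
BYTES_PER_PIXEL = 1.5          # 4:2:0 chroma sampling, uncompressed

def frame_bytes(w=WIDTH, h=HEIGHT, bpp=BYTES_PER_PIXEL):
    """Bytes in one uncompressed frame (or partial-frame strip)."""
    return int(w * h * bpp)

# Brute force: store every one of 120 full frames per second.
brute_force = 120 * frame_bytes()

# Proposed scheme: 30 true frames/s, plus a thin "scrolled-in" edge
# strip (assume 64 columns) for each of the other 90 refreshes
# during a camera pan.
adaptive = 30 * frame_bytes() + 90 * frame_bytes(w=64)

print(f"brute force: {brute_force / 1e6:.0f} MB/s")  # brute force: 373 MB/s
print(f"adaptive:    {adaptive / 1e6:.0f} MB/s")     # adaptive:    103 MB/s
```

Even with these crude assumptions, the edge-scrolling scheme cuts raw data to roughly a quarter of brute-force 120 fps, which is the commenter's point about 120 fps storage being overkill.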
What's the problem? Monitors are just monitors (Score:3, Insightful)
Almost everything I've read notes that the deinterlacing hardware in most TVs flat-out sucks. My solution? I bought a Samsung DLP sans ATSC tuner. My TV is a display, nothing more. Had I been able to, I would have purchased it without the NTSC tuner as well. Buying the tuner separately affords me the opportunity to buy a better-quality piece of hardware without the redundancy of having purchased the same hardware in my monitor.
I'll deliver a quality 720p signal to my monitor, and it will display the picture. What's more to ask?
Same thought (Score:3, Insightful)
Three letters... (Score:2)
Re:misty (Score:4, Insightful)
Mfgrs usually tout their amps as having "200 watts of pulsing music power," which usually means 100 watts per channel peak. In reality it's more like 70.7 watts/channel RMS (assuming they're not still lying).
Re:misty (Score:2)
We all know that the 100 W at 0.08% THD into an 8-ohm load from 100-10,000 Hz is the far superior amp.
However, there are equivalent stats for projectors that aren't listed on the box, and maybe they should be. If I'm throwing down 5-10k, you had better bet I check which resolutions it handles at which frequencies.
Re:misty (Score:4, Funny)
You don't care much for bass, do you? And most people find it difficult to listen to ultrasound all day.
Re:misty (Score:3, Interesting)
Mfgrs usually tout their amps with having "200 watts of pulsing music power" which usually means 100 watts per channel peak.
I call this the Radio Shack method of describing the system, because it was originally used only by Radio Shack, but now is used by everyone -- the system power is described as the sum of all channels, not the value of one channel.
As such, I have a 420W sound system in my living room that, 15 years ago, would have been described as 70W. More likely, it would have been described
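The two marketing conventions described in these comments are easy to reproduce numerically. The figures below mirror the thread's own examples; the 1/sqrt(2) factor is the sine-wave conversion the "70.7 watts" comment is invoking:

```python
import math

# "200 watts of pulsing music power": sum of two channels' peak power.
peak_per_channel = 200 / 2          # 100 W peak per channel

# The comment's "70.7 W RMS" figure applies the sine-wave factor
# 1/sqrt(2) to the per-channel peak rating.
rms_per_channel = peak_per_channel / math.sqrt(2)

# "Radio Shack method": quote the sum across all channels, so a
# 6-channel receiver delivering 70 W/channel is sold as "420 W".
radio_shack_total = 6 * 70

print(round(rms_per_channel, 1), radio_shack_total)  # 70.7 420
```

Same hardware, three different headline numbers (100 W peak, 70.7 W RMS, 420 W total), which is exactly the spec-sheet game the parent posts are complaining about.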
Re:Not as Alarming as it Sounds (Score:3, Insightful)
If you receive an OTA broadcast or cable signal in 1080i then you don't have any control over the video source. Since the broadcasters are split between 720p and 1080i this is a real issue.