Just bought myself an HDTV.

LiquidD said:
strommsarnac, 480p is standard. Why do you think everyone says the Wii is not really true HD? Because 480p is just a crappy enhanced resolution. 480i is standard on all TVs; 480p is just a little enhancement over 480i, and really... I'm not sure what you're trying to say.

You say "Most CRT screens prior to 2004 will not support 480p at all. If they do, they down-convert to 480i which really degrades the image quality," and that doesn't make sense. Can you make a little more sense here...

And yeah, if you pay anything above 15 bucks for those component cables, you're dumb.

It appears a lot of people don't understand the difference between "i" and "p".

I is for Interlaced. This means the image is created by two passes over the screen. The first pass only paints the odd lines (1, 3, 5, 7, etc.). The second pass paints the even lines (2, 4, 6, 8, etc.).

P is for progressive. This means that ALL the lines are painted in a single pass (1,2,3,4,5,6,etc.).
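If it helps, here's a toy Python sketch of the two scan modes (the function names are just mine for illustration, not any real video API):

```python
# Toy model of interlaced vs. progressive scanning.
# A "frame" here is just the list of line numbers painted, in order.

def progressive_pass(height):
    """One pass paints every line in order: 1, 2, 3, ..."""
    return list(range(1, height + 1))

def interlaced_passes(height):
    """Two passes: first the odd lines (1, 3, 5, ...), then the even (2, 4, 6, ...)."""
    odd = list(range(1, height + 1, 2))
    even = list(range(2, height + 1, 2))
    return odd, even

odd, even = interlaced_passes(480)
full = progressive_pass(480)

print(len(odd), len(even), len(full))   # each field holds half the lines
print(sorted(odd + even) == full)       # the two fields together make one full frame
```

The point: at any instant an interlaced display has only painted half the lines, while a progressive display paints them all in one go.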

Now, 480i is an ANALOG signal, while 480p is DIGITAL. 480i is a 640x480 image; broadcast runs at 25 frames per second (PAL) while NTSC DVD/etc. run at 29.97 frames per second. 480p is 720x480 and supports up to 60 frames per second.

Now that everyone understands the difference between "i" and "p", you'll see the problem if your TV supports an input of 480p, but only displays 480i. You'll get half the image "per paint". Since the image is down-converted, it also tends to have color and motion problems like blurring.

This problem holds for ANY resolution downgrade. If you're going to display at 1600x1200, it's better to use a 2MP camera than to use an 8MP camera and resize the image down to 2MP.
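Here's a tiny 1-D Python illustration of why throwing away pixels loses detail (this is naive decimation; real scalers filter first, but the principle is the same):

```python
# A 16-pixel row alternating bright/dark -- the finest detail the row can hold.
src = [255, 0] * 8

# Naive 2:1 downsample: keep every other pixel.
half = src[::2]

print(half)   # all 255s -- the alternating detail is gone entirely
```

Once that detail has been discarded, no amount of processing downstream can bring it back.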

My comment on pre-2004 is based on that most CRT TVs which fall into that time frame were analog displays, not digital. This means they were interlacing the image.
 
strommsarnac, interlaced and progressive are independent of analog/digital. The only digital signal input for a television is HDMI. RCA, svideo and component cables are all analog cables.
 
gnahc79 said:
The Pioneer Kuro plasmas are 720p and they absolutely rock :). Don't pay too much attention to the spec numbers, especially resolution alone. Picture quality has lots of factors other than just 720 vs. 1080.


I am fully aware of that, having worked with and sold high-end home cinema and hi-fi equipment for many years. I was merely stating facts: 1080p is FULL HD; 720p is NOT FULL HD.

gnahc79 said:
strommsarnac, interlaced and progressive are independent of analog/digital. The only digital signal input for a television is HDMI. RCA, svideo and component cables are all analog cables.

Yes, interlaced and progressive are independent of analogue and digital in the signal transmission; both can be transmitted via analogue or digital means. However, analogue displays, which are cathode ray tubes (CRTs), ONLY work interlaced (I am aware of some prototypes with very complex ray-gun assemblies that could run progressive, but I'm not aware of one reaching the open market). So strommsarnac is correct when he says progressive is digital: the signal might not be, but the display equipment capable of displaying it correctly is.
 
Byuakuya said:
I exchanged the cables twice, and the result appears to be the same. My PS3 seems to look great, but I guess that would be a direct result of the HDMI cable.

Ah, so your PS3 looks great on the display, but not your Wii. How do your non-HD TV shows and DVDs look? Find a show with a lot of computer graphics (not live footage, as that'll be blurry), pause it, and see if it looks similar to your Wii. My guess is it will. You might also try swapping the devices between the TV's inputs: say DVD is comp1, cable is comp2, Wii is comp3. Move the Wii to comp1 and the DVD to comp3, then see if the images change for better or worse.

Your new HDTV has a native resolution of at least 1366x768. You said you're using the PS3 through HDMI. The PS3 is able to output 1920x1080p, the Wii 720x480p.

Here's what I'm thinking. The input chip on the TV is likely tuned more for the HDMI input than for the component input. This is very common, especially on less expensive models. ALL TVs favor one type of input over another.

Next, your 720x480 image is being stretched and interpolated to fit the native resolution of the display. This can make it look fuzzy and blocky. It happens on ALL LCD/plasma/DLP screens, but not CRT-based screens.
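You can see where the blockiness comes from with a little Python arithmetic (assuming a 1366-wide panel and nearest-neighbour scaling, which is a simplification of what real scaler chips do):

```python
from collections import Counter

src_w, dst_w = 720, 1366   # source width vs. assumed panel width

# For each panel column, which source column does it sample from?
mapping = [int(x * src_w / dst_w) for x in range(dst_w)]

# How many panel columns does each source column end up covering?
widths = Counter(mapping).values()
print(sorted(set(widths)))   # [1, 2]: some source pixels get doubled, some don't
```

Because 1366/720 isn't a whole number, some source pixels get doubled on screen and others don't, which is exactly that uneven, blocky look.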

I have a 6-tube 55" rear-projection HDTV and my neighbor has a DLP screen. He's still amazed that the Wii looks better on my $1000 US TV than it does on his $3000 DLP. This is because CRT tubes don't have a "native" resolution, only a maximum resolution. The tube itself won't make the image look worse; only the conversion chip will.

I hope this explains why the Wii doesn't look as good as you expected.
 
stumac1985 said:
I am fully aware of that, having worked with and sold high-end home cinema and hi-fi equipment for many years. I was merely stating facts: 1080p is FULL HD; 720p is NOT FULL HD.

If that was your only intention, that's cool :). It just seemed you were implying that anything on the market now that's 720p is somehow sub-standard:

As for the Wii being HD, etc., it's not; it supports enhanced definition but not high definition. TRUE HD is 1080p. 720p, which a lot of your cheap 'HD' TVs go to, is cut-back HD and nowhere near as good.

strommsarnac, stumac1985, and I are all sort of correct for the interlaced/progressive thing. To be clear for others, ...interlaced and progressive are both analog signals. However, only digital displays can run progressive. CRTs cannot do progressive.
 
As others are, I am curious what model number this standard CRT TV is that runs 480p, because I have never seen one. Not saying it doesn't exist, but I'd like to see it for myself.

I have a feeling, as others have stated, that you are convinced you are running 480p simply because your TV has component inputs. That simply doesn't make it 480p. Here is an example of a standard TV with component inputs that clearly states it's 480i: http://www.bestbuy.com/site/olspage.jsp?skuId=8225921&productCategoryId=abcat0101006&type=product&tab=2&id=1166840609756#productdetail

Now, I just picked my Wii up last Monday. I didn't get the component cables with it; it was a gift, and I wasn't about to send the girlfriend right back out to pick up component cables.

My TV is a 50" 1080p Sony.

The Wii looked... ehh, kinda crappy, to be honest. But I didn't want it for the graphics, and I was going to pick up component cables ASAP anyway, as I knew they would make SOME difference.

I was honestly surprised by the difference it made. Night and day for me. Yes, there are still jaggies (360 has them too, btw.) but the colors, contrast and sharpness are an amazing difference.
 
gnahc79 said:
strommsarnac, interlaced and progressive are independent of analog/digital. The only digital signal input for a television is HDMI. RCA, svideo and component cables are all analog cables.

Don't forget DVI-D. It's pure digital. Need proof? Take your statement that the only digital signal input for a TV is HDMI: I run my cable box out via HDMI, through an HDMI-DVI cable, and into my TV's DVI input.

There is NO difference between HDMI's video signal and DVI-D's signal. The only advantages HDMI has over DVI-D are that it bundles audio and that newer versions add more bandwidth.

As another pointed out, 480p is a digital signal, but over component it is changed to an analog signal. It's still at 60fps though.


Stumac1985: I know my Mitsu has digital tubes, and it retailed for $2400 three years ago. Also, most "professional" level PC CRTs are digital-tube. They're the large 20" or better 1600x1200 displays. I had a ViewSonic P815 that I paid $1700 for 11 years ago (it died last summer :( ), and it was a digital tube, not just digital input.

Good thread going. Hopefully not too confusing for people who don't have a knack for this stuff.
 
I forgot about DVI; it wasn't really adopted in the UK for domestic use. Regarding digital tubes, as I said, I knew such a thing was possible but didn't know of any for domestic use. And again, here in the UK such things wouldn't have been adopted very readily, as no one wanted/needed them unless they had high-end equipment, which is very uncommon here, and those people went plasma, rear projection or projection.
 
Xaegoth said:
My TV is a 50" 1080p Sony.
Yes, there are still jaggies (360 has them too, btw.) but the colors, contrast and sharpness are an amazing difference.

Ahh, you've found the flaw in the 360 over component :) The 360 over component only outputs a max of 1080i, not 1080p. Try switching it over to 720p and see if it looks better.

Personally, I can't tell much difference between 720p and 1080i. Usually I find 720p looks better since the image is smoother. However, I do notice the difference between both those and 1080p.
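For what it's worth, the raw numbers back up the "can't tell much difference" feeling. A quick Python comparison (assuming 60 frames/s for 720p and 60 half-height fields/s for 1080i):

```python
# Pixels delivered per second by each format.
p720 = 1280 * 720 * 60            # full frames, 60 per second
i1080 = 1920 * (1080 // 2) * 60   # half-height fields, 60 per second

print(p720)    # 55,296,000
print(i1080)   # 62,208,000
```

They're within about 12% of each other, so which looks better mostly comes down to how well the set deinterlaces and how much motion is on screen.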

But the 360 will only display 1080p via VGA or HDMI. If your TV has a VGA port, you can get a 360-VGA cable and use it for a better pic. Oh, that is IF your TV supports 1080p in via VGA. Not all will.
 
strommsarnac said:
Ahh, you've found the flaw in the 360 over component :) The 360 over component only outputs a max of 1080i, not 1080p. Try switching it over to 720p and see if it looks better.

Personally, I can't tell much difference between 720p and 1080i. Usually I find 720p looks better since the image is smoother. However, I do notice the difference between both those and 1080p.

But the 360 will only display 1080p via VGA or HDMI. If your TV has a VGA port, you can get a 360-VGA cable and use it for a better pic. Oh, that is IF your TV supports 1080p in via VGA. Not all will.

Yeah, unfortunately my 360 is an older one and doesn't have the HDMI output.

And yeah, have tried switching to 720p and noticed next to no difference between that and 1080i.
 
Xaegoth said:
Yeah, unfortunately my 360 is an older one and doesn't have the HDMI output.

And yeah, have tried switching to 720p and noticed next to no difference between that and 1080i.

Just to clarify, you switched the 360 over to 720p, not just the TV?

Also, don't know if it's sold where you're at, but there is now an HDMI cable for "non-hdmi" 360's. It plugs into the 360 where the normal video cable does (that trapezoid connector). I don't know anything else about it though.
 
yeah, I just remembered DVI after my last post :p. Btw,

Owen0501 said:
One thing to bear in mind is that some games that have been saved in non HD format will not be playable in HD format (Mario Olympics for example).

Can anyone else confirm this? I just tried it and can't recreate it: Mario & Sonic saved during 480p, switch to 480i, load the '480p' saved game --> successful. It doesn't really make sense anyway for saved-game data to depend on the display resolution.
 
strommsarnac said:
Just to clarify, you switched the 360 over to 720p, not just the TV?

Also, don't know if it's sold where you're at, but there is now an HDMI cable for "non-hdmi" 360's. It plugs into the 360 where the normal video cable does (that trapezoid connector). I don't know anything else about it though.

Yes, switched the 360 over.

And yeah, I saw the Mad Catz HDMI adapter, but for $80 I might as well just sell my current one and buy a new one that has it built in.

BTW, I switched it again on the 360 from 1080i to 720p and still didn't really notice TOO much difference. The picture looks a bit sharper, so I think I'll leave it there until I get an HDMI-capable 360.
 
Holy smokes, people are slinging misinformation all over the place!

Note: I'm in the US, so anything outside of the NTSC specification (signal defs, TVs, etc.) I'm not addressing.

1) "True" HD vs. not -> marketing spin. The HD specification allows for several different source signals; the two common broadcast specs are 720p (1280x720) and 1080i/p (1920x1080). Fox, for example, is 720; NBC is 1080. I'd say that at typical viewing distances and set sizes, 720 can easily be as good as 1080 on most people's subjective scale. With either spec, if you watch all the networks, you're going to be up/down-scaling at some point.

2) Yes, DVI and HDMI are both digital interfaces. All others (composite, component, S-Video) are analog.

3) Most broadcast NTSC-based signals are 720/704x480 @ ~30 FPS (no different from a DVD-based source). 480p24 is a 24 FPS source on the front end (24 FPS is film-based), and sets can do a decent job of detecting the 3/2 cadence, but I don't know how much @24 vs. @30 source is actually broadcast (it's sort of a favorite conspiracy topic...). I do know that most shows are shot on HDV (even those with a "film" look), so they're a p30 or p60 source (so I guess it's possible they're down-sampled to 24 for 480 broadcast). OK, that got a little too far into the technical minutiae.

4) Someone else explained I vs. P reasonably well, I won't reiterate.

5) Ideally, with a digital-based set (LCD, plasma, DLP) and a digital-based signal, you don't want to do a D->A->D conversion (that is, use an analog interconnect). That said, I've done some blind testing on an HD broadcast and HDM player via component, then HDMI, and it's hard to tell them apart (it's like the old tube vs. solid-state argument: different, but arguably better).

6) One thing to note about digital signals and digital interfaces: they provide a way to apply DRM (like HDCP over HDMI) or even encryption over QAM for your cable-based sources (which is a thorn in my side for a DIY HTPC, though now I'm on cable, so there are NO sat-based tuner cards...).

7) Non-CRT displays are FIXED resolution (also referred to as "optimal" or "native"). These sets (LCD, for example) display everything in a single spec (a 1080p LCD *only* displays 1080p). What people confuse is the source signal: YES, they can accept 480/720/1080. This is why on larger 1080 displays the Wii doesn't seem to improve (and can even get worse): the set has to upconvert the 480i/p into the native format, which introduces noise/artifacts. (Again, as explained, CRTs can handle different scan rates and display 480/720/1080 at the resolution of the source signal, which is why SD TV tends to look better on older CRT sets like our Toshiba 42" vs. newer stuff like our 50" Sony. Thankfully 90% of D* is now HD :D)

[edit]

8) Agreed, this is a good topic!
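The 3/2 cadence mentioned in point 3 is easy to sketch in Python (a toy model of telecine pulldown, not any real broadcast code):

```python
# 3:2 pulldown: spread 24 film frames across 60 interlaced video fields.
def pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2   # alternate 3 fields, then 2
        fields.extend([frame] * repeat)
    return fields

fields = pulldown(list(range(24)))   # one second of 24 FPS film
print(len(fields))                   # 60 -- one second of ~60-field NTSC video
```

Sets that detect this repeating 3-2 pattern can reassemble the original 24 FPS frames instead of deinterlacing blindly.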
 
And yeah, have tried switching to 720p and noticed next to no difference between that and 1080i.

I read that 1080i produces a sharper-looking still picture, but that it handles fast motion worse than 720p.

720p has fewer pixels per frame, but the whole picture updates at once (better for video games that have a lot of movement on the screen).
 