# 720p better than 1080i? Is this true?



## Sprkd1

http://lifehacker.com/5908969/why-you-should-watch-and-record-video-in-720p-instead-of-1080i


----------



## Pwntastic

Makes perfect sense to me. You're not thinking of 1080p, are you? Because 1080p is not the same as 1080i.


----------



## Chris13002

Progressive is the full image, compared to interlaced, which is every other line...
hence the 'i' vs 'p'.


----------



## Ovlazek

There is much more information in 720p compared to 1080i.


----------



## Dousand Thollars

Progressive > Interlaced.

But yes. Let's be clear that we are *NOT* talking about 1080p here.

Like the article says, you're getting 720 all the time against what is essentially 540 all the time.


----------



## psyside

1080i is a bit better than 480p, so 720p is much better than 1080i; 1080i ≈ 560p.


----------



## zarusprime

The article is a bit misleading.

While scenes with a lot of motion will look better in 720p, mostly-static scenes look better in 1080i.
1080i gives a sharper image than 720p; that is why most HD TV is broadcast in 1080i.
Channels like ESPN broadcast in 720p to better handle motion.


----------



## drbaltazar

Both are the full image, lol! The idea that interlaced isn't the full image is an urban legend!

Progressive scans one line after another, like a printer; interlaced does the odd lines, then the even lines. The second is more forgiving because the timing is a bit different. We're talking 1/30 of a second here!

1080i takes a bit more resources than 720p, but it gives you close to 1080p quality, and the average user won't mind; for most people 1080i is a very good compromise. 1080i is more work to adjust, though. Why? Because people are more busy convincing you to use 1080p than actually explaining how to maximize your settings for the best possible 1080i experience. The result is that most setups, in any format, have a setting wrong somewhere: sometimes on the monitor, the game, the TV, the OS, or the GPU. See all the variables here? We're not talking i vs. p, we're talking about the various settings that can affect both sides of the fence. Then add the streamer, and you can see why we get so many different kinds of artifacting!

Another thing that still very much affects any screen is response time. If you have a modern HDTV, the artifacting issue will be nil. I'm talking 23 inch here; at 40+ inches, yes, go for 1080p, but at 23 inches (the standard size for HD) 1080i is very good. Another advantage: an insane number of TV channels on the web are 1080i. And remember: this all happens faster than your eyes can blink! There are issues, don't get me wrong, but they come from bugs that were never polished. And contrary to popular belief, the vast majority can't afford 1080p web streams, and most don't want to be at 720p. 1080i is the best compromise, in my view!


----------



## ASUSfreak

1080p = all 1080 lines per refresh --> EVERY second all the lines are refreshed 50 times (Europe)/60 times (USA) --> resulting in lots of data to be calculated per second

1080i = 540 lines (half of 1080) per refresh --> field 1 = lines 1, 3, 5, ..., 1079 and field 2 = lines 2, 4, 6, ..., 1080. Field 3 = lines 1, 3, 5, ... again. ETC... --> Resulting in half the data to be calculated per second

That's why most TV stuff is 1080i and not 1080p (it's only half the bandwidth used)

PCs can do 1080p cause they only send from PC to TV/monitor and do not saturate the broadcast network

And to compensate for that "loss" they crank up the refresh rate of the TVs

It used to be 50Hz (Europe), then they raised it to 100Hz TVs, and now some (like Sony) use MotionFlow technology and boost it up to 200Hz.

Plasmas can go up to 600Hz...

(But don't think you'll be able to play at 100, 200 or 600 fps; that's not the same frequency.)

It's just how you look at it... If you do the math, one can say that 720p (an effective 720 lines per refresh) is better than 1080i (only 540 lines per refresh)

But because it switches so fast, it isn't noticeable to the eye (except fast-moving horizontal stuff perhaps...), so you'd better stay at 1080, i or p (p would be best of course)

Cause the higher the res on the same screen, the more pixels on that screen, thus a better image.

Here's an example (in Dutch) of "how" you can "look" at it (understand it)
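Separately, the "do the math" comparison above can be written out as a quick sketch (assuming US 60 Hz rates; the mode names are just labels):

```python
# Active pixels delivered per second for each common HD mode.
modes = {
    "1080p60": 1920 * 1080 * 60,  # full 1080-line frame, 60 times per second
    "1080i60": 1920 * 540 * 60,   # one 540-line field, 60 times per second
    "720p60": 1280 * 720 * 60,    # full 720-line frame, 60 times per second
}
for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
# 1080i60 carries exactly half the pixels of 1080p60, and a bit
# more than 720p60 (62.2 vs 55.3 Mpixels/s).
```

So by raw pixel rate 1080i actually edges out 720p; the "720p is better" argument rests on motion handling, not on the arithmetic.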


----------



## punker

I think it depends on the native resolution of the screen.

CRTs do interlaced better. My CRT HDTV does 1080i nicely (853x1080i) and supports 480i/480p/720p/960i (480p upscaled), while LCDs do progressive scan better.

Plasma handles both well, but still has burn-in issues and uses too much power, even on newer sets.


----------



## ramicio

No, we're talking 1/60 second here.

A frame of interlaced video has even fields and odd fields; both are encoded into one frame. If you are a moron and weave-deinterlace something, you will wind up with 30 combed frames per second. Blend deinterlacing is almost the same as weaving, but without the combing. Some people just discard half of the fields, which looks gross when there is motion.

The proper way to deinterlace is to bob. If you bob-deinterlace, the even field is picked, filled down one line, and displayed for 1/60 second; then the odd field is picked, filled up one line, and displayed for 1/60 second. You wind up with 60 progressive frames per second at half vertical detail, which the brain blends to look very close to full vertical detail, and motion is just as fluid as 720p60.

The advantage of interlacing is that because the two fields are compressed into one frame, they share information, which makes for nice compressibility. You can interlace a 1080p24 movie to 1080i60 and lose nothing.
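A minimal numpy sketch of the bob approach described above (toy arrays, plain line-doubling instead of interpolation; not a production deinterlacer):

```python
import numpy as np

def bob_deinterlace(frame):
    """Split one interlaced frame into two progressive frames.

    Each field keeps every other line of the source frame and is
    line-doubled back to full height, so a 1080i frame yields two
    1080-line pictures shown 1/60 s apart.
    """
    top = frame[0::2]     # even-numbered lines -> first field
    bottom = frame[1::2]  # odd-numbered lines  -> second field
    # Real players interpolate the missing lines; plain repetition
    # keeps the sketch simple.
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

# Toy 4-line "frame" whose rows are numbered 0..3
frame = np.arange(4).reshape(4, 1)
first, second = bob_deinterlace(frame)
# first is built from rows 0 and 2, second from rows 1 and 3,
# both restored to the full 4-line height.
```

Weave deinterlacing, by contrast, would stack both fields back into one frame, which is exactly what produces combing whenever the two fields were captured at different instants.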


----------



## drbaltazar

I use an LED at 1080i and it does it nicely! Is it more resource-effective, though? Now that is a very good question, because the whole point here is to use the minimum resources possible.

If I can have a 1-million-pixel data path that looks as good, or very close, to a 2-million-pixel one, then I am all for it! I think the conversion all happens in the screen, so the GPU sending 1 million is probably right, but I'm not sure here.

(I don't think there is redundancy going on between i and p in the screen, just a conversion.)

Thanks ramicio for the feedback! I'd love to have the real facts about interlacing instead of debating which is better, because in the end we all want the best possible experience for our chosen toy!

When you mention bob etc., isn't the new TV displaying the odd lines first, then the even lines after? So wouldn't deinterlacing be problematic on the new screens?


----------



## ASUSfreak

Watch the previous page cause I changed something in my post... but was too late


----------



## ramicio

Literal: 1080p > 1080i > 720p > 720i > 480p > 480i
Perceived quality versus bandwidth: 1080i > 1080p > 720i > 720p > 480i > 480p

Interlacing is nothing more than a trick. The same goes for compression. Of course, tricks can't ever give you the quality of the original, but perceived quality with enormous space savings is the game that will always be played. Chroma subsampling is a trick that is a hack, and it absolutely ruins any cartoons for me.


----------



## losttsol

Quote:


> Originally Posted by *ASUSfreak*
> 
> 1080p = 1080 lines per second --> EVERY second all the lines are refreshed --> resulting in lot's of data to be calculated per second
> 
> 1080i = 540 (half of 1080) lines per second --> second 1 = lines 1, 2, 3, ..., 539 and second 2 = lines 2, 4, 6, ..., 540. Second 3 = lines 1,3,5,... and second 4 = lines 2, 4, 6, ... again. ETC... --> Resulting in half the data to be calculated per second


You should state that the lines are refreshed multiple times a second. If it was only once per second, the video would look awful.


----------



## drbaltazar

ramicio, could you explain a bit more about chroma subsampling? The way you mention it, it seems like some might get better results by disabling it! How would an LED TV plugged into an AMD video card on a computer go about testing that? Thanks again for taking the time! I might need this info after all, lol! (sorry)


----------



## ASUSfreak

Quote:


> Originally Posted by *losttsol*
> 
> You should state that the lines are refreshed multiple times a second. If it was only once per second, the video would look awful.


Thx

I just did it at the top of my post (so people know from the start)
Quote:


> 1080p = all 1080 lines per refresh --> EVERY second all the lines are refreshed 50 times (Europe)/60 times (USA) --> resulting in lots of data to be calculated per second


----------



## ramicio

You don't disable it. It's how video is encoded. There is a large failure in the area of PCs, HDMI, and TVs. Sometimes it's the video card that's outputting only 4:2:2, and sometimes it's the TV subsampling the input. Most often it's the TV that does not support 4:4:4. This is fine for video, because 4:2:2 is still better than 4:2:0, which is how ALL video that consumers see is sampled. It simply makes colored text against black backgrounds look like crap, which is why using TVs for monitors is a giant fail.
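For a concrete picture of what 4:2:0 throws away, here's a generic sketch (simple block averaging; real codecs differ in filtering and sample siting):

```python
import numpy as np

def subsample_420(chroma):
    """Keep one chroma sample per 2x2 pixel block (4:2:0 style)."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Full-resolution chroma plane of a 4x4 image: 16 samples
cb = np.arange(16, dtype=float).reshape(4, 4)
cb_420 = subsample_420(cb)
# Only 4 samples survive -> a quarter of the chroma data, which is
# why sharp colored edges (like text on black) smear when displayed.
```

4:2:2 halves chroma horizontally only, so it sits between this and full 4:4:4 quality.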


----------



## mothrpe

720p will look better


----------



## ASUSfreak

Quote:


> Originally Posted by *ramicio*
> 
> You don't disable it. It's how video is encoded. There is a large failure in the area of PCs, HDMI, and TVs. Sometimes it's the video card that's outputting only 4:2:2, and sometimes it's the TV subsampling the input. Most often it's the TV that does not support 4:4:4. This is fine for video, because 4:2:2 is still better than 4:2:0, which is how ALL video that consumers see is sampled. It simply makes colored text against black backgrounds look like crap, which is why using TVs for monitors is a giant fail.


Also because TVs use HDMI, and that's 1920x1080, thus 16:9.

Monitors can use DVI, and that's 1920x1200, thus 16:10.

That could also give extra problems with how viewers see the images on screen...


----------



## ramicio

Quote:


> Originally Posted by *mothrpe*
> 
> 720p will look better


Nope.

Quote:


> Originally Posted by *ASUSfreak*
> 
> Also because TV's uses HDMI and thats 1920x1080 thus 16:9
> Monitor's can use DVI and that 1920x1200 thus 16:10
> That also could give extra problems on how the viewer are seeing the images on screen...


What?


----------



## skatingrocker17

I prefer 1080i. It just looks sharper to me. Plus, my A/V receiver does a pretty good job of de-interlacing it.


----------



## ramicio

Quote:


> Originally Posted by *ASUSfreak*
> 
> [image removed]


That's not how interlaced video is displayed. If you are in a video editing program and have no deinterlacing applied, then yes, it will look like that.


----------



## Chris13002

Quote:


> Originally Posted by *ramicio*
> 
> You don't disable it. It's how video is encoded. There is a large failure in the area of PCs, HDMI, and TVs. Sometimes it's the video card that's outputting only 4:2:2, and sometimes it's the TV subsampling the input. Most often it's the TV that does not support 4:4:4. This is fine for video, because 4:2:2 is still better than 4:2:0, which is how ALL video that consumers see is sampled. It simply makes colored text against black backgrounds look like crap, which is why using TVs for monitors is a giant fail.


Quote:


> Originally Posted by *ASUSfreak*
> 
> Also because TV's uses HDMI and thats 1920x1080 thus 16:9
> Monitor's can use DVI and that 1920x1200 thus 16:10
> That also could give extra problems on how the viewer are seeing the images on screen...


? Aspect ratio has nothing to do with this...

ramicio, I always wondered about this with PCs connected to most 'very high end' Samsung LED TVs through HDMI...

The text would be unreadable, and one way to make it better was to turn the sharpness all the way down...

I do have a 42" 1080p LCD TV that I use through VGA that still looks almost perfect (and this was one of the first 1080p TVs you could buy).


----------



## jtom320

It feels like 2006 all over again in this thread.


----------



## ramicio

What is happening is you're basically applying blurring to the chroma channel by turning down the sharpness. There's nothing to do for VGA: it is a pure RGB signal, so there is no chroma channel to subsample. Any imperfections in the picture are because it's an analog signal and there is some bleeding going on. Chroma subsampling mainly looks so bad because everything that renders video uses nearest-neighbor to size the chroma channel back up. You can use different resizing algorithms to somewhat band-aid the problem.
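The nearest-neighbor vs. smoother-resize point can be sketched in one dimension (illustrative values only, not any player's actual filter):

```python
import numpy as np

# A half-resolution chroma row containing one sharp color edge
chroma = np.array([0.0, 0.0, 1.0, 1.0])

# Nearest neighbor: each sample is just repeated, so the edge
# lands as hard 2-pixel blocks on screen.
nearest = np.repeat(chroma, 2)

# Linear interpolation: the edge is spread across neighboring
# pixels, which visually softens the chroma blocking.
positions = np.linspace(0, 3, 8)
linear = np.interp(positions, np.arange(4), chroma)
# nearest keeps the abrupt 0 -> 1 jump; linear ramps through
# intermediate values instead.
```

Neither recovers the discarded detail; the smoother resize just trades blocking for blur, which is the "band-aid" described above.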


----------



## ramicio

Quote:


> Originally Posted by *jtom320*
> 
> It feels like 2006 all over again in this thread.


Well, sadly, broadcast technology is still stuck in the stone age, and the only improvement has been a spotty move from MPEG-2 to h.264.


----------



## drbaltazar

Ouch! From all the info I've seen here, I agree: we're still stuck in the past! Because after this whole thread, nobody really knows! There are so many different manipulations, like chroma subsampling, etc., and he just mentioned that lowering sharpness to 0 (like I have been doing) renders chroma subsampling moot! So clearly, be it 1080p, 1080i, or any of the lower formats, aside from all the urban legends, I don't think we've gained one inch of real-world applicability from all the technology. Worse, GPU makers and TV makers keep adding new features to the pile, lol, so it blurs the line even more! And to make matters worse, every time we try to get info on how to set up 1080p or 1080i, someone comes along and yells "this is better than that!" We don't want to hear which is better than which! We want to hear: "I will use 480i, and here is how I adjust my TV, GPU, OS, etc."
Same for all the other sizes! Because clearly, I bet most of us using 1080p or 1080i have it adjusted wrong, and I bet what applies to a guy with a computer monitor won't apply to a lot of people plugged into a 23-inch HDTV in a bedroom! AAAARRRRRG! HEEEELLPP!


----------



## ramicio

Chroma subsampling is not the past. It will probably be here forever. Their solution is to just keep upping the resolutions and giving us bigger screens. The industry is also disregarding refresh rate, which also contributes to perceived resolution.


----------



## jtom320

Quote:


> Originally Posted by *ramicio*
> 
> Well, sadly broadcast technology is still stuck in the stone age and the only improvement has been spotty changes to h.264 over MPEG-2.


I wasn't so much commenting on the tech as much as saying I remember this question being asked all the time in 05/06. Don't know much about broadcasting to be truthful.


----------



## Quantum Reality

Quote:


> Originally Posted by *ramicio*
> 
> Well, sadly broadcast technology is still stuck in the stone age and the only improvement has been spotty changes to h.264 over MPEG-2.


And honestly I still see a bit of blockiness even in stuff that should be DVD or Blu-ray quality; looks like video card algorithms also need some work.


----------



## ramicio

Quote:


> Originally Posted by *Quantum Reality*
> 
> And honestly I still see a bit of blockiness even in stuff that should be DVD or Blu-ray quality; looks like video card algorithms also need some work.


What video card algorithms? Macroblocking occurs because the bit rate is too low and poorly distributed. Some people are fine with stuffing a movie down to 700 MB for a DVD rip or DVD5 for an HD rip. They just want the movie NOW! These are the same people who listen to all of their music on YouTube, and don't notice the crap sound. You also have the fact that they are re-compressing already-compressed material.


----------



## Quantum Reality

Quote:


> Originally Posted by *ramicio*
> 
> What video card algorithms? Macroblocking occurs because the bit rate is too low and poorly distributed. Some people are fine with stuffing a movie down to 700 MB for a DVD rip or DVD5 for an HD rip. They just want the movie NOW! These are the same people who listen to all of their music on YouTube, and don't notice the crap sound. You also have the fact that they are re-compressing already-compressed material.


I have no concrete proof since this is subjective but I could swear that my ATI 6950 was a bit "blockier" than my GTX 460 for the exact same video.


----------



## SinX7

That is true.

1080p
720p
1080i
480p

check this video out.
http://www.youtube.com/watch?v=Avvh0iH2xSg


----------



## drbaltazar

Is AMD better than NVIDIA for this? Also, the other poster is right: using the web to judge this probably isn't the best idea, because of the various compression going on! Doing it at home isn't easy either, because some content is in film format, some in video, some in h.264, etc. God, what a mess! And like another poster said, GPUs probably have a lot of bugs and issues that aren't compatible, or will look very bad, with today's new screens!


----------



## ramicio

Quote:


> Originally Posted by *Quantum Reality*
> 
> I have no concrete proof since this is subjective but I could swear that my ATI 6950 was a bit "blockier" than my GTX 460 for the exact same video.


Then one or both of the cards are doing some sort of processing to the video.


----------



## ramicio

Quote:


> Originally Posted by *SinX7*
> 
> That is true.


It is not true. The idiot in the video has zero understanding about how interlaced videos are actually deinterlaced and played back.


----------



## Quantum Reality

Quote:


> Originally Posted by *ramicio*
> 
> Then one or both of the cards are doing some sort of processing to the video.


That's.... what I meant by algorithms.


----------



## ramicio

They should be disabled.


----------



## punker

Quote:


> Originally Posted by *ramicio*
> 
> What video card algorithms? Macroblocking occurs because the bit rate is too low and poorly distributed. Some people are fine with stuffing a movie down to 700 MB for a DVD rip or DVD5 for an HD rip. They just want the movie NOW! These are the same people who listen to all of their music on YouTube, and don't notice the crap sound. You also have the fact that they are re-compressing already-compressed material.


Just wondering, have you ever watched a VHS tape *without* it being processed by a digital-to-analog converter, using an HDMI digital signal output?

Using HDMI for video (with S/PDIF to a sound system) results in much better video & audio quality.

No mosquito noise.


----------



## ASUSfreak

Sorry for the late reply, but what I meant with "also" was: in addition (yeah, I speak Dutch, not English... could not find the correct term)

But anyway


----------



## ramicio

Quote:


> Originally Posted by *punker*
> 
> Just wondering have you ever watched a VHS tape *without* it being Processed by a digital to Analog converter ?
> Using a HDMI digital signal output?
> Using a HDMI for video (SPDIF to a sound system)
> Results in much better video & audio quality.
> No mosquito noise


What? VHS was analog. What would a DAC do to an analog signal? Digital is digital. I don't see what HDMI has to do with mosquito noise. I've never had video with mosquito noise because I don't use crappy sources. I don't apply processing to anything because I always start with an acceptable source. I don't need to apply filters to video because I don't watch 700 MB Xvid rips. I don't apply any processing to my music because I don't listen to lossy crap.


----------



## Conspiracy

Progressive is better than interlaced, no further argument needed on that, lol. And unless you are projecting on a huge movie screen and need the best quality possible, the difference between 720 and 1080 is not enough to ruin the quality of a video. If you need anything bigger, you'd be shooting in 2K+ anyway, which no normal person is doing or needing.


----------



## ramicio

Progressive at a lower resolution is not better than interlaced at a higher resolution. 2K is basically like shooting at 1080p. 4K and up is where it's at, even for scanning film, and even for downsampling to 1080p.


----------



## drbaltazar

Yep, 1080 looks better than 720, be it i or p! But we weren't embarking on that slippery slope; we were discussing how to set this and that up, because a lot of the info out there is clearly too old!

http://www.anyhere.com/gward/hdrenc/hdr_encodings.html I found this page, but I'm not sure if it's meant for computer screens, TV screens, or just plain photography! They seem to say MS is pushing the wrong standard in terms of quality vs. the amount of space per bit it takes?


----------



## ramicio

HDR has nothing to do with what we're talking about here. It started in photography but is creeping into videography now. It really makes stuff look quite trippy, and sometimes Kodachrome-ish.


----------



## Conspiracy

Quote:


> Originally Posted by *ramicio*
> 
> Progressive of a lower resolution is not better than interlaced of a higher resolution. 2K is basically like shooting at 1080p. 4k and up is where it's at, even for scanning film, and even for downsampling to 1080p.


I didn't realize this thread was about film, my bad... the OP was about video, and I didn't see anything about professional use of video or film either.

Last I checked, no normal person was shooting 4K and up.


----------



## ramicio

It's not about film, it's about digital video. A movie on Blu-ray is video. A TV show is video.


----------



## drbaltazar

On a side note: as much as I love 1080i, for mobile I would say go 1280x720 (720p), because on mobile that setting makes more sense resource-wise:

1080i = 1,036,800 pixels per field (1920 x 540)
720p = 921,600 pixels per frame (1280 x 720)

so it would be better for mobile screens of those sizes.

If you were to get a 23-inch, though, 1080i would probably be better!


----------



## Conspiracy

Quote:


> Originally Posted by *ramicio*
> 
> It's not about film, it's about digital video. A movie on Blu-ray is video. A TV show is video.


Yes, and I don't see your point, Captain Obvious.


----------



## 420Assassin

Here's one reason alone why p is better than i... scroll down to "Problems caused by interlacing":
http://en.wikipedia.org/wiki/Interlaced_video

That doesn't happen with progressive.


----------



## ramicio

That is once again a problem caused SOLELY by bad deinterlacing, and obviously written by someone who doesn't understand how interlacing and deinterlacing work.


----------



## Conspiracy

Resolution isn't everything, and it seriously comes down to how you are viewing the video whether a slightly larger image matters. I would still take a clean progressive image over interlaced. It also depends on what you are watching; I would not want to watch a football game or sports event in 1080i. If you are viewing video on a 20" screen, the difference between 1080i and 720p resolution is not that important (the progressive video will look a lot nicer, though), but I would probably want a 1080 image on a 70" screen.


----------



## ramicio

What is wrong with motion and interlacing? There is nothing 720p60 can achieve that 1080i60 can't. It will achieve more.


----------



## Conspiracy

Absolutely nothing. I shoot in 1080i60 with the XDCAM on the travel show I'm a camera assistant on, and it shoots great for a $30k camera.

I still prefer progressive video when I have the option, unless told otherwise. That's just me. Didn't realize there was no such thing as personal preference anymore.


----------



## drbaltazar

1280x720p is perfect for mobile; that being said, it doesn't have a place on a regular PC. Everybody should be at 1920x1080 on 23 inches, be it i or p. I favor i because it's pretty much the same thing as p at that size. If I had a 40-inch+ HDTV I would probably favor p; we'll see when I get there, lol.


----------



## Adonis

720p is better for smaller screens imo, like tablets.


----------



## estudiaa

As an engineer working on video technologies, I have to say that INTERLACED SCAN IS BETTER than progressive.
The problem is that some devices are not able to display interlaced video.
Interlaced scan uses half the bandwidth of progressive.
Ramicio seems to be the only person here qualified to talk about this subject.
The rest of the people here seem to confuse "progressive" with "progress".


----------



## ramicio

It depends what you care about: quality or bandwidth. Interlacing is just another sleight of hand to reduce bandwidth requirements, and it sacrifices quality. It's especially a problem with 4:2:0 video, and we can safely assume that's what applies in any discussion of this topic: you're halving the vertical resolution of an already-halved chroma channel. At a given resolution, progressive beats interlaced. Some people are borderline brain-dead and will just not notice these artifacts; they're the same people who think their Apple earbuds sound awesome with their lossy files. With 4:4:4 we can start talking, but by the time we get to that point we won't be worrying about bandwidth, and I don't see 4:4:4 coming to consumers even in the next decade. We need movies to be 1080p120 with an RGB palette (the banding is horrible).


----------

