# My Element 40" TV screen resolution is too big?



## lithgroth007

This is most likely due to the HDMI settings on either your TV or your computer. Try adjusting the resolution on your PC to something manageable and then changing the TV's settings.


----------



## Nocturin

Turn off scaling in your GPU's control panel; that should work.

Also check for a "game mode" or something similar that turns off the TV's scaling. Either one should work.

When you connect a Blu-ray player or another 1080p device, I'm assuming there are no issues?


----------



## egerds

Use PC or Game mode in the HDMI picture settings on the TV, and/or pick "unscaled".
Turn off any scaling on the PC.


----------



## skatingrocker17

Make sure the resolution is 1920x1080. If you're using an nVidia card you can make the adjustments in the control panel, but an even better option would be changing the aspect ratio on your HDTV from 16:9 to an option that does NO overscanning, so you get 1:1 pixel mapping. It seems kind of strange that you have this problem over HDMI, so hopefully your TV has an option for turning overscan off. If not, adjust the picture manually.


----------



## Narwhal_Revenge

I had to manually resize it myself :/


----------



## Nocturin

I had to do this with my old nVidia card. I think it's an issue with the older cards, because I don't hear about it today.


----------



## skatingrocker17

Quote:


> Originally Posted by *Nocturin*
> 
> I had to do this with my old nVidia card. I think it's an issue with the older cards, because I don't hear about it today.


I'm pretty sure it's the way the TV scales the picture. For example, if you had a 1080p TV and a 720p source displayed pixel for pixel, it would only fill 1280x720 pixels and leave a big border around the whole picture, so the TV has to scale the picture to fit the screen. However, if it's a 1080p source on a 1080p display, then it SHOULD just display pixel for pixel, since the source matches the resolution of the display. The TV is likely overscanning the image. If the graphics card puts out a 1080p digital signal via HDMI, DisplayPort or DVI, then it's still a 1080p signal no matter what card it is. It's his TV that's scaling it off the screen, which is commonly done just for SDTV signals to cover up closed-caption data.
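The scaling described above is easy to put in numbers. A quick sketch, using a ~5% overscan amount purely for illustration (actual overscan varies by TV):

```python
# Scaling arithmetic for the cases described above (illustrative numbers).

def scale_factor(src, dst):
    """Per-axis factor needed to stretch a source resolution onto a display."""
    return dst[0] / src[0], dst[1] / src[1]

# 720p source on a 1080p panel: the TV must scale by 1.5x on each axis.
print(scale_factor((1280, 720), (1920, 1080)))   # (1.5, 1.5)

# 1080p source on a 1080p panel: 1.0x, i.e. pixel for pixel -- no scaling needed.
print(scale_factor((1920, 1080), (1920, 1080)))  # (1.0, 1.0)

# With ~5% overscan (a hypothetical but common figure), the TV zooms the
# picture so the outer edge is pushed off the panel:
overscan = 0.05
visible = (round(1920 * (1 - overscan)), round(1080 * (1 - overscan)))
print(visible)  # (1824, 1026) -- the desktop edges fall off the screen
```

That last case is exactly the symptom in this thread: the signal is a perfectly normal 1080p, but the TV zooms it and the taskbar and window edges disappear past the bezel.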


----------



## Nocturin

Quote:


> Originally Posted by *skatingrocker17*
> 
> I'm pretty sure it's the way the TV scales the picture. For example, if you had a 1080p TV and a 720p source displayed pixel for pixel, it would only fill 1280x720 pixels and leave a big border around the whole picture, so the TV has to scale the picture to fit the screen. However, if it's a 1080p source on a 1080p display, then it SHOULD just display pixel for pixel, since the source matches the resolution of the display. The TV is likely overscanning the image. If the graphics card puts out a 1080p digital signal via HDMI, DisplayPort or DVI, then it's still a 1080p signal no matter what card it is. It's his TV that's scaling it off the screen, which is commonly done just for SDTV signals to cover up closed-caption data.


Both the TV and the PC have overscan/underscan capabilities; the problem is finding out which one is haywire. My experience has shown that GPUs are normally the culprit in these situations, and PCs don't always read the EDID correctly. The PC is easier to control, depending on the TV's menu options, so it's nice to knock out variables during troubleshooting.
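For what it's worth, the EDID mentioned here is just a small block of bytes the PC reads over the cable to learn the panel's native mode. A rough sketch of how the resolution is packed into one of its detailed timing descriptors, per the standard EDID layout (the byte values below are synthetic, built for a 1920x1080 mode):

```python
# Sketch: decode the native resolution from an EDID detailed timing
# descriptor (DTD) -- the 18-byte block a GPU reads to learn the panel's mode.

def parse_dtd_resolution(dtd: bytes):
    """Return (h_active, v_active) from an 18-byte EDID DTD."""
    h = dtd[2] | ((dtd[4] & 0xF0) << 4)  # low 8 bits + upper 4 bits
    v = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h, v

# Synthetic DTD for a 1920x1080 mode (only the bytes used above are
# meaningful here; the trailing timing/sync bytes are zeroed for brevity).
dtd = bytes([0x02, 0x3A,         # pixel clock, 148.5 MHz in 10 kHz units (LE)
             0x80, 0x18, 0x71,   # h active 1920, h blanking 280
             0x38, 0x2D, 0x40,   # v active 1080, v blanking 45
             0, 0, 0, 0, 0, 0, 0, 0, 0, 0])

print(parse_dtd_resolution(dtd))  # (1920, 1080)
```

If the GPU misreads or ignores this block, it can pick a wrong mode or apply scaling it shouldn't, which matches the symptom Nocturin describes.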


----------

