# Video Decoding on GPU



## MohammadH4s4ni

Hello,
I have a 1050 Ti, and when I play a video my GPU does the video decoding, but my friend has a GTX 1650, and when he plays a video he gets CPU usage instead of GPU video-decode usage! Does anybody know how I can put video decoding on the GPU? We've enabled hardware acceleration on the 1650 but it still doesn't work, yet hardware acceleration is off on my 1050 and it works!


----------



## Asmodian

It depends on what you use to play it. The video decoder should have its own configuration.

Where are you changing hardware acceleration settings? What are you using to play videos?


----------



## MohammadH4s4ni

I'm changing hardware acceleration under Graphics settings->Hardware-accelerated GPU scheduling. I'm using VLC to play the video. Actually, I'm developing software that reads a camera stream and displays it. When I run the program on my PC it does the video decoding on the GPU, but when I run it on my colleague's PC it doesn't use GPU video decoding. Yet hardware acceleration is "off" on my PC and "on" on his!
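For what it's worth, one way to isolate whether the GPU itself can hardware-decode the stream (independent of any particular player) is to feed it through FFmpeg with a forced hardware decoder and watch the "Video Decode" graph in Task Manager. This is just a sketch: it assumes `ffmpeg` is installed, and the RTSP URL is a placeholder for the actual camera stream.

```shell
# Decode the camera stream with the D3D11VA hardware decoder and
# discard the output; the GPU's video engine should show activity
# in Task Manager while this runs.
# rtsp://camera.local/stream is a placeholder URL.
ffmpeg -hwaccel d3d11va -i rtsp://camera.local/stream -f null -
```

If this uses the GPU on both machines but your own program only does so on one, the difference is in how each application opens its decoder, not in the system settings.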


----------



## Asmodian

That setting has nothing to do with what VLC does. Is your colleague using VLC? Do they have the same GPU?

Check the options in VLC.
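For example, the relevant setting lives under Tools -> Preferences -> Input / Codecs -> "Hardware-accelerated decoding". It can also be forced from the command line with VLC's `--avcodec-hw` option (the filename below is a placeholder):

```shell
# Force VLC to use the D3D11VA hardware decoder on Windows.
# Other accepted values include "dxva2", "any", and "none".
vlc --avcodec-hw=d3d11va video.mp4
```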


----------



## MohammadH4s4ni

I've tested it with VLC on my colleague's PC and it decodes on the GPU, but other players use the CPU. Is there any configuration that enables this feature system-wide?


----------



## Asmodian

No. The player can do whatever it wants; it isn't something that can be controlled system-wide.

That said, if you install LAV Filters, most Windows DirectShow-based players will use it. If you then configure LAV Video to use hardware decoding, all those players will end up using hardware decoding too.

Most non-DirectShow players will have their own option for hardware decoding (as VLC does).


----------



## MohammadH4s4ni

OK, thank you for your help 😊


----------

