# [Guru3D] Deus Ex: Mankind Divided Gets DX12 and a Benchmark says AMD (Updated)



## YellowBlackGod

Quote:


> Huddy confirmed saying that Deus Ex: Mankind Divided would indeed launch with DX12 support. Mankind Divided will also include AMD's own TressFX Hair 3.0 technology alongside DX12. Huddy confirmed that Mankind Divided will include a built-in benchmark, too.
> 
> Update: current release date is 23.02.16


Source

*Update:* Now here is some more interesting news. The new Deus Ex game will fully support AMD technologies alongside DX12. A couple more of these and everybody will see Radeon GPUs in a different light.
Quote:


> Originally Posted by *BIGTom*
> 
> Earlier today, GameGPU has posted performance Benchmarks for Deus Ex: Mankind Divided
> 
> GameGPU Deus Ex: Mankind Divided GPU test
> 
> DIRECTX 11
> 
> DIRECTX 12


Quote:


> Originally Posted by *Glottis*
> 
> There seems to be massive performance difference between Ultra and High presets but hardly any picture quality difference.


Quote:


> Originally Posted by *Robenger*
> 
> DX11 CPU
> 
> DX12 CPU
> 
> CPU Usage


----------



## lacrossewacker

Launching with TressFX 3.0 is nice.


----------



## Valor958

So, here is the roadmap for a lot of gamers' next 2 years... Play Fallout 4 and all the associated mods/DLC for the next year. Once exhausted, they'll dust themselves off and start on the same for Deus Ex: Mankind Divided.

NOW I have a reason to upgrade my video card at least... and probably the rest, so I'm DX12 ready. Given a few months of refining, I may even go Win10 to be ready for what's next.


----------



## thegreatsquare

Quote:


> Originally Posted by *Valor958*
> 
> So, here is the roadmap for a lot of gamers' next 2 years... Play Fallout 4 and all the associated mods/DLC for the next year. Once exhausted, they'll dust themselves off and start on the same for Deus Ex: Mankind Divided.
> 
> NOW I have a reason to upgrade my video card at least... and probably the rest, so I'm DX12 ready. Given a few months of refining, I may even go Win10 to be ready for what's next.


That's the plan, though DE is gonna share billing with JC3, MGS:V, and a few other summer sale items.


----------



## ZealotKi11er

It seems TressFX does not get the same hate. I wonder why.


----------



## sugarhell

Finally a good game on the horizon.


----------



## Noufel

I enjoyed playing Human Revolution and I'm sure to enjoy Mankind Divided.


----------



## DIYDeath

I will laugh so hard if Nvidia bombs in this benchmark too, tries to get the dev to not include features so it looks better in the benchmark, gets exposed, and then backtracks trying to control the negative PR.


----------



## Dudewitbow

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It seems TressFX does not get the same hate. I wonder why.


Because it was only used in one game. And despite an ugly start (due to Nvidia not getting the source code earlier), the FPS hit after that issue was resolved was minimal relative to the competition, so any user could still enjoy it. We as users never got to see the TressFX 2.0 version, and we're waiting on this to see 3.0. As long as AMD keeps TressFX at a better graphics/performance-loss ratio for all users (AMD, Nvidia, and Intel, plus the <0.01% who may still have something obscure like a Matrox), users will be more content with it relative to the GameWorks implementation.

edit: Rise of the Tomb Raider, IIRC, will get it too


----------



## Rob27shred

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It seems TressFX does not get the same hate. I wonder why.


Lol, because AMD does not lock out team green through driver coding, so TressFX runs just as well on a red or green GPU.


----------



## SystemTech

Ohh, now this is something to get excited about DX12 for.

Next thing on my procurement list is a 4K screen and another 390, then I'm sorted.

I do hope AMD beats Nvidia again, for one simple reason: competition benefits only the consumer. A lack of competition only benefits the producer/supplier, because they can charge whatever they want.

I don't want to have to pay $1000 for a high-end card in a few years (980 Ti equivalent), but you can still to some degree justify sub-$700 for one. Think if something like a 970/380 were priced in the $700 region.

Competition is good...


----------



## YellowBlackGod

Quote:


> Originally Posted by *SystemTech*
> 
> Ohh, now this is something to get excited about DX12 for.
> 
> Next thing on my procurement list is a 4K screen and another 390, then I'm sorted.
> 
> I do hope AMD beats Nvidia again, for one simple reason: competition benefits only the consumer. A lack of competition only benefits the producer/supplier, because they can charge whatever they want.
> 
> I don't want to have to pay $1000 for a high-end card in a few years (980 Ti equivalent), but you can still to some degree justify sub-$700 for one. Think if something like a 970/380 were priced in the $700 region.
> 
> Competition is good...


You do have an equivalent high-end card: the R9 390. DirectX 12 says so.


----------



## umeng2002

It wouldn't take a "bomb," just a mid-range AMD card performing as well as a high-end Nvidia card in DX12 mode.


----------



## geoxile

It'll be nice to see TressFX 3.0 in action.


----------



## BeerPowered

Refreshing to see AMD get some Gaming Evolved games after all these GameWorks failures. Maybe they will do that Never Settle deal again, where I buy an AMD card and get 3 free games. I got Tomb Raider, Crysis 3, and Sleeping Dogs last time.


----------



## sugarhell

I think I prefer Gaming Evolved over TWIMTBP. At least we got some good features.


----------



## FLCLimax

did i get in before this thread goes down the crapper?


----------



## Kuivamaa

This will be done on the Dawn engine, which is largely based on Glacier 2 (Hitman: Absolution), so I expect it to be quite advanced and rather heavy on the GPU.


----------



## Exeed Orbit

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It seems TressFX does not get the same hate. I wonder why.


Isn't TressFX open source?


----------



## nakano2k1

Quote:


> Originally Posted by *Exeed Orbit*
> 
> Isn't TressFX open source?


Source: Wikipedia
Quote:


> AMD TressFX is open-source software developed by AMD which provides advanced GPU rendering of hair, fur, and grass in video games. The competing solution offered by Nvidia is HairWorks, which is part of their GameWorks suite, and is proprietary in nature.


Sure seems like it... What does Nvidia make that ISN'T proprietary?


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It seems TressFX does not get the same hate. I wonder why.


More open, better performing (even on NVIDIA hardware), and similar effect.


----------



## Noufel

TressFX is just faaaaar better than HairWorks, and it isn't even proprietary.


----------



## FLCLimax

TressFX has better performance and doesn't cripple the competition's (nor their own) hardware unnecessarily.


----------



## Remij

Wow, I disagree completely. In my opinion HairWorks looks WAY better than TressFX. Even watching the Rise of the Tomb Raider demo you can see that TressFX is too floaty/unnatural. HairWorks looks and performs great.


----------



## Woundingchaney

It's hard to say about HairWorks or TressFX. Honestly, given that TressFX wasn't utilized much, I feel like it's not much of a comparison. HairWorks has been in the market and used for quite some time.

But anyways, I hope this title is better than the last one. For various reasons I simply couldn't finish the game; my biggest issue was with the plot line.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Remij*
> 
> Wow, I disagree completely. In my opinion Hairworks looks WAY better than TressFX. Even watching the Rise of the Tomb Raider demo you can see that TressFX is too floaty/unnatural. Hairworks looks and performs great


Easy. HW looks worse in Witcher 3 than stock hair. TressFX looks way better in TR than stock hair.


----------



## Assirra

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Easy. HW looks worse in Witcher 3 than stock hair. TressFX looks way better in TR than stock hair.


While that is true for Geralt, HW looks absolutely amazing on animals/monsters.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Assirra*
> 
> While that is true for Geralt, HW looks absolutely amazing on animals/monsters.


Played 100 hours. Maybe 20 minutes of that was killing monsters that had HairWorks. Also, we are talking about people's hair.


----------



## Remij

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Easy. HW looks worse in Witcher 3 than stock hair. TressFX looks way better in TR than stock hair.


If that's the case, it's only because Lara's stock hair in TR is god-awful...

Just load up the latest tech demos of both and try them out. HW is more impressive to me.

I prefer either to nothing at all, though... so...


----------



## YellowBlackGod

Quote:


> Originally Posted by *Assirra*
> 
> While that is true for Geralt, HW looks absolutely amazing on animals/monsters.


The performance cost of HW in The Witcher 3 is simply too high. Not worth it.


----------



## sugarhell

HairWorks looks kinda good, but omg, the resources it needs are way too much.


----------



## kael13

Well, I hope Mankind Divided doesn't turn out to be rubbish. They still haven't walked back their pre-order scheme, though.


----------



## sumitlian

Quote:


> Originally Posted by *Blameless*
> 
> More open, better performing (even on NVIDIA hardware), and similar effect.


Blameless comment....hahaha


----------



## Rob27shred

IMO, HairWorks does look really good on the monsters & animals but does not do much for the look of Geralt. I actually prefer Geralt's look without HW. I have to agree with most on this thread though: HW is just way too much of a resource hog, even on Nvidia cards. To me that says a lot, when a company can't even get its proprietary feature to work well on its own products. Then to have it cripple the competitor's performance on top of that!? It is one of the bigger reasons I decided to go AMD for the GPU in my new rig.


----------



## lacrossewacker

Quote:


> Originally Posted by *Rob27shred*
> 
> IMO, HairWorks does look really good on the monsters & animals but does not do much for the look of Geralt. I actually prefer Geralt's look without HW. I have to agree with most on this thread though: *HW is just way too much of a resource hog, even on Nvidia cards. To me that says a lot, when a company can't even get its proprietary feature to work well on its own products. Then to have it cripple the competitor's performance on top of that!?* It is one of the bigger reasons I decided to go AMD for the GPU in my new rig.


So when HW takes a dump on Nvidia hardware, it's incompetence, but when HW takes a dump on AMD hardware, it's malicious.

Okay, got it.


----------



## Rob27shred

Quote:


> Originally Posted by *lacrossewacker*
> 
> So when HW takes a dump on Nvidia hardware, it's incompetence, but when HW takes a dump on AMD hardware, it's malicious.
> 
> Okay, got it.


Obviously you don't get it; I never said it was incompetence or maliciousness by Nvidia at all. My point was that Nvidia is hurting PC gaming overall with its business tactics & by keeping GWs closed source. Also, just to be clear, I do not feel that Nvidia is doing it maliciously or making HW or GWs a resource hog on purpose; they are in it to make money, & closed source definitely ups the profit margins. However, I do feel that if they cannot get their GWs features to work decently on most higher-end GPUs, including their own, those features should not be included in games just yet. Also maybe, just maybe, if they made the GWs library open source we could get better-optimized versions of the features...

Sorry to rain on your fanboy parade, but I'm just a gamer looking to get the best performance I can out of my hardware & games. It makes no difference to me whether it is an AMD or Nvidia GPU I'm running or which one the game I'm playing is optimized for. Just calling it like I see it, & IMO GWs is something that hurts the overall performance of games on PC across different build types.


----------



## rickcooperjr

Quote:


> Originally Posted by *Rob27shred*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lacrossewacker*
> 
> So when HW takes a dump on Nvidia hardware, it's incompetence, but when HW takes a dump on AMD hardware, it's malicious.
> 
> Okay, got it.
> 
> 
> 
> Obviously you don't get it; I never said it was incompetence or maliciousness by Nvidia at all. My point was that Nvidia is hurting PC gaming overall with its business tactics & by keeping GWs closed source. Also, just to be clear, I do not feel that Nvidia is doing it maliciously or making HW or GWs a resource hog on purpose; they are in it to make money, & closed source definitely ups the profit margins. However, I do feel that if they cannot get their GWs features to work decently on most higher-end GPUs, including their own, those features should not be included in games just yet. Also maybe, just maybe, if they made the GWs library open source we could get better-optimized versions of the features...
> 
> Sorry to rain on your fanboy parade, but I'm just a gamer looking to get the best performance I can out of my hardware & games. It makes no difference to me whether it is an AMD or Nvidia GPU I'm running or which one the game I'm playing is optimized for. Just calling it like I see it, & IMO GWs is something that hurts the overall performance of games on PC across different build types.

I have to say that is the most dead-on take on how I look at it. Nvidia GameWorks hurts everyone but owners of the latest and greatest Nvidia cards; previous-gen Nvidia cards took such a massive hit it isn't even funny, and AMD got thrown under the bus and then steamrolled. Neither AMD nor the game devs could do anything about it, because the code was proprietary; not even the devs had access to the source. GameWorks is a software suite the devs are handed by Nvidia, and if Nvidia didn't optimize it or put the effort in, everyone but new Nvidia card holders gets the shaft big time.

The automotive equivalent would be for the USA to drop gasoline altogether and only offer E85, because over 70% of vehicles on the road wouldn't be able to run it; it would literally kill engines not designed for it. E85 is 85% ethanol / 15% gasoline; most engines won't run on it, or it would destroy their fuel systems, or worse, damage the engines internally due to lack of lubrication. A lot of engines need that lubrication and carbon to keep valves seated and lubed and to protect fuel-system components; fine carbon makes a great lubricant, hence graphite and the like.

I want to point out that E85 is some nasty stuff: it has a high pH, draws moisture out of the air, is generally corrosive, and has almost zero lubricity. Water and high pH are a big no-no; add literally no lubrication and chaos is the long-term outcome. E85 has also been argued to be almost as bad for the environment as regular gasoline, if not worse, because the manufacturing process releases massive amounts of greenhouse gases, and the exhaust out of an engine is high-pH as well, essentially giving birth to corrosive gas in the environment it is run in.

I will say a study found that using E85 put only a few percent less carbon into the air when burned in a vehicle, but manufacturing it adds something like 10-15% more than manufacturing gasoline, which means a net negative to using it at all. E85 also reduces fuel mileage by around 25%, in some cases 35%+, versus gasoline, meaning even more carbon and toxic chemicals released. In short, the ethanol idea is a negative one. I will say the only carbon-neutral fuel source is wood, because if wood is left to decay it releases the same amount of carbon into the air as if it were burned, hence the term carbon neutral.


----------



## lacrossewacker

Quote:


> Originally Posted by *Rob27shred*
> 
> Obviously you don't get it; I never said it was incompetence or maliciousness by Nvidia at all. My point was that Nvidia is hurting PC gaming overall with its business tactics & by keeping GWs closed source. Also, just to be clear, I do not feel that Nvidia is doing it maliciously or making HW or GWs a resource hog on purpose; they are in it to make money, & closed source definitely ups the profit margins. However, I do feel that if they cannot get their GWs features to work decently on most higher-end GPUs, including their own, those features should not be included in games just yet. Also maybe, just maybe, if they made the GWs library open source we could get better-optimized versions of the features...
> 
> Sorry to rain on your fanboy parade, but I'm just a gamer looking to get the best performance I can out of my hardware & games. It makes no difference to me whether it is an AMD or Nvidia GPU I'm running or which one the game I'm playing is optimized for. Just calling it like I see it, & IMO GWs is something that hurts the overall performance of games on PC across different build types.


Resort to calling me a fanboy, sure. I have 0 Nvidia cards and 4? or 5? AMD cards (3 or 4 R9 280Xs and 1 R9 290).

I'm just shining a light on the fallacy of your argument: "HW can't even work well on Nvidia hardware," which is true. But when it's detrimental to AMD performance, it's somehow intentional?

Everything else you said I agree with. HW is a performance hog, period. I'm just tired of seeing the AMD-victim card being played.

As for "just don't include X feature because it's too hard to run on current hardware"... that's baloney. Throw in all the features you can; maybe we can enjoy them in all their glory next "gen". Unless you wanted Crytek to tame the Crysis 1 visuals for being too hard. (Just using this as a point, not that anybody would've actually wanted that!)


----------



## Klocek001

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Played 100 Hours. Maybe 20 mins was time killing monsters that had HairWorks. Also we are talking about people hair.


People's hair looks poop, HW or no HW.
But I remember fighting a red-furred fiend with HW on... just beautiful.


----------



## SpykeZ

http://www.dualshockers.com/2015/10/10/deus-ex-mankind-divided-director-console-ports-on-pc-is-disrespectful-direct-x12-detals-coming-soon/
Quote:


> Dugas believes that "console ports on PC is disrespectful" and they will make sure that PC version has features that console does not, to make it more distinct and separate. According to Dugas, it's "super important that the PC version is treated as such."


----------



## Thready

Still not upgrading to Windows 10 until they iron out the performance bugs when it comes to multitasking. I can't listen to music and edit photos in Lightroom at the same time without the music skipping around because Windows 10 can't handle multitasking whereas Windows 7 gives me no problems.

Which means I probably won't be able to play this game on DX 12 until I either upgrade my rig which I'm not going to do, or Windows 10 gets more efficient running multiple cores.


----------



## ImJJames

Quote:


> Originally Posted by *Thready*
> 
> Still not upgrading to Windows 10 until they iron out the performance bugs when it comes to multitasking. I can't listen to music and edit photos in Lightroom at the same time without the music skipping around because Windows 10 can't handle multitasking whereas Windows 7 gives me no problems.
> 
> Which means I probably won't be able to play this game on DX 12 until I either upgrade my rig which I'm not going to do, or Windows 10 gets more efficient running multiple cores.


Dual boot... it's 2015, bro.


----------



## Thready

Quote:


> Originally Posted by *ImJJames*
> 
> Dual boot... it's 2015, bro.


I do have an unused SSD sitting in my rig. I might give it a try.


----------



## dagget3450

Quote:


> Originally Posted by *Thready*
> 
> Still not upgrading to Windows 10 until they iron out the performance bugs when it comes to multitasking. I can't listen to music and edit photos in Lightroom at the same time without the music skipping around because Windows 10 can't handle multitasking whereas Windows 7 gives me no problems.
> 
> Which means I probably won't be able to play this game on DX 12 until I either upgrade my rig which I'm not going to do, or Windows 10 gets more efficient running multiple cores.


I've had something similar when overclocking; are you overclocked? Have you tried multitasking in Windows 10 with stock CPU clocks? Also, I've had another time where my USB sound was the problem and my onboard audio was fine. Just wondering if your issue is isolated to you? I don't think I've heard anyone else complain about this yet.


----------



## Thready

Quote:


> Originally Posted by *dagget3450*
> 
> I've had something similar when overclocking; are you overclocked? Have you tried multitasking in Windows 10 with stock CPU clocks? Also, I've had another time where my USB sound was the problem and my onboard audio was fine. Just wondering if your issue is isolated to you? I don't think I've heard anyone else complain about this yet.


No, it's all stock. I've already posted on this issue and I've spent a lot of time trying to figure it out. I've done a ton of research, and in the end I had to go back to 7.


----------



## rickcooperjr

Quote:


> Originally Posted by *Thready*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> I've had something similar when overclocking; are you overclocked? Have you tried multitasking in Windows 10 with stock CPU clocks? Also, I've had another time where my USB sound was the problem and my onboard audio was fine. Just wondering if your issue is isolated to you? I don't think I've heard anyone else complain about this yet.
> 
> 
> 
> No, it's all stock. I've already posted on this issue and I've spent a lot of time trying to figure it out. I've done a ton of research, and in the end I had to go back to 7.

Well, I know your issue: it is your 970A motherboard. It won't let your NB and HT run at proper clocks; only a 990FX board will. Proper stock clocks for your processor are 2200 MHz NB and 2600 MHz HT, but with the 970A motherboard you are running something like 1800-2000 MHz NB and around 2200-2400 MHz HT. This trips up Windows 10, specifically the power-saving section of the OS, because those aren't the proper settings for the FX 9000 / 8300 / 6300 / 4300 series. If you had gone with a 990FX board you would also have gotten better performance in general.

On top of that, the 970A motherboard was never designed to run the FX 8000 series in the first place; the power delivery on that board is way too small, which causes throttling and cores being randomly shut off (AKA parked) due to power constraints the motherboard can't supply. The 970A was designed for at most the 6000-series CPUs, and even then it was dicey, to say the least.

We already know a stock FX 8350 under load can pull around 150 W or more, and your 970 board has zero cooling on the VRMs. Do the math: that is like running your car without a radiator or without coolant, and VRMs overheating and throttling the CPU back is the outcome.

I will also point out that even on a 990FX motherboard you need to set your NB and HT manually; there is a bug where it will underclock them, and this robs performance right out of the gate. Anybody with a 990FX / 970A board can see this by running something like HWiNFO or AMD OverDrive. NB is supposed to be 2200 MHz and HT 2600 MHz; that is the OEM frequency directly from AMD for the FX 4300 / 6300 / 8300 / 9000 series.


----------



## Thready

Quote:


> Originally Posted by *rickcooperjr*
> 
> Well, I know your issue: it is your 970A motherboard. It won't let your NB and HT run at proper clocks; only a 990FX board will. Proper stock clocks for your processor are 2200 MHz NB and 2600 MHz HT, but with the 970A motherboard you are running something like 1800-2000 MHz NB and around 2200-2400 MHz HT. This trips up Windows 10, specifically the power-saving section of the OS, because those aren't the proper settings for the FX 9000 / 8300 / 6300 / 4300 series. If you had gone with a 990FX board you would also have gotten better performance in general.
> 
> On top of that, the 970A motherboard was never designed to run the FX 8000 series in the first place; the power delivery on that board is way too small, which causes throttling and cores being randomly shut off (AKA parked) due to power constraints the motherboard can't supply. The 970A was designed for at most the 6000-series CPUs, and even then it was dicey, to say the least.
> 
> We already know a stock FX 8350 under load can pull around 150 W or more, and your 970 board has zero cooling on the VRMs. Do the math: that is like running your car without a radiator or without coolant, and VRMs overheating and throttling the CPU back is the outcome.
> 
> I will also point out that even on a 990FX motherboard you need to set your NB and HT manually; there is a bug where it will underclock them, and this robs performance right out of the gate. Anybody with a 990FX / 970A board can see this by running something like HWiNFO or AMD OverDrive. NB is supposed to be 2200 MHz and HT 2600 MHz; that is the OEM frequency directly from AMD for the FX 4300 / 6300 / 8300 / 9000 series.


Yeah, I got it because it was cheap, and I really don't feel like spending any money upgrading it either. I don't like to spend a lot of money on computers.


----------



## Evil Penguin

Quote:


> Originally Posted by *y2kcamaross*
> 
> And now it's not launching with DX12 support, ugh


"Our teams are working hard to complete the final push required here though, and we expect to release DX12 support on the week of September 5th!"

That's not too bad.

I would much rather they support Vulkan, though.


----------



## sumitlian

Quote:


> Originally Posted by *y2kcamaross*
> 
> And now it's not launching with DX12 support, ugh


Hey, hello!? Nvidia needs to be at the top of the market, okay? And to do so, the Deus Ex launch shall be held on the DX11 side.
Yay, more Nvidia GPU sales.

Of course, take it as a conspiracy theory.


----------



## EightDee8D

Quote:


> Originally Posted by *sumitlian*
> 
> Hey, hello!? Nvidia needs to be at the top of the market, okay? And to do so, the Deus Ex launch shall be held on the DX11 side.
> Yay, more Nvidia GPU sales.
> 
> Of course, take it as a conspiracy theory.


If a man can think like that while sitting in front of a PC, then a paranoid billionaire CEO like J.H.H. can surely do it.

Cuz you know, something something business, something something profit, something something margins, and all that mumbo jumbo.


----------



## sumitlian

Quote:


> Originally Posted by *EightDee8D*
> 
> If a man can think like that while sitting in front of a PC, then a paranoid billionaire CEO like J.H.H. can surely do it.
> 
> Cuz you know, something something business, something something profit, something something margins, and all that mumbo jumbo.


Exactly.

Nvidia: "We are filthy rich and have 0 debt. Hmm, where else can we pour some money, besides pursuing negative DX12 results even with $3 billion in R&D?"

J.H.H.: "I've got an idea! You have 10 minutes to give me the list of upcoming game launch dates."


----------



## zealord

Delayed? Check.
Heavily advertising DX12, but in reality not releasing with DX12 at launch? Check.

At least they are not going the Resident Evil Umbrella Corps or Metal Gear Solid Survive route.


----------



## sumitlian

Quote:


> Originally Posted by *zealord*
> 
> Delayed? Check.
> Heavily advertising DX12, but in reality not releasing with DX12 at launch? Check.
> 
> At least they are not going the Resident Evil Umbrella Corps or Metal Gear Solid Survive route.


Or maybe Illuminati got them. I swear I saw Lucifer's eye in the trailer.


----------



## Defoler

Quote:


> Originally Posted by *Rob27shred*
> 
> Lol, because AMD does not lock out team green through driver coding so TressFX runs just as well on a red or green GPU.


Actually, they have. When TressFX came out it only worked on AMD, and AMD said it was open source, but they only released the code three years later.
The only way Nvidia was able to make it work on their hardware was to work with Crystal Dynamics (who actually developed the TressFX code) for a couple of weeks until they found a way to make it run on Nvidia hardware (there were changes to both game code and drivers to make it work).

The difference is that Nvidia never said "open source, free to use" in the first place.
Quote:


> Originally Posted by *Dudewitbow*
> 
> Because it was only used in one game. And despite an ugly start (due to Nvidia not getting the source code earlier), the FPS hit after that issue was resolved was minimal relative to the competition, so any user could still enjoy it. We as users never got to see the TressFX 2.0 version, and we're waiting on this to see 3.0. As long as AMD keeps TressFX at a better graphics/performance-loss ratio for all users (AMD, Nvidia, and Intel, plus the <0.01% who may still have something obscure like a Matrox), users will be more content with it relative to the GameWorks implementation.
> 
> edit: Rise of the Tomb Raider, IIRC, will get it too


TressFX (unlike HairWorks) is only active on the main character, while HairWorks is active on every character, animal, etc.
So it is not that TressFX has a better graphics/performance-loss ratio; it is just doing less work.
On Witcher 3 there are settings to run the hair only on the main character, and the performance cost is just like TressFX in Tomb Raider.

On Rise of the Tomb Raider, Crystal Dynamics had to work with Nvidia before release to make sure TressFX worked well (after what they learned from the first Tomb Raider). The TressFX code heavily favors AMD and needed to be altered to be compatible with Nvidia (so the hair would stop trying to fly upward, like it did on the original Tomb Raider before the big patches).

As for which games have TressFX: well, the Tomb Raider games, as TressFX was developed by Crystal Dynamics. Since Square Enix owns them, giving it to Eidos (also owned by Square Enix) makes sense.


----------



## Unkzilla

Nvidia conspiracy theories and the delay aside, the bigger question here is around the DX12 implementation.

If it's not native / not coming at release, is the DX12 stuff just a wrapper tacked on, similar to ROTTR?

Given that it was announced so long ago, I was under the assumption it was DX12 from the ground up.


----------



## jologskyblues

AMD Gaming Evolved. Nuff said.


----------



## LoLomgbbq

Quote:


> Originally Posted by *y2kcamaross*
> 
> And now it's not launching with DX12 support, ugh


And?

What has this OS-limited API done for either AMD or NV thus far?


----------



## Defoler

Quote:


> Originally Posted by *LoLomgbbq*
> 
> And?
> 
> What has this OS limited api done for either amd or nv thus far?


Actually, DX12 is what has been selling AMD cards for the last year.
Without it, their cards are just underpowered cards being trampled by Nvidia.

AMD has been heavily advertising its cards as the superior choice because of async and DX12.


----------



## Glottis

All this unnecessary drama and whining just because one of the rendering APIs is delayed by two weeks. The game will be 100% playable in DX11, you know?


----------



## Lass3

Quote:


> Originally Posted by *Assirra*
> 
> While that is true for Geralt, HW looks absolutely amazing on animals/monsters.


I agree.


----------



## jologskyblues

Quote:


> Originally Posted by *Glottis*
> 
> All this unnecessary drama and whining just because one of rendering APIs is delayed 2 weeks. Game will be 100% playable in DX11 you know?


Gasp! DX11?!?!?! Heresy! The delay for the DX12 patch is all Ngreedia's fault. They must have paid off Square Enix somehow.


----------



## Offler

Quote:


> Originally Posted by *Lass3*
> 
> I agree.


With The Witcher 3 I am waiting for some sort of gold edition with all DLCs, but man... the wolf definitely looks much, much better with fur. But the gryphon...

I don't see much difference on the beast, but please explain to me: why in the hell do you require GameWorks to display distant clouds?


----------



## sumitlian

Quote:


> Originally Posted by *Glottis*
> 
> All this unnecessary drama and whining just because one of rendering APIs is delayed 2 weeks. Game will be 100% playable in DX11 you know?


LOL, since when did OCNers start to talk about "playable"? Where did all the enthusiasm fade away all of a sudden?


----------



## sumitlian

Quote:


> Originally Posted by *Offler*
> 
> With Witcher 3 I am waiting for some sort of gold edition with all DLCs, but man... the wolf definitely looks much much better with fur. But the gryphon...
> 
> I dont see much difference on the beast, but explain me please.. .why in the hell you require Gameworks to display distant clouds?











And here GTA V's clouds don't seem to like DirectX 10 at all.




Looks like developers are implying DirectX 10 isn't able to render grass as well.


----------



## Klocek001

Quote:


> Originally Posted by *Offler*
> 
> With Witcher 3 I am waiting for some sort of gold edition with all DLCs, but man... the wolf definitely looks much much better with fur. But the gryphon...
> 
> I dont see much difference on the beast, but explain me please.. .why in the hell you require Gameworks to display distant clouds?


take that griffin head as a trophy and strap to your saddle. then get on your horse and ride. compare hw on and off. hairworks isn't about static shots only. If it was I'd probably just turn it off.


----------



## Offler

Quote:


> Originally Posted by *sumitlian*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here GTA V's clouds don't seem to like DirectX 10 at all.
> 
> Looks like developers are implying DirectX 10 isn't able to render grass as well.


Well, in the case of DirectX 10 I consider the technology well dead and well buried, and I can understand cases where people don't bother to do anything except the very basic rendering. But clouds can always be faked with a static skybox texture, even if you make your game compatible with DirectX 7.0 for some reason. In the case of The Witcher 3, though, I expected compatibility in DirectX 9.0, 10 and definitely 11. There are ways to do it without any such gimmick in each of those. :D

To Klocek001:
I can imagine that (I had a TressFX demo from Tomb Raider, so I have an idea how it looks in motion). But I am almost sure that you don't need that for dynamic clouds...


----------



## Defoler

Quote:


> Originally Posted by *Klocek001*
> 
> take that griffin head as a trophy and strap to your saddle. then get on your horse and ride. compare hw on and off. hairworks isn't about static shots only. If it was I'd probably just turn it off.


Yeah.
The movement of hair in witcher 3 is absolutely amazing, even if the main character hair looks off sometimes, the added value to the overall feel of the game is amazing.

Comparing animal hair between rise of the tomb raider and witcher 3, on the witcher 3 the quality is so far superior. Of course it also affects performance, but you get more out of it.


----------



## LoLomgbbq

Quote:


> Originally Posted by *Defoler*
> 
> Actually DX12 is what has been selling AMD cards for the last year.
> Without it, their cards are just underpowered cards being trumped on by Nvidia.
> 
> AMD have been heavily advertising their cards as the superior choice because of async and DX12.


That's not what I asked.

What has DX12 done so far in games?

AMD may be selling more cards, but that doesn't have an effect on DX12.


----------



## Klocek001

Quote:


> Originally Posted by *LoLomgbbq*
> 
> What dx12 done so far in games?


it brought us the visually astonishing quantum break


----------



## sumitlian

Quote:


> Originally Posted by *LoLomgbbq*
> 
> Thats not what i asked.
> 
> What dx12 done so far in games?
> 
> Amd maybe selling more cards, but that doesnt have an affect on dx12.


Well, we have seen GPUs getting utilized more efficiently, and we sure have seen new eye candy as well. Play Quantum Break or Forza Motorsport 6, for example, and you'll see how even an i3 is almost able to fully feed the GPU most of the time.

Apart from that, developers did see something in DX12, and it seems they know very well why they are choosing it over DX11.

I know where you're coming from; it would have been much better if it were supported on Win7/8/8.1 as well. And I am sad about that too. But what else can you do.


----------



## Glottis

Quote:


> Originally Posted by *Klocek001*
> 
> it brought us the visually astonishing quantum break


I'm not sure if you are being sarcastic or not, but Quantum Break is coming to Steam with DX11, proving again that DX12 is nothing but a gimmick, mostly to make people buy/upgrade to Win10.


----------



## Defoler

Quote:


> Originally Posted by *LoLomgbbq*
> 
> Thats not what i asked.
> 
> What dx12 done so far in games?
> 
> Amd maybe selling more cards, but that doesnt have an affect on dx12.


When truly utilized, you get better transparent textures, more computation with async, faster rendering times, fewer driver dependencies.

DX12 has a lot to offer, if really utilized.
Right now, actual games are utilizing very little of DX12. That fact doesn't mean it doesn't have something to offer. Since it is so developer-dependent, and developers are too tied to schedules and so are rushing things, it will take time for them to finally utilize everything.

AMD performance increases because of DX12 when it's used. That is basically what we get right now.


----------



## Glottis

Quote:


> Originally Posted by *Defoler*
> 
> AMD performance increases because of DX12 when used. That is basically what we get right now.


Not sure why people still completely ignore Pascal and say that it's only AMD that has gains.


----------



## sumitlian

Quote:


> Originally Posted by *Glottis*
> 
> i'm not sure if you are being sarcastic or not
> 
> 
> 
> 
> 
> 
> 
> but quantum break is coming to steam and with dx11, proving again that dx12 is nothing but a gimmick mostly to make people buy/upgrade to win10.


You sure are misunderstanding something here. Developers wanted more control/multithreading on the GPU and CPU sides, since GPU power keeps increasing at an enormous rate and CPUs could not feed them (CPU performance is not increasing much), so an efficient API was created that they can work with.
That Microsoft then made the new API a Windows 10 exclusive is another thing. Though I secretly agree with you to some extent on DX12 being a gimmick just because MS wanted us to install Win10.

Why do people not realize that we should not need Microsoft AT ALL to enjoy GPUs, video gaming, graphics rendering, or the graphics API market? Since people never realize it, here we are, stuck with Direct3D.


----------



## Ha-Nocri

Ofc if you own NV you are against any innovations, but if you own AMD GPU you want the future now.

I'm just sad we won't have DX12 from the start.


----------



## sumitlian

Glottis,
One more thing: we can't blame DX12 for this. The Skylake/Intel 6th gen platform is not fully supported on Windows 7 either, because they stopped making drivers for it. Intel has stopped making xHCI drivers for Windows 7 on the new platform. I don't know of any hacks around that, but it is now officially not possible to install Windows 7 from a USB drive. Microsoft and Intel have already announced that future CPU platforms will be completely incompatible with Windows 7. I am afraid AMD will have to follow that path too.









Edit: Even if we didn't have anything like DX12, Microsoft would have eventually gotten us to install Windows 10 one way or another.


----------



## Klocek001

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Ofc *if you own NV you are against any innovations*, but if you own AMD GPU you want the future now.
> 
> I'm just sad we won't have DX12 from the start.


No, if you own AMD there's an 80% chance you bought a used GPU from a mining PC and haven't actually supported your GPU manufacturer with any real money for years, yet you expect your beloved company to overcome a company that sells 5x more new cards than AMD does, including $700-$1200 ones.


----------



## mcg75

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Ofc if you own NV you are against any innovations,


Do you mean the same Nvidia that innovated G-Sync, which forced AMD's hand into creating the FreeSync you have listed in your sig?

I've heard some ridiculous things over the years here but that comment takes the win by far.

I'm not going to argue who has invented what or the most but to make a blanket statement about people like that is very, very poor.


----------



## Klocek001

lol I wonder what his share in these innovations is with a second hand 290

I never realized people can be just so plain stupid.

Nvidia developed their own HBAO, PCSS, TXAA, HairWorks, God Rays, PhysX and probably a lot more effects years ago, when all they could use was DX11. They didn't tell you to wait till 2017 for it; they found a way to implement it all into existing games in 2012. Now they're working on voxel-based lighting and ambient occlusion which blows what we see in all the DX12 AMD games so far out of the water.
AMD owners, have you seen how clothes/fur/hair move during a fight in The Witcher 3 with GameWorks on? No, because you own the most innovative things as far as GPUs go, which gave you nothing of this sort.


----------



## LoLomgbbq

Quote:


> Originally Posted by *Klocek001*
> 
> it brought us the visually astonishing quantum break


Ah no, no it didn't.

Besides, it runs piss-poor even with DX12.


----------



## Glottis

Quote:


> Originally Posted by *sumitlian*
> 
> You sure are misunderstanding something in here. Developers wanted more control/multithreading on GPU and CPU sides since GPU power keeps increasing at enormous rate and CPUs could not feed them because CPU performance is not increasing much, they created a efficient API so that they can play with it.
> Now Microsoft turns out to make the new API for Windows 10 exclusive is another thing. Though I secretly agree with you to some extent on DX12 being gimmick just because MS wanted us to install Win10.
> 
> Why do people not realize that we should not feel the need for Microsoft AT ALL to enjoy GPUs, video gaming, graphics rendering, graphics API market ? Since people do not realize it ever, here we are stuck in Direct3D.


Quantum Break started out as "this game is only possible thanks to DX12 and Windows 10 technologies". A few months of lackluster sales on the WinStore, and now we are getting it on Steam, Windows 7 etc. with DX11. I'm not denying DX12 has many benefits, but they aren't as huge as MS makes them out to be. Let me put this another way: MS cares about DX12 so much because of exclusivity, which is an incentive for people to get Win10 and buy software from their primitive WinStore.


----------



## PlugSeven

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Ofc if you own NV you are against any innovations, but if you own AMD GPU you want the future now.
> 
> I'm just sad we won't have DX12 from the start.


After witnessing the rather petty and delicate disposition of the Nvidia user base with games like AotS and Hitman, I think it's a prudent move to give them the API that gets the most performance out of their hardware. If AMD were to come out rocking it harder here at the start because of DX12, sales of the game may very well falter.


----------



## Glottis

Quote:


> Originally Posted by *PlugSeven*
> 
> After witnessing the rather petty and delicate disposition of the nvidia user base with games like AOtS and Hitman, I think it's a prudent move to give them the API that gives the most performance out of their hardware. If AMD were to came out rocking it harder here at the start because of dx12, sales of their game may very well falter.


AotS is just a mediocre game from a niche RTS subgenre, and Hitman was an unoptimized, always-online trainwreck with shoddy business practices. All the negativity surrounding Hitman doesn't even have anything to do with Nvidia or AMD.

Do you even know or care why AMD appears to have such a performance boost in DX12? That's because AMD had extremely poor performance under DX11, whereas Nvidia had near-peak performance in DX11.

Also, Nvidia already has good gains in async with Pascal, which was released months ago, yet people with red glasses are still in complete denial about this fact.


----------



## EightDee8D

Quote:


> Originally Posted by *Glottis*
> 
> Also, Nvidia already has good gains in Async with Pascal that was released months ago, yet people with red glasses are still in complete denial about this fact.


One example of a *game* with async on and off, please.


----------



## Glottis

Quote:


> Originally Posted by *EightDee8D*
> 
> One example of a *game* with async on and off. please.


I know you are trolling, but still, I'll bite. The only game that lets you toggle async is AotS, and it shows gains with Pascal GPUs. Read the 1080 reviews. You'll have to ask game developers to allow toggling async in all games, not me.


----------



## Ha-Nocri

There are so few DX12/Vulkan games, 10 or so; that is nothing, and they are all just benchmarks.


----------



## ToTheSun!

Quote:


> Originally Posted by *LoLomgbbq*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klocek001*
> 
> it brought us the visually astonishing quantum break
> 
> 
> 
> Ah no, no it didnt.
> 
> Besides, it runs piss poor even with dx12.

Pretty sure he was being sarcastic.


----------



## sumitlian

Quote:


> Originally Posted by *Glottis*
> 
> Do you even know or care why AMD appears to have such a performance boost in DX12? That's because AMD had extremely poor performance under DX11, whereas Nvidia had near-peak performance in DX11.


I agree with most of what you recently said, except for this; it is an extremely exaggerated topic. You can cherry-pick benches where AMD GPUs look extremely slow, and I can cherry-pick benches in return where AMD GPUs are not that much slower. Yes, AMD Fiji seems to have some architectural problem such that maybe it can't ever be fully utilized, who knows. Apart from that, other AMD GPUs are nowhere near as bad in DX11 as people claim; in fact, I can show you exactly the opposite in many cases, but since benches vs. benches would be a never-ending war, I refrain from doing so.


----------



## EightDee8D

Quote:


> Originally Posted by *Glottis*
> 
> i know you are trolling but still i'll bite. the only game that allows to toggle Async is AotS and it shows gains with Pascal GPUs. Read 1080 reviews. you'll have to ask game developers to allow to toggle Async in all games, not me.


So, nothing to back up your so-called *fact*, but I'm the one trolling here?







It doesn't show any gains because the async path is still disabled for Pascal, iirc. And whatever single-digit % gain you saw on Pascal was within the margin of error.

If you don't know, RotR has a registry hack to toggle async on/off, but as far as I know it doesn't show any gains on Nvidia. I haven't paid much attention to it, so I don't know the current situation. But if Nvidia gains anything, they will make sure everyone and their mother knows about it.


----------



## Glottis

Quote:


> Originally Posted by *EightDee8D*
> 
> so nothing to backup your so called *fact* but i'm the one trolling here ?
> 
> 
> 
> 
> 
> 
> 
> it doesn't show any gains because async path is still disabled for pascal iirc. and whatever single digit % gain you saw on pascal was within margin of error.
> 
> if you don't know, rotr has registry hack for on/off async, but as far as i know it doesn't show any gains on nvidia. i haven't paid much attention to it so don't know current situation. but if nvidia gains anything they will make sure everyone and their mother will know about it.


I already said Pascal shows gains with async in AotS and also in Time Spy. Can't you read or what? If you can't be bothered paying much attention, why are you demanding so much accuracy and attention from me?


----------



## PlugSeven

Quote:


> Originally Posted by *Glottis*
> 
> AotS is just a mediocre game from a niche RTS subgenre and Hitman was an unoptimized, always-online trainwreck with shoddy business practices. All the negativity surrounding Hitman doesn't even have anything to do with Nvidia or AMD.


Nothing to do with Nvidia or AMD, yes; that's why I mentioned a certain user base. Go read the related game threads to see my point.
Quote:


> Do you even know or care why AMD appears to have such a performance boost in DX12? That's because AMD had extremely poor performance under DX11, whereas Nvidia had near-peak performance in DX11.


Nice oversimplification.
Quote:


> Also, Nvidia already has good gains in Async with Pascal that was released months ago, yet people with red glasses are still in complete denial about this fact.


iD are waiting on that async compute for their Vulkan path on Nvidia cards, including Pascal...


----------



## EightDee8D

Quote:


> Originally Posted by *Glottis*
> 
> i already said Pascal shows gains with Async in AotS and also in Time Spy. can't you read or what? if you can't be bothered paying much attention why are you demanding so much accuracy and attention from me?


Because you made BS claims and can't back them up. Just show me the so-called *GOOD* gains on Pascal with async.
Quote:


> Also, Nvidia already has good gains in Async with Pascal that was released months ago, yet people with red glasses are still in complete denial about this fact.


So, again: just one example of it. Of an actual game.
Quote:


> Originally Posted by *Klocek001*
> 
> cause this is ocn and there's a group of ppl who think they're computer scientists but probably belong to the kindergarten.


Nobody understands that better than you, man.


----------



## Klocek001

That is not an oversimplification at all. Digital Foundry measured that the 1080 has a 12% gain from Vulkan while the Fury X has 40%. But the Fury X was slower than a 980 in OpenGL to begin with, while the 1080 already ran plenty fast.
Quote:


> Originally Posted by *EightDee8D*
> 
> Nobody understands that better than you man,


lol you still don't understand that I don't pretend to.
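The baseline point made above can be checked with some quick arithmetic. This is a toy calculation: the fps figures below are hypothetical, chosen only to mirror the quoted ~40% (Fury X) and ~12% (GTX 1080) Vulkan gains, and are not from any benchmark:

```python
# A bigger percentage gain does not imply a faster card: the baseline matters.
# All fps figures are invented for illustration.

fury_ogl_fps = 60.0      # hypothetical OpenGL baseline for the Fury X
gtx1080_ogl_fps = 100.0  # hypothetical, much higher baseline for the 1080

fury_vulkan_fps = fury_ogl_fps * 1.40        # +40% relative gain
gtx1080_vulkan_fps = gtx1080_ogl_fps * 1.12  # +12% relative gain

# The Fury X gains far more in relative terms...
assert (fury_vulkan_fps / fury_ogl_fps) > (gtx1080_vulkan_fps / gtx1080_ogl_fps)
# ...yet can still end up slower in absolute fps.
print(round(fury_vulkan_fps), round(gtx1080_vulkan_fps))  # 84 112
```

So both sides of the argument can be right at once: a large relative gain says the old code path was leaving performance on the table, not that the card ends up faster.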


----------



## EightDee8D

Quote:


> Originally Posted by *Klocek001*
> 
> that is not an oversimplification at all. digitalfoundry measured 1080 has 12% gain from Vulkan and Fury X has 40%. But Fury X was slower than 980 in OpenGL to begin with, while already 1080 run plenty fast.


I forgot amd only makes Fury/x.

Other GPUs weren't suffering in OpenGL as much as Fiji. So yes, that was an oversimplification.


----------



## Klocek001

Quote:


> Originally Posted by *EightDee8D*
> 
> I forgot amd only makes Fury/x.
> 
> other gpus weren't suffering on ogl as much as fiji. so yes, that was an oversimplification.


The RX 480 is on par with the 970 in the OpenGL benches of Doom, so I guess you must be right. So much scientific evidence on your side. Wow.
AMD fanboys are just impossible to talk to when it comes to DX12; heads so big they show up on radar.


----------



## EightDee8D

Quote:


> Originally Posted by *Klocek001*
> 
> rx480 is on par with 970 in OpenGL benches of Doom, so I guess you must be right. So much scientific evidence on your side. Wow.


The 480 gains 46%, the 390 gains 48%, and the Fury gains 66% (meaning it was suffering more than those two GPUs); enough evidence, I guess? (at 1080p)

https://www.computerbase.de/2016-07/doom-vulkan-benchmarks-amd-nvidia/
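For anyone wondering how figures like "gains 46%" are derived from a benchmark: it is just the relative change between the OpenGL and Vulkan frame rates. A minimal sketch, where the fps values are made up purely to show the arithmetic and are not taken from the linked test:

```python
# Relative gain between two runs of the same card on the same scene:
#   gain % = (after_fps / before_fps - 1) * 100

def gain_percent(before_fps, after_fps):
    return (after_fps / before_fps - 1.0) * 100.0

# Hypothetical numbers: 50 fps under OpenGL, 73 fps under Vulkan.
print(round(gain_percent(50.0, 73.0), 1))  # 46.0 -> reported as "a 46% gain"
```

Note this is always relative to the card's own baseline, which is why gain percentages alone can't be compared across cards.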


----------



## sumitlian

Quote:


> Originally Posted by *Glottis*
> 
> i already said Pascal shows gains with Async in AotS and also in Time Spy. can't you read or what? if you can't be bothered paying much attention why are you demanding so much accuracy and attention from me?


Correct me if I'm wrong. Pascal can't do truly asynchronous processing, which is more like independent processing where time is not shared between the various workloads; instead, Pascal does context switching, which is basically time-sliced multitasking where time is shared. Theoretically, as I understand it, context switching through the driver is always prone either to microstuttering (due to the inherent delay of a context switch) or to rendering slowdowns, i.e. low fps (what we have seen in Vulkan on Nvidia GPUs as late texture pop-up in many cases), if a larger part of the workload was designed to be submitted asynchronously.

Remember why some things were not being rendered by the GTX 1080 in AotS DX12. You can easily guess why Nvidia disabled it in their driver.
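The distinction being described (truly concurrent queues vs. time-sliced context switching) can be sketched with a toy scheduling model. This is purely illustrative with invented numbers; it is not how any real GPU driver or scheduler actually works:

```python
# Toy model: time to finish one graphics task and one compute task
# (a) on independent queues that run concurrently (async compute), and
# (b) by alternating between them in fixed time slices, paying a small
#     cost on every context switch. All numbers are invented.

def async_total_ms(graphics_ms, compute_ms):
    # Independent queues overlap, so total time is just the longer task.
    return max(graphics_ms, compute_ms)

def context_switch_total_ms(graphics_ms, compute_ms,
                            slice_ms=1.0, switch_cost_ms=0.1):
    # Time-sliced: the two tasks share the hardware serially.
    remaining = [graphics_ms, compute_ms]
    total = 0.0
    current = 0
    while any(r > 0 for r in remaining):
        if remaining[current] > 0:
            work = min(slice_ms, remaining[current])
            remaining[current] -= work
            total += work
            if any(r > 0 for r in remaining):
                total += switch_cost_ms  # pay for the next context switch
        current = 1 - current  # alternate between the two tasks
    return total

print(async_total_ms(8.0, 6.0))                     # 8.0 (compute is hidden)
print(round(context_switch_total_ms(8.0, 6.0), 1))  # 15.3 (serialized + overhead)
```

In the toy model, overlapping the work hides the compute task entirely, while time-slicing pays for both tasks plus the accumulated switch overhead, which is roughly the stutter/slowdown trade-off described above.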
Quote:


> Originally Posted by *Klocek001*
> 
> that is not an oversimplification at all. digitalfoundry measured 1080 has 12% gain from Vulkan and Fury X has 40%. But Fury X was slower than 980 in OpenGL to begin with, while already 1080 run plenty fast.
> lol you still don't understand that I don't pretend to.


Nobody here is claiming to be a scientist. Either you are taking the speculation from that group of people too seriously and personally, or you are just frustrated.


----------



## Defoler

Quote:


> Originally Posted by *Glottis*
> 
> Not sure why people still completely ignore Pascal and say that it's only AMD that has gains.


Not saying Pascal doesn't gain from it, but AMD are gaining the most from it. While Pascal gains a few %, AMD gains more, because hardware-wise they have somewhat better queue handling for async compute calls.

You can't take away from AMD the fact that in some games they can gain 10-15% extra performance just by switching to DX12, while Pascal gains maybe 5-6% in most cases, if at all.
Their hardware right now is more concentrated on DX12.

Of course Pascal is a beast overall, and has a lot more raw power to handle the DX12 calls just fine. But again, percentage-wise, DX12 works quite well with AMD. The fact that their cards are still underpowered, and the gain brings them closer up to speed but not close enough, is another matter. A current owner of the last two AMD generations will gain more from DX12.


----------



## Klocek001

Quote:


> Originally Posted by *EightDee8D*
> 
> 480 gains 46%, 390 gains 48%, *fury gains 66%* ( means it was suffering more than those 2 gpus), enough evidence i guess ? (on 1080p)
> 
> https://www.computerbase.de/2016-07/doom-vulkan-benchmarks-amd-nvidia/


The Fury X is equal to a GTX 970 in OpenGL in that test.
Tell me one thing: are you here to advocate for AMD or to embarrass yourself?


----------



## EightDee8D

Quote:


> Originally Posted by *Klocek001*
> 
> fury x is equal to gtx 970 in OpenGL in that test.
> Tell me one thing: are you here to advocate amd or embarass yourself ?


You asked me for evidence to prove what I said, and I said "other GPUs weren't suffering in OpenGL *as much as* Fiji". A 66% vs. 48% gain proves my point.

Now you tell me one thing: do you have short-term memory loss or something?


----------



## Semel

*Defoler.
*

Quote:


> So it is not that tressfx has better graphics/performance loss, it is just doing less work.


Wrong. Enabling HairWorks only on Geralt still cripples performance much more compared to TressFX 3.0.

Quote:


> after they learned from the first tomb raider


What was wrong with the first one? I had an Nvidia GPU; TressFX worked just fine after some patching. It had a more serious performance hit than TressFX 3.0, but no more than CrapWorks' pubic hair performance hit in 2016(!) on an AMD GPU.

And unlike Nvidia's pubic hair 2016 edition, TressFX 3.0 runs great on AMD AND Nvidia this time.
Quote:


> as tressfx was developed by crystal dynamics


It was not. They developed it with AMD.

Later they made a custom version of TressFX 3.0 and called it PureHair (you can't have TressFX mentioned in an Nvidia game). But it is still the same TressFX 3.0; it is still called TressFX in the registry and config files.


----------



## Klocek001

I've never seen TressFX/PureHair in game myself, but from what I saw in the video, compared to what I see in The Witcher 3, AMD's hair method seems to thin the hair slightly while HairWorks makes it visibly thicker. With HW on, Geralt looks like he's from a friggin' Schwarzkopf commercial.
The animation on both is equally good though.


----------



## Ha-Nocri

Yes, b/c humans have such thick hair


----------



## Klocek001

Well, if AMD is using very low tessellation, then it's not really a surprise there's going to be a visible difference.


----------



## Newbie2009

I notice No Man's Sky (an indie game) is €10 more expensive on Steam than Deus Ex: Mankind Divided, which is bizarre.

Apparently Edge magazine has given Deus Ex a 9/10, which is a great sign.


----------



## BIGTom

Earlier today, GameGPU has posted performance Benchmarks for Deus Ex: Mankind Divided

GameGPU Deus Ex: Mankind Divided GPU test

DIRECTX 11



DIRECTX12


----------



## Glottis

There seems to be massive performance difference between Ultra and High presets but hardly any picture quality difference.


----------



## FLCLimax

Such low performance for SLI and CF @ 1080p.


----------



## LoLomgbbq

Looks like a buy-it-on-sale going by those benches.

No reason for that performance with those visuals.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Glottis*
> 
> There seems to be massive performance difference between Ultra and High presets but hardly any picture quality difference.


I've been telling people that Ultra settings are pointless these days. The visual delta is too small to justify sacrificing so much performance.


----------



## Klocek001

When all the DX12/async hype blew up back in 2014/15, this was supposed to be the showcase game for GCN. A Fury X with 45 fps @ 1080p. Well done devs, you've outdone the worst GameWorks games. A GTX 1080 for 60 fps @ 1080p; I'm nauseous from this DX12 crap, same as with Quantum Break. What a shame.


----------



## Fancykiller65

As far as I'm aware, drivers aren't out yet for Deus Ex from either AMD or Nvidia. Isn't it early to be pissed about graphics?


----------



## Woundingchaney

Quote:


> Originally Posted by *Fancykiller65*
> 
> As far as I'm aware, drivers aren't out yet for Deus Ex from either AMD or Nvidia. Isn't it early to be pissed about graphics?


Drivers for Nvidia released Tuesday or Wednesday I believe.


----------



## LoLomgbbq

Game Ready drivers for NV have already been released.
Quote:


> -though we're running the latest Nvidia drivers, which say they're Game Ready for the game.


http://www.pcworld.com/article/3109519/software/deus-ex-mankind-divided-review-impressions-human-evolution.html

This driver - 372.54:
Quote:


> Game Ready
> Provides the optimal experience for No Man's Sky, Deus Ex: Mankind Divided, Obduction, F1 2016, and the Open Beta for Paragon


http://www.nvidia.com/download/driverResults.aspx/105847/en-us


----------



## Semel

*Glottis*
Quote:


> but hardly any picture quality difference.


Invisible tessellation?

PS: It looks like the game is poorly optimized... which is not surprising at all if you consider what engine the Dawn Engine is based on...


----------



## boredgunner

Don't really care about the delayed DX12, it's not going to be significant for anyone anyway.

The fact that this game has only one hub is disappointing though. No matter how detailed and large it is, it can't be as diverse as the previous games in this regard. It might also mean the game is shorter. You still visit many different places, but doing so in a mission is nothing like doing so in a non-combat hub with NPC interaction and so much more exploration.


----------



## FLCLimax

As someone who played the original Deus Ex, these games make me sad.


----------



## Ha-Nocri

The poor DX12 performance is probably the reason it got delayed. I won't play the game until DX12 is ironed out, like I did with Doom and Tomb Raider.


----------



## Semel

Quote:


> Originally Posted by *Ha-Nocri*
> 
> until DX12 is ironed out, like I did with Doom and Tomb Raider


RoTTR still has problems performance-wise. Unlike Doom.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Semel*
> 
> RoTTR still has problems performance-wise. Unlike Doom.


True, although the 18.8 AMD drivers improved DX12 performance noticeably. AMD says 10%, but it seems to be even more.


----------



## ZealotKi11er

Game Ready drivers are a marketing scam.


----------



## LoLomgbbq

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Game Ready drivers are a marketing scam.


just like dx12


----------



## NightAntilli

Quote:


> Originally Posted by *BIGTom*
> 
> Earlier today, GameGPU has posted performance Benchmarks for Deus Ex: Mankind Divided
> 
> GameGPU Deus Ex: Mankind Divided GPU test
> 
> DIRECTX 11
> 
> 
> 
> DIRECTX12


Ok. That, I did not expect. Fury X beats 1070 in DX11 but loses significantly in DX12? What is this sorcery?


----------



## LoLomgbbq

Quote:


> Originally Posted by *NightAntilli*
> 
> Ok. That, I did not expect. Fury X beats 1070 in DX11 but loses significantly in DX12? What is this sorcery?


AMD doesn't have a driver out for it? Though this is an AMD game, so that's a bit odd. Maybe a Vulkan patch will come along.


----------



## NightAntilli

Quote:


> Originally Posted by *LoLomgbbq*
> 
> amd doesnt have a driver out for it? Though this is an amd game, so a bit odd. Maybe a vulkan patch will come along


If that was the reason we would expect the DX11 performance to be bad also...


----------



## LoLomgbbq

Quote:


> Originally Posted by *NightAntilli*
> 
> If that was the reason we would expect the DX11 performance to be bad also...


good point.


----------



## Ha-Nocri

We will have only DX11 at launch. No driver, DX12 is not ready... who knows.


----------



## jmcosta

Quote:


> Originally Posted by *BIGTom*
> 
> 
> 
> 
> 
> 
> Earlier today, GameGPU has posted performance Benchmarks for Deus Ex: Mankind Divided
> 
> GameGPU Deus Ex: Mankind Divided GPU test
> 
> DIRECTX 11
> 
> 
> 
> DIRECTX12


Horrible performance for a game that looks like it's running on DX9.

Ports...


----------



## Ha-Nocri

Quote:


> Originally Posted by *jmcosta*
> 
> horrible performance for a game that looks like its running on dx9


Yeah, don't put it on ultra. Developers meant it to be played on High settings. Ultra is just for e-peen


----------



## y2kcamaross

Quote:


> Originally Posted by *LoLomgbbq*
> 
> amd doesnt have a driver out for it? Though this is an amd game, so a bit odd. Maybe a vulkan patch will come along


A Vulkan patch for a DirectX game? ...Not likely.


----------



## NightAntilli

Quote:


> Originally Posted by *jmcosta*
> 
> horrible performance for a game that looks like its running on dx9
> 
> ports


You might not like the art style, but the game is definitely quite advanced graphically...


----------



## LoLomgbbq

Quote:


> Originally Posted by *y2kcamaross*
> 
> A Vulkan patch for a DirectX game?... Not likely.


Isnt what happened with bf4..(mantle)?
Quote:


> Originally Posted by *NightAntilli*
> 
> You might not like the art style, but the game is definitely quite advanced graphically...


----------



## y2kcamaross

Quote:


> Originally Posted by *LoLomgbbq*
> 
> Isnt what happened with bf4..(mantle)?


Pretty sure Mantle was being developed into Battlefield 4 before its release, not just a quick patch. Anyway, I highly doubt we'll ever see a game that offers DirectX 12 and Vulkan.


----------



## LoLomgbbq

Quote:


> Originally Posted by *y2kcamaross*
> 
> Pretty sure Mantle was being developed into Battlefield 4 before its release, not just a quick patch. Anyway, I highly doubt we'll ever see a game that offers DirectX 12 and Vulkan.


Also, Thief (Mantle), The Talos Principle (Vulkan and DX11), and Battlefield 4. So, not sure why you find it a strange thing.


----------



## warpuck

Quote:


> Originally Posted by *nakano2k1*
> 
> Source: Wikipedia
> Sure seems like it... What does NVidia make that ISN'T proprietary?


There ya go, pilgrim. As Bill from Redmond, WA said, Nvidia knows how to code and AMD doesn't. Well, sorta; it was in response to the code in a Windows version that stopped DR DOS from running Windows. I wonder, if VW had made code that made GM cars run into a ditch, would that be acceptable?


----------



## y2kcamaross

Quote:


> Originally Posted by *LoLomgbbq*
> 
> Also, Thief (Mantle), The Talos Principle (Vulkan and DX11), and Battlefield 4. So, not sure why you find it a strange thing.


The only game on that list that has DirectX and Vulkan also has OpenGL, which it had to have since it's also on Linux; hence the Vulkan patch makes more sense imo.


----------



## LoLomgbbq

Quote:


> Originally Posted by *y2kcamaross*
> 
> The only game on that list that has DirectX and Vulkan also has OpenGL, which it had to have since it's also on Linux; hence the Vulkan patch makes more sense imo.


What?

All of the games I listed are DX11 or 12. Patches for Mantle/Vulkan all came after launch.

What's so hard to grasp?


----------



## HaiderGill

Quote:


> Originally Posted by *FLCLimax*
> 
> As someone who played the original Deus Ex, these games make me sad.


I played the original too, and then Human Revolution. Poor graphics on the original, but the game/story was so good you just carried on. I played Human Revolution and it wasn't a patch on it in terms of story and gameplay. They need to get Warren Spector back on the case; the best thing they could do until then is come out with an HD remake...


----------



## Robenger

DX11 CPU





DX12 CPU





CPU Usage


----------



## stargate125645

When a good 27"+ 3440x1440 curved monitor with FreeSync is released, I will jump on that and get dual R9 490s at release. This will be the first game I play with the new hardware...I hope this game fully supports ultrawide resolutions!


----------



## NightAntilli

27" is very small for ultra wide... I thought it would be big enough but after seeing it in person, I'd definitely get a 34".

I likely can't run the game right now so, I'll get it and play it later. And I won't buy it for Xbox.


----------



## jmcosta

Quote:


> Originally Posted by *Robenger*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> DX11 CPU
> 
> 
> 
> 
> 
> DX12 CPU
> 
> 
> 
> 
> 
> CPU Usage


that's interesting, big differences with the driver
dx12 doesn't seem to be that efficient at least comparing to NV dx11


----------



## ZealotKi11er

Quote:


> Originally Posted by *Robenger*
> 
> DX11 CPU
> 
> 
> 
> 
> 
> DX12 CPU
> 
> 
> 
> 
> 
> CPU Usage


I have a feeling that if those CPU usage numbers are correct, this game was built for Zen marketing.


----------



## Robenger

Honestly, it feels like an incredible engineering feat. Hats off to their programmers. Really impressive to see an 8c/16t chip being utilized so much.


----------



## y2kcamaross

Quote:


> Originally Posted by *LoLomgbbq*
> 
> What?
> 
> All of the games I listed are DX11 or 12. Patches for Mantle/Vulkan all came after launch.
> 
> What's so hard to grasp?


none of the games you listed are DX12


----------



## Woundingchaney

These performance numbers are horrible, particularly for a title that doesnt "appear" to be pushing graphical boundaries.


----------



## Ha-Nocri

Horrible @ ultra settings. You can theoretically boost tessellation high enough to kill performance on any card; that doesn't mean you need it. If a developer puts in a useless option that hardly adds to image quality, simply tone it down.


----------



## stargate125645

Quote:


> Originally Posted by *NightAntilli*
> 
> 27" is very small for ultra wide... I thought it would be big enough but after seeing it in person, I'd definitely get a 34".
> 
> I likely can't run the game right now so, I'll get it and play it later. And I won't buy it for Xbox.


I was wondering about that myself. I'm just worried the pixel pitch at 30"+ would get too large.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Horrible @ ultra settings. You can theoretically boost tessellation high enough to kill performance on any card; that doesn't mean you need it. If a developer puts in a useless option that hardly adds to image quality, simply tone it down.


Tell that to Titan XP owners.


----------



## HowHardCanItBe

Quote:


> Originally Posted by *FLCLimax*
> 
> Such low performance for SLI and CF @ 1080p.


if it isn't obvious enough, CF and SLI these days are a waste of money.


----------



## boredgunner

Quote:


> Originally Posted by *HowHardCanItBe*
> 
> if it isn't obvious enough, CF and SLI these days are a waste of money.


Agreed.


----------



## sviru

I wonder if there will be VXAO support... It is a massive change, especially in a game like this.


----------



## NightAntilli

Considering VXAO is nVidia tech and this is a Gaming Evolved title, I doubt it will be supported.


----------



## ZealotKi11er

Quote:


> Originally Posted by *sviru*
> 
> I wonder if there will be VXAO support... It is a massive change, especially in a game like this.


Wasn't it supported in RoTR in DX11 only?


----------



## Xuper

DX11 , 1920x1080

GTX 980 32,41

RX 480 35,46

DX12 , 1920x1080

GTX 980 26,32

RX 480 30,38

Intel 5960X, all cores above 70%!!?


----------



## sviru

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wasn't it supported in RoTR in DX11 only?


DX12 - yes


----------



## fewness

Damn that cpu usage!

Are there any screen caps to confirm the built-in benchmark?


----------



## Aussiejuggalo

So err... how come those benchmarks are done in 720p and not 1080p?

On a side note, preloading now.


----------



## Gir

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So err... how come those benchmarks are done in 720p and not 1080p?
> 
> On a side note, preloading now.


I would imagine they're run at low res to make performance more CPU bound.


----------



## boredgunner

Quote:


> Originally Posted by *Gir*
> 
> I would imagine they're run at low res to make performance more CPU bound.


Yep, and they eventually do 1080p ones too. I understand the desire to show the most CPU bound scenario but real world scenarios are a bit more important.


----------



## LoLomgbbq

Quote:


> Originally Posted by *y2kcamaross*
> 
> none of the games you listed are DX12


Neither is Deus Ex: MD. It's not launching with DX12.

What's your point?


----------



## Arizonian

OP updated. Thanks to the members who contributed the info.


----------



## reqq

Quote:


> Originally Posted by *Arizonian*
> 
> OP updated. Thanks to the members who contributed the info.


So you're OK with stealing another webpage's hard work? You should only be allowed to link the site, not take and hotlink the images.


----------



## Klocek001

Quote:


> Originally Posted by *reqq*
> 
> So you're OK with stealing another webpage's hard work? You should only be allowed to link the site, not take and hotlink the images.


there's a watermark on every picture.

anyway, is this game moved to Feb 2017 ?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Klocek001*
> 
> anyway, is this game moved to Feb 2017 ?


If you're talking about the release date, it's the 23rd, this Tuesday... Steam preload started yesterday.


----------



## Klocek001

Ah okay, there's a mistake in the OP.


----------



## Basard

Quote:


> Originally Posted by *ImJJames*
> 
> Dual boot...its 2015 bro


I could see dual booting back in the late 90s or early 2000s.... But it's 2015, bro; we shouldn't have to anymore. Things should improve with time.


----------



## Biobalance

Tell me pls, how much does DX12 impact a game's graphics?


----------



## boredgunner

Quote:


> Originally Posted by *Biobalance*
> 
> Tell me pls, how much does DX12 impact a game's graphics?


No impact on any current DX12 game to my knowledge. DX11 was the big graphical leap, DX12/Vulkan are for optimization.


----------



## Klocek001

Vulkan is something totally unmatched when it comes to utilizing your CPU. The first thing to do in order to push the performance of our graphics cards to another level is to reduce the CPU holding them back to 0%. Using all possible cores/threads, along with the prospect of increasing core counts on mainstream CPUs in the foreseeable future, is great news for your GPU, no matter whether it's the latest or some older one. Each one will get a boost.


----------



## Pholostan

Quote:


> Originally Posted by *LoLomgbbq*
> 
> just like dx12


Yeah, I don't feel like Dx12 is adding much value for us gamers, really. Most devs (under the boot of the publisher), not all but most, seem to be catering to the lowest common denominator. Like it is some kind of great thing that the game will work in Dx12 on a GTX 680 from 2012. Why? I feel more and more like Dx12 is uninteresting for gamers now. Vulkan seems to be where it's at. id did it right with Doom. We should ask for proper Vulkan games. Dx12/11 is not very interesting. Give us Vulkan games already.

Problem is that we will probably mostly just get half-assed Dx12 games, and one or two proper Vulkan games. Sucks.

On Topic: The new Deus Ex might have been interesting from a tech standpoint if it had been Dx12. You can't really "patch in" Dx12; a game needs to be made from the ground up for it. Mankind Divided sure looks like a Dx11 game with some Dx12 bells and whistles taped onto it. Shame on the publisher, shame on AMD and Nvidia. I suspect MS has had a fat finger in it too, but probably it is the publisher who's the main villain. As per usual, Square Enix never fails to disappoint. Wanna bet when they'll announce DLC for the game? A couple of weeks?


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> Vulkan is something totally unmatched when it comes to utilizing your CPU. The first thing to do in order to push the performance of our graphics cards to another level is to reduce the CPU holding them *back to 0%.* Using all possible cores/threads, along with the prospect of increasing core counts on mainstream CPUs in the foreseeable future, is great news for your GPU, no matter whether it's the latest or some older one. Each one will get a boost.


I doubt you really want that


----------



## oxidized

Quote:


> Originally Posted by *NightAntilli*
> 
> 27" is very small for ultra wide... I thought it would be big enough but after seeing it in person, I'd definitely get a 34".


And gain what? Less PPI? PC monitors should never really go over 27" unless a very high resolution comes with them.


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> I doubt you really want that


I mean the CPU not bottlenecking the GPU at all in any scenario.


----------



## The Source

Quote:


> Originally Posted by *Glottis*
> 
> There seems to be massive performance difference between Ultra and High presets but hardly any picture quality difference.


There's this camp for every single game. Hardly any PQ difference between 1080p and 2160p either.









Quote:


> Originally Posted by *sviru*
> 
> DX12 - yes


VXAO was DX 11 only for Tomb Raider. I just finished playing it.

Quote:


> Originally Posted by *boredgunner*
> 
> No impact on any current DX12 game to my knowledge. DX11 was the big graphical leap, DX12/Vulkan are for optimization.


The idea behind this was to allow more complex scenery, more on-screen NPCs or whatever. Maybe we'll see something come of it with the console refresh. Doubt it though, since those seem to be VR focused.


----------



## boredgunner

Quote:


> Originally Posted by *The Source*
> 
> The idea behind this was to allow more complex scenery, more on-screen NPCs or whatever. Maybe we'll see something come of it with the console refresh. Doubt it though, since those seem to be VR focused.


Yeah but I think Ashes of the Singularity is the only game to showcase that. It'll be a while before real benefits of DX12/Vulkan become common.


----------



## bonami2

Quote:


> Originally Posted by *ImJJames*
> 
> Dual boot...its 2015 bro


Wut?

I've got a laptop with a downclocked i7-2720QM running BOINC with all 8 threads maxed out at 100%, and I'm watching movies and browsing without any lag in video or audio.

Same thing with all my other PCs.

Just set program priority, maybe.


----------



## PontiacGTX

Quote:


> Originally Posted by *NightAntilli*
> 
> Ok. That, I did not expect. Fury X beats 1070 in DX11 but loses significantly in DX12? What is this sorcery?


VRAM?


----------



## BradleyW

Quote:


> Originally Posted by *PontiacGTX*
> 
> VRAM?


The game uses about 4.7GB of VRAM @ 1080p with DX12, so yes, VRAM could be the issue. This game will push RX 480 sales.
Or perhaps the game will be better optimized once the DX12 patch is released in the wild, along with suitable AMD drivers.

Source


----------



## Klocek001

pcgh

http://www.pcgameshardware.de/Deus-Ex-Mankind-Divided-Spiel-55470/Specials/Benchmarks-Test-DirectX-12-1204575/

1440p dx12


----------



## Klocek001

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Easy. HW looks worse in Witcher 3 than the stock hair. TressFX looks way better in TR than the stock hair.


bleh


Spoiler: Warning: Spoiler!


----------



## Ha-Nocri

Quote:


> Originally Posted by *Klocek001*
> 
> bleh
> 
> 
> Spoiler: Warning: Spoiler!


You will go to the end of the world to defend NV, won't you

I remember when I had a GTX 580 and was playing the 1st Tomb Raider, I had to turn on TressFX, even tho it hit performance pretty hard, as it looked so awesome. PureHair/TressFX 2.0 looks better than HairWorks. In fact, I have no need to turn on HW while playing the Witcher.


----------



## Klocek001

Lol, saying that TressFX/PureHair looks better than HW in TW3 is like saying Vulkan improves core utilization in Far Cry 4.


----------



## oxidized

Quote:


> Originally Posted by *Ha-Nocri*
> 
> You will go to the end of the world to defend NV, won't you
> 
> I remember when I had a GTX 580 and was playing the 1st Tomb Raider, I had to turn on TressFX, even tho it hit performance pretty hard, as it looked so awesome. PureHair/TressFX 2.0 looks better than HairWorks. In fact, I have no need to turn on HW while playing the Witcher.


I remember the same, except the part about it looking awesome. Maybe it did look better, but it was some forced effect that, when disabled, gave you a great boost in performance.

Oh, and with this I'm not implying HairWorks looks better...


----------



## jmcosta

Both technologies have their ups and downs. HairWorks and TressFX seem to be equal in terms of simulation, but one thing is for sure: PureHair might have minimal performance impact, but quality-wise it suffers.
They optimized it to run on consoles; you can see triangles in the hair, and it isn't as flexible as before..


----------



## ZealotKi11er

The difference between TressFX in TR and HairWorks in Witcher 3 is that in TR, TressFX looked miles better than the stock hair, while in Witcher 3 the stock hair actually looked better.


----------



## huzzug

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The difference between TressFX in TR and HairWorks in Witcher 3 is that in TR, TressFX looked miles better than the stock hair, while in Witcher 3 the stock hair actually looked better.


I don't know, but I looked at HW on a 980 Ti when TW3 came out and liked the implementation. The griffin and the wolves especially had excellent implementations.


----------



## ZealotKi11er

Quote:


> Originally Posted by *huzzug*
> 
> I don't know, but I looked at HW on a 980 Ti when TW3 came out and liked the implementation. The griffin and the wolves especially had excellent implementations.


I am pretty sure it's not the same thing now. They changed how HW looked after the initial launch. You can clearly see CDPR had its own hair technology and HW was really only a money grab.


----------



## Assirra

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am pretty sure it's not the same thing now. They changed how HW looked after the initial launch. You can clearly see CDPR had its own hair technology and HW was really only a money grab.


Not sure about that, but what I do know is that while it looked like Geralt had a bunch of noodles on his head, the animals were absolutely gorgeous from the beginning, and you can see this in the Nvidia performance guide that came out the same day.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Assirra*
> 
> Not sure about that, but what I do know is that while it looked like Geralt had a bunch of noodles on his head, the animals were absolutely gorgeous from the beginning, and you can see this in the Nvidia performance guide that came out the same day.


I played Witcher 3 for 250 hours and animal hair is like 0.01% of the time you play the game.


----------



## TopicClocker

Wow, this game is super demanding. It's going to be good to see some comparisons between the graphical settings, and to get my hands on the game as well.


----------



## Unkzilla

Quote:


> Originally Posted by *TopicClocker*
> 
> Wow, this game is super demanding. It's going to be good to see some comparisons between the graphical settings, and to get my hands on the game as well.


I've watched a few videos on YouTube and there seems to be a massive performance hit from High > Ultra with very little gain in visuals.

I am hoping that my 1080 can hold up on High @ 4K at 50-60fps, but I've yet to see any benchmarks. I've got the game preloaded, so I'll soon find out.


----------



## VeritronX

Quote:


> Originally Posted by *LoLomgbbq*
> 
> That's not what I asked.
> 
> What has DX12 done so far in games?
> 
> AMD may be selling more cards, but that doesn't have an effect on DX12.


It's allowing Microsoft to bring us PC versions of their previously Xbox One-exclusive games, like Forza Horizon 3 and Forza Motorsport.. which is a huge thing for some people. It also seems to be reducing bottlenecks on existing hardware, which is always good.


----------



## chuy409

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I played Witcher 3 for 250 hours and animal hair is like 0.01% of the time you play the game.


I'd say otherwise. I always encounter wolves, and it uses HairWorks for that. Your horse's hair also. If you are using the Grandmaster Ursine armor, it has fur on its neck. Rise of the Tomb Raider, on the other hand, is literally only Lara's hair and some animals, and the animals are pretty rare to come by. And day-1 Witcher HW is WAAAY different than HW right now. It now only takes off a few frames, compared to release when it removed a solid 30fps.


----------



## VeritronX

Quote:


> Originally Posted by *chuy409*
> 
> It now only takes off a few frames, compared to release when it removed a solid 30fps.


Are you sure that's the case for all users? I'd be surprised if it didn't still take 30fps off a Kepler card; Maxwell and Pascal are a lot better at tessellation, which is why HairWorks was made in the first place.. to showcase that difference where nothing else on the market would.


----------



## gamervivek




----------



## jmcosta

Performance doesn't change much when tweaking options like cloth physics, tessellation, ambient occlusion, or parallax mapping; most of those settings make a 0.5-1fps difference. The ones that actually change performance are shadows, AA, and texture quality (relative to the amount of VRAM available; High textures seem to be stable for 2GB cards).
It's a terrible port; it requires a strong GPU, a 970 minimum, to get close to 60fps.

Graphical quality isn't that special, but the design is pretty good..

The first S.T.A.L.K.E.R. had better shadows/lighting (volumetric), reflections and parallax occlusion mapping than this lol


----------



## Noufel

Quote:


> Originally Posted by *jmcosta*
> 
> Performance doesn't change much when tweaking options like cloth physics, tessellation, ambient occlusion, or parallax mapping; most of those settings make a 0.5-1fps difference. The ones that actually change performance are shadows, AA, and texture quality (relative to the amount of VRAM available; High textures seem to be stable for 2GB cards).
> It's a terrible port; it requires a strong GPU, a 970 minimum, to get close to 60fps.
> 
> Graphical quality isn't that special, but the design is pretty good..
> 
> The first S.T.A.L.K.E.R. had better shadows/lighting (volumetric), reflections and parallax occlusion mapping than this lol


Is it really a console port?


----------



## jmcosta

Quote:


> Originally Posted by *Noufel*
> 
> Is it really a console port?


Yeah, in terms of quality/performance. Gameplay-wise it's great, and it's packed with tons of options to change the HUD, indicators, etc.
It feels more open, with more sidequests than the previous games so far..


----------



## Unkzilla

Tried to test performance before going to work, but there were still a few GB to download that weren't part of the preload. Seems that this rivals Quantum Break's performance on max settings; let's see if there is the same level of backlash.


----------



## Gabkicks

My fps is teetering between 45 and 55fps and I'm in a vent, basically D: I thought my GTX 1080 would have no problem with this game.


----------



## EniGma1987

So I just started the game up and found the menu really laggy. I turned on the Afterburner overlay and found my overclocked Titan X is getting ~42 fps with most things on "high" and texture quality set to very high....







Tweaking settings didn't do much. Going to download the game ready driver for it now and see if the performance changes drastically. Right now the performance is just completely unplayable in 4K.

Edit: well, Nvidia's newest game ready drivers for the game help a ton. From 42 fps to 61 fps, same settings... Pretty crazy jump from just 1 driver version.


----------



## Gabkicks

Going from 4x MSAA to off almost doubled my fps in the benchmark... so I guess this is how I'll play the game lol.


----------



## jmcosta

Quote:


> Originally Posted by *EniGma1987*
> 
> Edit: well, Nvidia's newest game ready drivers for the game help a ton. From 42 fps to 61 fps, same settings... Pretty crazy jump from just 1 driver version.


oh i m gonna try that


----------



## boredgunner

Most modern games, including this one, really need to stop including MSAA. It's worthless, since most of the aliasing comes from shaders, not geometry, and it's incompatible with the game's rendering techniques, which is why performance tanks. Everyone should just use TAA. This game's TAA blurs a lot, but the in-game sharpening counteracts that. Actually, this sharpening is too strong; you're better off using ReShade/SweetFX. I have the strength at around 0.8, which seems perfect.

Even with MSAA disabled it runs poorly with everything else maxed: 45-60 FPS on my main sig rig (2560 x 1440). I sacrificed shadows first, lowering shadow quality to High (noticeable degradation in quality) and Contact Hardening Shadows to just On (minimal visual difference). Now 55-75 FPS, which is at least acceptable on its own, although when you consider that numerous much better looking games run better... as far as optimization goes this is another typical AAA console port. No bueno.

The opening tutorial mission is long and reminds me a lot of Deus Ex: The Fall, or as I like to call it, The Fail, because that game blows. Not that there was anything really wrong with this mission in Mankind Divided, mind you; perhaps a bit too long, as it might become boring in future playthroughs. Eager to play more. Excellent amount of UI options; I am enjoying "Give me Deus Ex" mode with all the helpers disabled.


----------



## Unkzilla

From the very beginning of the game I seem to be averaging 35fps on my OC'd 1080, High settings @ 4K. Worst-performing game I have come across.

If the visuals justified this performance I might be OK with it, but I'm not seeing anything too special here... not too sure if I'll wait for the DX12 patch or just refund.


----------



## The Source

Anyone using SLI should give these bits a try: 0x2C0120F5 (Overwatch).

Improved performance and usage for me.

First hub area I'm seeing 45-60fps 4K all ultra, MSAA Off.

I think the game looks pretty good. NPC armor and augs look great.


----------



## Ha-Nocri

TPU performance analysis


----------



## daviejams

I'll buy this at the weekend , is there a driver out for it yet on AMD ?


----------



## Mahigan

Something tells me that once DX12 is ironed out and drivers are released by both AMD and nVIDIA... we might see the Fury X performing admirably.


----------



## Ha-Nocri

Quote:


> Originally Posted by *daviejams*
> 
> I'll buy this at the weekend , is there a driver out for it yet on AMD ?


I don't think so...


----------



## daviejams

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I don't think so...


Maybe they will just do one for the DX12 version. Perhaps it's best to wait for that patch, then buy it.


----------



## Unkzilla

Last update here until DX12 comes out. Have finished the first mission.

@ 4K, GTX 1080 @ 2050MHz:

Ultra settings: avg fps 27
High settings: avg fps 40
Medium with Very High textures: 55fps

Surprisingly, on the medium preset with VH textures the game still looks alright.. The main difference I have noticed vs ultra/high is with the shadows, but nothing to justify the performance difference. Luckily it has triple buffering, so there is no screen tearing under 60fps. The first mission was a lot of fun.

If DX12 can give a 10%+ boost, that will help a lot; the gameplay feels quite fluid in the 50s, but at 40fps it's not playable imo.


----------



## Mach 5

Quote:


> Originally Posted by *Valor958*
> 
> NOW I have a reason to upgrade my video card at least... and probably the rest so i'm DX12 ready. Given a few months of refining, I may even go Win10 to be ready for what's next.


Don't you need Win 10 for DX12?


----------



## Klocek001

Quote:


> Originally Posted by *Mach 5*
> 
> Don't you need Win 10 for DX12?


yessir
that's why Vulkan is 1000x better


----------



## Olivon

Benchmark scene is really weird and doesn't give a good indication of real gaming.

In 1440p, with a 1.5+GHz 980 Ti, the bench gives 42fps (Ultra, max FOV, 16x aniso, no MSAA), but in real gaming it's more like 60fps in Prague.
Same with 1080p: the bench gives 60fps, but in real gameplay it's more like 80fps in Prague.

Edit: In-game savegame benchmarks from PCGamesHardware show really different results than TPU's:

http://www.pcgameshardware.de/Deus-Ex-Mankind-Divided-Spiel-55470/Specials/Benchmarks-Test-DirectX-12-1204575/



My results from yesterday's session are very close to PCGH's real gaming numbers.


----------



## bossie2000

Is that an RX 480 beating a stock 1060, I guess?


----------



## zalbard

Quote:


> Originally Posted by *Mahigan*
> 
> Something tells me that once DX12 is ironed out and drivers are released by both AMD and nVIDIA... we might see the Fury X performing admirably.


The whole point of DX12 is to make the performance as driver-independent as possible.


----------



## Klocek001

it's gonna be funny if AMD's showcase dx12 game runs best on nvidia cards in dx11


----------



## daviejams

Quote:


> Originally Posted by *Klocek001*
> 
> it's gonna be funny if AMD's showcase dx12 game runs best on nvidia cards in dx11


That does not appear to be the case


----------



## Klocek001

Quote:


> Originally Posted by *daviejams*
> 
> That does not appear to be the case


I thought I'd seen DX12 performance numbers for this game, and they were pretty awful.


----------



## daviejams

Quote:


> Originally Posted by *Klocek001*
> 
> I thought I'd seen DX12 performance numbers for this game, and they were pretty awful.


DX12 is coming in an update

DX11 performance on Nvidia cards does not look great, so maybe this will be the DX12 game changer, like Doom's update to the Vulkan API.


----------



## Ha-Nocri

The 480 is actually faster than the 1060 on both sites, and here too. So everything seems to be OK.


----------



## Valor958

Quote:


> Originally Posted by *Mach 5*
> 
> Don't you need Win 10 for DX12?


That post was a long time ago lol. I've long since upgraded to W10. I still plan on getting a DX12 card, though.

I've tried Vulkan a few times and it gave me trouble, so I switched back to DX. I'll be sure to test both when I can and see how things stack up.


----------



## p4inkill3r

Quote:


> Originally Posted by *Klocek001*
> 
> it's gonna be funny if AMD's showcase dx12 game runs best on nvidia cards in dx11


Quote:


> In our graphics cards reviews, the GTX 1060 conclusively beat the RX 480 when looking at DirectX 11 titles, which Mankind Divided is until the DirectX 12 patch comes out in September. Here, we see a different picture, with the RX 480 being 10% faster than the GTX 1060 in all our tests, even at up to 20% in some cases. NVIDIA's previous-generation GTX 980 Ti is barely faster than the RX 480, which is much slower than the Fury X, even though the Fury X only has 4 GB of VRAM.


https://www.techpowerup.com/reviews/Performance_Analysis/Deus_Ex_Mankind_Divided/6.html


----------



## Klocek001

Sorry, I don't usually read the charts from the bottom up.


----------



## EniGma1987

Quote:


> Originally Posted by *Unkzilla*
> 
> Last update here until DX12 comes out. Have finished the first mission.
> 
> @ 4K, GTX 1080 @ 2050MHz:
> 
> Ultra settings: avg fps 27
> High settings: avg fps 40
> Medium with Very High textures: 55fps
> 
> Surprisingly, on the medium preset with VH textures the game still looks alright.. The main difference I have noticed vs ultra/high is with the shadows, but nothing to justify the performance difference. Luckily it has triple buffering, so there is no screen tearing under 60fps. The first mission was a lot of fun.
> 
> If DX12 can give a 10%+ boost, that will help a lot; the gameplay feels quite fluid in the 50s, but at 40fps it's not playable imo.


With a Titan XP @ 2050MHz (throttling down as low as 1750MHz in quite a few areas):

High settings overall, with Very High textures and LOD, and medium shadows: 60-63 fps average, with dips to the low 50s.
Shadows are the biggest performance hit.


----------



## Olivon

Quote:


> Originally Posted by *Ha-Nocri*
> 
> 480 is actually faster than 1060 on both sites and here too. So everything seems to be OK.


Maybe AMD paid for the integrated benchmark, I dunno, but the real gaming results are really different, so I don't think this is OK.


----------



## p4inkill3r

Quote:


> Originally Posted by *Olivon*
> 
> Maybe AMD paid for the integrated benchmark I dunno but real gaming results are really different so I don't think this is OK.


Shocking that you would think so.


----------



## Klocek001

The gaming results used the ultra preset with contact hardening shadows off and ambient occlusion turned down a notch.


----------



## poii

https://www.computerbase.de/2016-08/deus-ex-mankind-divided-benchmark/3/

ComputerBase ran a small test (only 8 GPUs).

On page 2 (the link goes to page 3) there is a small comparison of fps from Low to Ultra, and of CHS on/off/Ultra, with the 1060 and 480.

On page 1 they mention that Nixxes ported Deus Ex to PC and will add DX12; Nixxes also ported RoTR earlier.


----------



## NightAntilli

Porting to DX12 generally means the benefits of DX12 were already killed off. If we want to see the benefits of the new API, it must be done the other way around. Create on DX12, port to DX11. Leave out the stuff that doesn't work on DX11.

But no developer is willing to do that yet.


----------



## boredgunner

Quote:


> Originally Posted by *NightAntilli*
> 
> Porting to DX12 generally means the benefits of DX12 were already killed off. If we want to see the benefits of the new API, it must be done the other way around. Create on DX12, port to DX11. Leave out the stuff that doesn't work on DX11.
> 
> But no developer is willing to do that yet.


Yup, most people don't seem to understand that. So expect another ~10% performance boost for AMD when DX12 is released, and a performance penalty for NVIDIA.


----------



## Klocek001

porting from console to PC with as little work as possible seems to be the crème de la crème of modern PC gaming for some.


----------



## Hl86

Quote:


> Originally Posted by *boredgunner*
> 
> Yup, most people don't seem to understand that. So expect another ~10% performance boost for AMD when DX12 is released, and a performance penalty for NVIDIA.


And no multi gpu support.


----------



## jmcosta

Quote:


> Originally Posted by *Klocek001*
> 
> porting from console to PC with as little work as possible seems to be the crème de la crème of modern PC gaming for some.


"for some"


----------



## boredgunner

Quote:


> Originally Posted by *jmcosta*
> 
> "for some"


For most AAA studios. Yet so many PC gamers still only play and know about AAA games.


----------



## Ha-Nocri

Some real results 480 vs 1060. 480 is definitely winning in both built-in benchmark and in game:


----------



## Noufel

Quote:


> Originally Posted by *Olivon*
> 
> Maybe AMD paid for the integrated benchmark I dunno but real gaming results are really different so I don't think this is OK.


OMG, AMD paid for a benchmark, what a shame. Nvidia would never have done this, especially in TWIMTBP games where GameWorks gimps even older Nvidia GPUs.


----------



## Noufel

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Some real results 480 vs 1060. 480 is definitely winning in both built-in benchmark and in game:


Whut... I hope that Nvidia does something (I don't care what kind of thing) for the DX12 patch, because I don't want my 1080 to be spanked by a Fury.


----------



## vaseria

Quote:


> Originally Posted by *lacrossewacker*
> 
> Launching with TressFX 3.0 is nice.


so is launching with NVidia hairworks or gameworks or just tessellation all over the place!


----------



## HaiderGill

I realise there aren't many programmers on here, but don't people realise AMD's GCN architecture is parallel? If you go in and parallelise your code, it will run faster on GCN; if you stick to sequential code, then Nvidia is faster. DirectX 12 allows even more parallelisation. It's why AMD want to increase their footprint in the PC space, so that when stuff is ported from consoles the parallel nature is carried across. I had a good read from Cloud Imperium Games on how they took the parallelisation of code and operations and then utilised what they could from DX12 in DX11. If you stick to using DX12 in a sequential manner, you end up with RoTR, where Nvidia holds the advantage...
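The point about parallelisable work can be sketched in a toy way: if per-object work is independent, it can fan out across workers and be gathered deterministically. This is a hedged illustration only, not engine or driver code; `prepare_draw_call` and the scene layout are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_draw_call(obj):
    # Stand-in for per-object work (culling, constant setup, sort keys).
    # Each item is independent, which is exactly what makes it parallelisable.
    return {"id": obj["id"], "sort_key": obj["distance"] * 16 + obj["material"]}

def build_frame_sequential(objects):
    # One thread walks every object: the classic single-threaded submission loop.
    return [prepare_draw_call(o) for o in objects]

def build_frame_parallel(objects, workers=4):
    # The same independent work fanned out across workers; map() gathers
    # results in submission order, so the frame stays deterministic.
    # (In CPython, threads only overlap for I/O-bound work; the structure,
    # not the speedup, is the point here.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(prepare_draw_call, objects))

scene = [{"id": i, "distance": i % 7, "material": i % 3} for i in range(100)]
assert build_frame_sequential(scene) == build_frame_parallel(scene)
```

Either path produces the same frame; the parallel one just no longer serialises on a single thread.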


----------



## NightAntilli

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Some real results 480 vs 1060. 480 is definitely winning in both built-in benchmark and in game:


Call me crazy but, the colors on the RX 480 look a lot more vibrant compared to the 1060.


----------



## ToTheSun!

Quote:


> Originally Posted by *NightAntilli*
> 
> Call me crazy but, the colors on the RX 480 look a lot more vibrant compared to the 1060.


That's because they are.


----------



## FLCLimax

Quote:


> Originally Posted by *NightAntilli*
> 
> Call me crazy but, the colors on the RX 480 look a lot more vibrant compared to the 1060.


It's not uncommon for NVIDIA to sacrifice image quality to eke out an extra frame or two.

EDIT: Another game for which NV needs to issue a seizure warning.


----------



## huzzug

I also see the 1060 side hitching every now and again (e.g., after the benchmark, when he runs towards a car the movement on the 1060 is shaky). Anyone else notice, or am I dreaming?


----------



## Robenger

Quote:


> Originally Posted by *huzzug*
> 
> I also see the 1060 side hitching every now and again (eg, when after the benchmark, him running towards a car and the movement on the 1060 was shaky). Anyone else notice or am I dreaming


Ssshhhhh Nvidia has no problems, only AMD.


----------



## jmcosta

Quote:


> Originally Posted by *FLCLimax*
> 
> It's not uncommon for NVIDIA to sacrifice image quality to eke out an extra frame or two.
> 
> EDIT: Another game for which NV needs to issue a seizure warning.


they always had different gamma/contrast; image quality is the same.
Upload screenshots if you want: you will see the textures and AF are the same on both sides


----------



## FLCLimax

image quality is different in other games as well though, most notably BF4, DOOM, and AotS. It's not worth arguing; there will be more cases, as this goes back to the early 00s.


----------



## jmcosta

Quote:


> Originally Posted by *FLCLimax*
> 
> image quality is different in other games as well though, most notably BF4, DOOM and AotS. It's not worth arguing, there will be more cases, as this is ancient going back to the early 00's.


it depends on the setting you have in the driver; if it's set high by default, obviously it will have worse LOD, but apples to apples it's the same. Performance-wise it's also minor.


----------



## mouacyk

Quote:


> Originally Posted by *jmcosta*
> 
> it depends on the setting you have in the driver, if its set high by default obviously it will have worse LOD but apples to apples its the same, also performance wise its minor


Performance wise in this family of graphics engines, NVidia is pretty manure.


----------



## iTurn

Quote:


> Originally Posted by *huzzug*
> 
> I also see the 1060 side hitching every now and again (eg, when after the benchmark, him running towards a car and the movement on the 1060 was shaky). Anyone else notice or am I dreaming


Known issue that the online news sites won't talk about... (typical AMD bashing publications)

https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/

https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/

Shoot here's a screen capture of the front page of the 10 series forum on Nvidia's official forums.


----------



## Robenger

Quote:


> Originally Posted by *iTurn*
> 
> Known issue that the online news sites won't talk about... (typical AMD bashing publications)
> 
> https://forums.geforce.com/default/topic/954602/geforce-1000-series/titan-x-pascal-high-dpc-latency-and-stuttering-/1/
> 
> https://forums.geforce.com/default/topic/941579/geforce-1000-series/gtx-1080-high-dpc-latency-and-stuttering/
> 
> Shoot here's a screen capture of the front page of the 10 series forum on Nvidia's official forums.


This is all in your mind.


----------



## iTurn

Quote:


> Originally Posted by *Robenger*
> 
> This is all in your mind.


LOL!


----------



## EniGma1987

What I found really interesting in the video was seeing a direct comparison between AMD and Nvidia memory compression. Everyone talks about how advanced Nvidia's compression is, but AMD was supposed to have greatly improved their compression, and it was claimed to be better. Now we see that 9/10 times in the test AMD does in fact show much better memory compression:


----------



## jmcosta

the memory compression thing isn't about lowering memory use, it's about using less bandwidth

idk which is smoother; Polaris seems to perform better with good CPUs, but it stutters as well in the video

it's not good proof of that latency issue, which I think they fixed a few drivers ago.


----------



## Robenger

Quote:


> Originally Posted by *jmcosta*
> 
> the memory compression thing isn't to lower the cache use but less bandwidth
> 
> idk which is smoother, the polaris seem to perform better but it also stutter in the video
> its not a good proof of that latency issue, which i think they fixed in a few drivers ago.


Except it's not fixed; check two posts up. Screenshot from the GeForce forums.


----------



## NightAntilli

Quote:


> Originally Posted by *EniGma1987*
> 
> What I found really interesting in the video was actually seeing a direct comparison between AMD and Nvidia memory compression. Everyone is talking how Nvidia compression is way advanced, but AMD was supposed to have greatly improved their compression and was claimed to be better. Now we see that 9/10 times in the test AMD does in fact have much better memory compression:


I'm not sure you can draw that conclusion though. Games often change VRAM use depending on how much is available. So the GTX 1060 might simply be using more VRAM because there's 6GB available rather than 4GB.

Unless he's using an 8GB RX 480, then disregard everything I said.


----------



## boredgunner

Quote:


> Originally Posted by *jmcosta*
> 
> it depends on the setting you have in the driver, if its set high by default obviously it will have worse LOD but apples to apples its the same, also performance wise its minor


Yep, people just don't bother tweaking those settings or even know that you can.


----------



## Mahigan

Quote:


> Originally Posted by *NightAntilli*
> 
> Call me crazy but, the colors on the RX 480 look a lot more vibrant compared to the 1060.


Yep... AMD use a warmer color palette than nVIDIA by default. You can tweak this in the settings on both GPUs though.

Interesting to see that nVIDIA have higher CPU usage than AMD in this game while it is still DX11. It seems the Deus Ex development team took AMD's advice and multi-threaded their game engine. This effectively helps AMD in that CPU thread 0 can concentrate on feeding the GPU commands while the other CPU threads are busy with complex simulations, AI, etc.

This leads to a DX11 game which behaves somewhat like a DX12 game in terms of multi-threaded performance and CPU utilization. nVIDIA end up with higher CPU usage (as we see under Vulkan and DX12 titles) because their software scheduling takes up extra CPU cycles, while AMD spread the load across many CPU cores and thus end up with lower CPU usage.

Interesting.

In other words, I do not think that Deus Ex uses deferred rendering or multi-threaded command lists. I think the game engine is, instead, simply threaded.

This is just a hunch.
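The split Mahigan describes (worker threads running simulation while one thread feeds the GPU) is essentially a producer/consumer pattern. Below is a minimal toy sketch of that shape, with a queue standing in for the command stream; the entity counts and the `("draw", id)` command format are invented for the example.

```python
import queue
import threading

# Worker threads run "simulation" and emit GPU commands; one dedicated
# submission thread (the "thread 0" above) drains them in order.
commands = queue.Queue()
submitted = []

def simulate(entity_ids):
    for e in entity_ids:
        # stand-in for AI/physics producing a draw for this entity
        commands.put(("draw", e))

def submit():
    while True:
        cmd = commands.get()
        if cmd is None:          # sentinel: all producers finished
            break
        submitted.append(cmd)    # stand-in for a driver submission

submitter = threading.Thread(target=submit)
submitter.start()
workers = [threading.Thread(target=simulate, args=(range(i * 10, (i + 1) * 10),))
           for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
commands.put(None)
submitter.join()
assert len(submitted) == 40
```

The submission thread never waits on simulation work, which is the property that keeps the GPU fed.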


----------



## NightAntilli

We'll know more when the DX12 patch releases I guess, and hopefully it means performance gains rather than losses.


----------



## boredgunner

Quote:


> Originally Posted by *Mahigan*
> 
> In other words, I do not think that Deus Ex uses deferred rendering or multi-threaded command lists. I think the game engine is, instead, simply threaded.
> 
> This is just a hunch.


The MSAA performance suggests it does use deferred rendering. I don't see any reason why MSAA would perform terribly otherwise.


----------



## killerhz

Quote:


> Originally Posted by *NightAntilli*
> 
> We'll know more when the DX12 patch releases I guess, and hopefully it means performance gains rather than losses.


yeah i am waiting for the DX12 patch to hit and scope out the performance before i buy it...


----------



## ZealotKi11er

Stopped using MSAA since BF3. Only thing I use is the Crysis 3 AA and for BF4 I use Resolution Scaling.


----------



## Unkzilla

Quote:


> Originally Posted by *EniGma1987*
> 
> With a Titan XP @ 2050MHz (throttling down as low as 1750MHz in quite a few areas):
> 
> high settings overall, with Very High textures and LOD, using medium shadows: 60-63 fps average with dips to low 50s.
> Shadows are the biggest performance hit.


That's good to know. I might try setting to high and just reducing shadows.

I also tried dropping to 3200x1800; I'm playing on a 65" TV and it seems my screen handles this resolution OK. With that said, there seems to be a massive jump in performance which doesn't correlate with the reduction in pixel count... hopefully the 4K performance can be improved via drivers/DX12.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Stopped using MSAA since BF3. Only thing I use is the Crysis 3 AA and for BF4 I use Resolution Scaling.


Frostbite 2 and 3 are actually an exception to the rule. Their MSAA has very little impact on performance and above average effectiveness. I guess they don't use deferred rendering. Built in downsampling is great too. Frostbite 3 is a great engine for shooters unlike Dawn engine.


----------



## sticks435

Quote:


> Originally Posted by *boredgunner*
> 
> Frostbite 2 and 3 are actually an exception to the rule. Their MSAA has very little impact on performance and above average effectiveness. I guess they don't use deferred rendering. Built in downsampling is great too. Frostbite 3 is a great engine for shooters unlike Dawn engine.


Except in Dragon Age turning on MSAA would destroy your framerate and you could only decrease the resolution scale, not increase it above your native res.


----------



## boredgunner

Quote:


> Originally Posted by *sticks435*
> 
> Except in Dragon Age turning on MSAA would destroy your framerate and you could only decrease the resolution scale, not increase it above your native res.


I wouldn't say destroy, but sure it was noticeable. I was able to max out the game with a GTX 780 Ti at 1440p and only rarely would it go under 50 FPS. The game looks great but the optimization is nowhere near DICE's own works on their engine. Probably has something to do with the game being originally made on Frostbite 2.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> I wouldn't say destroy, but sure it was noticeable. I was able to max out the game with a GTX 780 Ti at 1440p and only rarely would it go under 50 FPS. The game looks great but the optimization is nowhere near DICE's own works on their engine. Probably has something to do with the game being originally made on Frostbite 2.


MSAA was pointless for BF3/4. For example in BF3 MSAA had 0 impact for me because I was CPU limited and would just utilize the GPUs better. In BF4 the same thing but Resolution scaling is infinitely better.


----------



## sticks435

Quote:


> Originally Posted by *boredgunner*
> 
> I wouldn't say destroy, but sure it was noticeable. I was able to max out the game with a GTX 780 Ti at 1440p and only rarely would it go under 50 FPS. The game looks great but the optimization is nowhere near DICE's own works on their engine. Probably has something to do with the game being originally made on Frostbite 2.


Hmmmm, with every setting maxed with my 980Ti at DSR 1440P, I would still get pretty regular dips down into the 50's and sometimes even mid-low 40's. Might have been my 2500K struggling to feed it fast enough though.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> MSAA was pointless for BF3/4. For example in BF3 MSAA had 0 impact for me because I was CPU limited and would just utilize the GPUs better. In BF4 the same thing but Resolution scaling is infinitely better.


Downsampling to a certain point can be more helpful but more demanding. Using both would yield best results. MSAA is far from pointless in those games, it reduces aliasing considerably even if you might not notice (most gamers still don't seem to notice aliasing).


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> Downsampling to a certain point can be more helpful but more demanding. Using both would yield best results. MSAA is far from pointless in those games, it reduces aliasing considerably even if you might not notice (most gamers still don't seem to notice aliasing).


Only some games need AA to begin with. Games like GTA V need AA. I personally find that games just look a lot sharper without AA even if lines are not straight.


----------



## Klocek001

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only some games need AA to begin with. Games like GTA V need AA. I personally find that games just look a lot sharper without AA even if lines are not straight.


that is true with fxaa, it makes everything blurry as hell. smaa is not bad, but still a little blurred. msaa is sharp enough.

still, I personally use txaa when I can, looks most epic.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only some games need AA to begin with. Games like GTA V need AA. I personally find that games just look a lot sharper without AA even if lines are not straight.


Couldn't disagree more. Games are more aliased now than ever; all this shader aliasing spawned from more complex shaders. As a result, games like The Witcher 3 and Assassin's Creed: Unity look like crap at anything less than 5120 x 2880. Everything is a blob even at 2560 x 1440, and MSAA doesn't help at all. TAA, or better yet SSAA, is needed. Or just more pixels... a lot more. 5K or more.

Not all AA blurs, and post-sharpening exists to counter the kinds that do. TAA sometimes adds a negligible amount of blur that isn't actually noticeable, but this game is an obvious exception: its TAA blurs everything to hell. I use SweetFX sharpening via ReShade, since the in-game sharpening is too strong and makes aliasing more visible.
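For reference, SSAA is just rendering at a higher resolution and box-filtering down, which is why it catches shader aliasing that MSAA (which only supersamples geometry edges) misses. A toy grayscale sketch, with made-up pixel values:

```python
def ssaa_downsample(image, factor=2):
    # Average each factor x factor block of the supersampled image
    # (a box filter), producing one native-resolution pixel per block.
    h, w = len(image) // factor, len(image[0]) // factor
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [image[y * factor + dy][x * factor + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black-to-white edge rendered at 2x native resolution...
hi_res = [[0, 255, 255, 255],
          [0, 255, 255, 255]]
# ...comes out softened at native resolution: the edge pixel is blended.
assert ssaa_downsample(hi_res) == [[127.5, 255.0]]
```

Every sample in the high-resolution image contributes, shaded pixels included, which is also why SSAA costs roughly factor² in fill rate.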


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> Couldn't disagree more. Games are a lot more aliased now than ever, all this shader aliasing spawned from more complex shaders. As a result these games look like crap at anything less than 5120 x 2880, games like The Witcher 3 and Assassin's Creed: Unity. Everything is a blob even at 2560 x 1440. And MSAA doesn't help at all. TAA or better yet SSAA is needed. Or just more pixels... a lot more. 5k or more.
> 
> Not all AA blurs. Post-sharpening exists to counter those that do. TAA sometimes adds a negligible amount of blur that isn't actually noticeable, but this game is an obvious exception. TAA blurs it to hell. I use SweetFX sharpening via ReShade since in-game sharpening is too sharp and makes aliasing more visible.


I played Witcher 3 with no AA and it was never noticeable. Played RoTR with no AA and looked a lot sharper or use the tank your fps SSAA.


----------



## Robenger

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I played Witcher 3 with no AA and it was never noticeable. Played RoTR with no AA and looked a lot sharper or use the tank your fps SSAA.


At what resolution though. Seems like 1080p you will certainly notice no AA. Side note I still love using SMAA as it's almost performace free.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Robenger*
> 
> At what resolution though. Seems like 1080p you will certainly notice no AA. Side note I still love using SMAA as it's almost performace free.


4K. 1080p and 1440p is for plebeians.


----------



## Robenger

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 4K. 1080p and 1440p is for plebeians.


----------



## Hl86

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 4K. 1080p and 1440p is for plebeians.


+Rep


----------



## ZealotKi11er

Quote:


> Originally Posted by *Robenger*


If I stayed with 1440p it would mean I would not even think about upgrading GPUs. 90% of games I play run 4K with single 290X.


----------



## EniGma1987

Quote:


> Originally Posted by *boredgunner*
> 
> Couldn't disagree more. Games are a lot more aliased now than ever, all this shader aliasing spawned from more complex shaders. As a result these games look like crap at anything less than 5120 x 2880, games like The Witcher 3 and Assassin's Creed: Unity. Everything is a blob even at 2560 x 1440. And MSAA doesn't help at all. TAA or better yet SSAA is needed. Or just more pixels... a lot more. 5k or more.
> 
> Not all AA blurs. Post-sharpening exists to counter those that do. TAA sometimes adds a negligible amount of blur that isn't actually noticeable, but this game is an obvious exception. TAA blurs it to hell. I use SweetFX sharpening via ReShade since in-game sharpening is too sharp and makes aliasing more visible.


I thought Temporal Anti-Aliasing (TAA) was more about objects in motion, helping to fight aliasing across frames so that things appear to move more smoothly and correctly?


----------



## mouacyk

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 4K. 1080p and 1440p is for plebeians.


4K@60Hz is also for plebs. 1440p@144Hz and 165Hz are where it's at. 4K@120Hz unfortunately doesn't exist.


----------



## stahlhart

Quote:


> Originally Posted by *mouacyk*
> 
> 4K@60Hz is also for plebs. 1440p@144Hz and 165Hz are where it's at. 4K@120Hz unfortunately doesn't exist.


REP+


----------



## boredgunner

Quote:


> Originally Posted by *EniGma1987*
> 
> I thought Temporal Anti-Aliasing (TAA) was more about objects in motion, helping to fight aliasing across frames so that things appear to move more smoothly and correctly?


It is, but it's a combination of various things. NVIDIA has compared its quality (on geometry I believe) to MSAA in screenshots, and it's a lot more effective in general than MLAA/SMAA/FXAA.
Quote:


> Originally Posted by *mouacyk*
> 
> 4K@120Hz unfortunately doesn't exist.


Soon.
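The temporal accumulation that boredgunner and EniGma1987 are discussing can be sketched as an exponential blend of each jittered frame into a history buffer. This is a single-pixel toy under stated assumptions (no motion vectors or neighbourhood clamping, which real TAA needs), with made-up sample values:

```python
def taa_accumulate(history, current, alpha=0.1):
    # Blend the current jittered frame into the history buffer.
    # Over time each pixel converges toward the average of its samples.
    return [h * (1 - alpha) + c * alpha for h, c in zip(history, current)]

# A pixel on a thin edge: alternating jitter samples hit 1.0 then 0.0,
# so the true coverage is 0.5. The history settles near that value.
history = [0.0]
for i in range(200):
    sample = [1.0] if i % 2 == 0 else [0.0]
    history = taa_accumulate(history, sample)
assert abs(history[0] - 0.5) < 0.06
```

The same blend is also what introduces TAA's characteristic blur: the history carries a little of every past frame, which is why games pair it with a sharpening pass.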


----------



## Tobiman

Quote:


> Originally Posted by *boredgunner*
> 
> It is, but it's a combination of various things. NVIDIA has compared its quality (on geometry I believe) to MSAA in screenshots, and it's a lot more effective in general than MLAA/SMAA/FXAA.
> Soon.


™


----------



## Klocek001

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 4K. 1080p and 1440p is for plebeians.


lol, enjoying spending $4.175K to play at 30 fps?
30/60 fps @ 60Hz is for plebeians; 100+ fps @ 100Hz+ is where it's at.


----------



## Xuvial

Quote:


> Originally Posted by *mouacyk*
> 
> 4K@120Hz unfortunately doesn't exist.


It exists, and it's an OLED.

We're just waiting for it to release.


----------



## ToTheSun!

Quote:


> Originally Posted by *Xuvial*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mouacyk*
> 
> 4K@120Hz unfortunately doesn't exist.
> 
> 
> 
> It exists, and it's an OLED.
> 
> We're just waiting for it to release.
Click to expand...

Pretty sure SC will release first.


----------



## Xuvial

Quote:


> Originally Posted by *ToTheSun!*
> 
> SC


SC?

edit: oh Star Citizen. Haha.


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> Pretty sure SC will release first.


Black Mesa will finish first, S.T.A.L.K.E.R. Lost Alpha Director's Cut and Misery 2.2 will finish first, S.T.A.L.K.E.R. 2 will hit early access first, Half-Life 3 will be released first.


----------



## BradleyW

Quote:


> Originally Posted by *mouacyk*
> 
> 4K@60Hz is also for plebs. 1440p@144Hz and 165Hz are where it's at. 4K@120Hz unfortunately doesn't exist.


720p @ 24 Hz 4-LIFE!


----------



## The Source

Quote:


> Originally Posted by *Xuvial*
> 
> It exists, and it's an OLED.
> 
> We're just waiting for it to release.


And I'm waiting for the GPU horsepower to run it. It's always a compromise. Image quality vs fps. Plebery has nothing to do with it, despite what those less fortunate choose to vocalize.


----------



## Klocek001

GPU-bound scene: I guess AMD is doing extremely well. The 1080 is only 15% faster than the Fury X, and the Fury beats the reference 980 Ti/1070.



CPU-bound scene: the 1080 is now 47% faster on a 4790K and 55% faster on a 4690K.



DX11 much, AMD?

The reviewer said DX12 is unplayable as of right now: crashes and fps dips all over the place.


----------



## huzzug

The DX12 patch is not available for Deus Ex yet. Maybe that is the cause.


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> The DX12 patch is not available for DeusEx. Maybe that is the cause


I'm inclined to believe the cause is how awful AMD cards are in DX11, not the absence of the DX12 patch.


----------



## huzzug

Well, nothing in that graph screams awful DX11 AMD cards, but good b8 m8. I was talking more about you saying that the game crashes and fps dip in DX12.


----------



## Newbie2009

So how is the game? I am holding out for a week or 2 for some patches and drivers.


----------



## Klocek001

lol, it's completely fantastic when the Fury X comes within ~15% of the 1080, but something's wrong when the difference between these cards grows to as much as 55% on a 4.5GHz 4690K in other parts of the game. But I guess this is normal for you.



yeah, dx12 is a mess from top to bottom.


----------



## daviejams

Quote:


> Originally Posted by *Newbie2009*
> 
> So how is the game? I am holding out for a week or 2 for some patches and drivers.


Bought it yesterday. It's pretty similar to the last one, just bigger, looks better, and has no boss battles, so if you enjoyed the last one then yeah, you should get it.

As usual, I will end up finishing it and they will bring out the DX12 patch the next day. It runs well enough on my Skylake i5 and 290X on the High setting; higher settings, not so much.


----------



## Ha-Nocri

TweakTown did review the game, with an AIB 480 (finally someone):



It's a big performance advantage over the 1060: 24%. DX12 should increase it even more once the CPU is no longer the bottleneck.


----------



## The Source

DX12 patch isn't due until Sept 5th. So any reviews thus far are irrelevant.


----------



## Klocek001

Quote:


> Originally Posted by *Ha-Nocri*
> 
> TweakTown did review the game, with an AIB 480 (finally someone):
> 
> 
> 
> It's a big performance advantage over 1060, 24%. DX12 should increase it even more when *CPU is no more bottleneck*


but they clearly forgot that AIB 1060s exist, and those would most certainly be able to hold 30 fps minimums, unlike the 480, which can't do that even on a 5960X (a CPU bottleneck on a 5960X? Is a $1000 CPU not good enough for your $200 card?)
Those fps across the whole AMD lineup are absolutely horrid; I wonder what sort of performance we would see on a mainstream CPU like a Haswell i5. Down to single-digit minimum fps on AMD?


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ha-Nocri*
> 
> TweakTown did review the game, with an AIB 480 (finally someone):
> 
> 
> 
> It's a big performance advantage over 1060, 24%. DX12 should increase it even more when *CPU is no more bottleneck*
> 
> 
> 
> but they clearly forgot that AIB 1060s exist, and those would most certainly be able to hold 30 fps minimums, unlike the 480, which can't do that even on a 5960X (a CPU bottleneck on a 5960X? Is a $1000 CPU not good enough for your $200 card?)
> Those fps across the whole AMD lineup are absolutely horrid; I wonder what sort of performance we would see on a mainstream CPU like a Haswell i5. Down to single-digit minimum fps on AMD?
Click to expand...

That's something that I can agree with. Wonder what happens to their promise of affordable VR for everyone


----------



## Ha-Nocri

Well, the 1060 FE boosts quite high anyway. The performance difference between the reference 480 and an AIB one is much bigger.

Also, it's probably about core speed, so I would guess an i5 wouldn't be too bad.

But DX12 is not far away anyway....


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> That's something that I can agree with. Wonder what happens to their promise of affordable VR for everyone


well they said everyone would be able to play it, not that it would be a good experience.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, 1060 FE boosts quite high anyways. Performance difference between reference 480 and AIB one is much bigger.
> 
> Also, it's probably about core speed, so I would guess an i5 wouldn't be too bad.
> 
> But DX12 is not far away anyways....


still, I'm inclined to say that a 5960X should be enough for AMD cards not to drop into the 20s, and finding an excuse in the game not being DX12 yet is sort of lame


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> That's something that I can agree with. Wonder what happens to their promise of affordable VR for everyone
> 
> 
> 
> well they said everyone would be able to play it, not that it would be a good experience.
Click to expand...

I'd need to keep a bucket and cloth handy for an extreme gaming session. Don't want to have to go to toilet to throw up.


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> I'd need to keep a bucket and cloth handy for an extreme gaming session. Don't want to have to go to toilet to throw up.











think what all this purging would do for your body tho. you'd be like a young Adonis after finishing a game with expansion.


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> I'd need to keep a bucket and cloth handy for an extreme gaming session. Don't want to have to go to toilet to throw up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> think what all this purging would do for your body tho. you'd be like a young Adonis after finishing a game with expansion.
Click to expand...

I'd be size 0 and be on America's Next Top Model. I'm sure it would get gaming more attention from the media


----------



## alawadhi3000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, 1060 FE boosts quite high anyways. *Performance difference between reference 480 and AIB one is much bigger.*
> 
> Also, it's probably about core speed, so I would guess an i5 wouldn't be too bad.
> 
> But DX12 is not far away anyways....


No, it's not.

The MSI RX 480 Gaming X @ 1370/2250 improved by 12.2% over the reference RX 480.



The MSI GTX 1060 Gaming X @ 2139/2435 improved by 18.2% over the reference GTX 1060.




Even though the GTX 1060 sometimes boosts to ~1900MHz out of the box, you can still get at least an extra 200MHz out of it, making it a better overclocker than the AMD RX 480.
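Those overclock-gain figures are just ratios of average fps. A tiny helper makes the arithmetic explicit; the fps values below are hypothetical, chosen only to reproduce the quoted percentages:

```python
def pct_gain(oc_fps, ref_fps):
    # Percentage improvement of an overclocked result over the reference.
    return (oc_fps / ref_fps - 1.0) * 100.0

# Hypothetical fps pairs that yield the 12.2% and 18.2% gains quoted above.
assert round(pct_gain(56.1, 50.0), 1) == 12.2   # RX 480 Gaming X vs reference
assert round(pct_gain(59.1, 50.0), 1) == 18.2   # GTX 1060 Gaming X vs reference
```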


----------



## Lass3

Quote:


> Originally Posted by *Ha-Nocri*
> 
> TweakTown did review the game, with an AIB 480 (finally someone):
> 
> 
> 
> It's a big performance advantage over 1060, 24%. DX12 should increase it even more when CPU is no more bottleneck


A custom 480 vs a reference/Founders 1060, and even then the minimum fps is higher on the 1060. Look at AMD's minimums in general.. Meh..

OC3D.net tested the 1060 vs the 480, both custom, and the 1060 won most games comfortably. Also lower power use and heat output = less noise.

http://overclock3d.net/reviews/gpu_displays/msi_rx_480_and_gtx_1060_gaming_x_review/1

The 1060 also scales better with OC. It's close to a reference 980 Ti when fully clocked.

Not impressed by the RX 480 tbh.. Looking forward to Vega tho. Might pick up a Vega if it's any good.


----------



## Klocek001

Quote:


> Originally Posted by *Lass3*
> 
> Custom 480 vs Reference / Founders 1060, and even then, the minimum fps is higher on 1060.. Looks at AMD's minimums in general.. Meh..
> 
> OC3D.net tested 1060 vs 480, both custom, and 1060 won most games comfortably. Also less watt usage and heat output = Less noisy.
> 
> http://overclock3d.net/reviews/gpu_displays/msi_rx_480_and_gtx_1060_gaming_x_review/1
> 
> 1060 also scales better with OC. It's close to 980 Ti reference when fully clocked.
> 
> Not impressed by RX 480 tbh.. Looking forward to Vega tho. Might pick up a Vega if it's any good.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> GPU bound scene, I guess AMD is doing extremely well. 1080 only 15% faster than Fury X, Fury beating reference 980Ti/1070
> 
> 
> 
> CPU bound scene. 1080 is now 47% faster on 4790K and 55% faster on 4690K
> 
> 
> 
> DX11 much, AMD ?
> 
> The reviewer said dx12 is unplayable as of right now, crashes & fps dips all over the place.


So if DX12 is implemented correctly and they add support for ASync then we can see AMD doing even better?


----------



## Klocek001

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So if DX12 is implemented correctly and they add support for ASync then we can see AMD doing even better?


Not up to me to say.
The reviewer said there was no gain for AMD cards in DX12 mode, although it was just some sort of press-release DX12.

Btw, what I posted was only one part of the review. The 1070 beats it in the other charts; the one I posted was an exception.

Still, it looks like what AMD promised will pay off once the DX12 patch is out (if it doesn't turn out to be botched).


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> Not up to me to say.
> The reviewer said there was no gain for AMD cards in DX12 mode, although it was just some sort of press-release DX12.
> 
> btw what I posted was only one of the parts of the review. 1070 beats it in other charts, the one I posted was an exception.
> 
> 
> 
> still, looks like what amd promised will pay off once dx12 patch is out (and it doesn't turn out to be botched)


Well, the difference is that AMD wins clearly in the benchmark, but in-game they are a lot closer, which is probably a CPU thing.


----------



## BradleyW

I'm having some performance issues with Deus Ex. It's stuttering frequently and the fps keeps randomly dropping, even when I'm standing still. The mouse is horrible: one minute it's responsive, the next it's lagging/delayed.

RTS (Latest)
Win 10
290X
High
1080p Ultrawide
MSAA OFF

FPS around 35 to 45.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I'm having some performance issues with Deus. It's stuttering frequently and the fps keeps randomly dropping, even if I'm standing still. The mouse is horrible. One minute it's responsive, the next it's lagging/delayed.
> 
> RTS (Latest)
> Win 10
> 290X
> High
> 1080p Ultrawide
> MSAA OFF
> 
> FPS around 35 to 45.


Wish I could test it but I did not like the first game and not going to buy this one.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wish I could test it but I did not like the first game and not going to buy this one.


I wonder, by "first game" do you mean the original Deus Ex, or Human Revolution, which was the third?


----------



## sumitlian

The GTX 1070 is only 17% and 19% faster than the RX 480 8GB at 1080p and 1440p respectively.
Nvidia should have supported DirectX 11 compute features.





http://www.legitreviews.com/deus-ex-mankind-divided-dx11-video-card-benchmarks_185666


----------



## sumitlian

And in here Guru3D,





http://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,7.html


----------



## Klocek001

How many AMD guys will it take to correct this before you realize that what you're posting are results from the built-in benchmark, where the Fury X edges out the 1070? The 1070 actually wins in tests done during gameplay (the ones I posted). The 1070 FE wins by a hair, while AIB 1070s gain a substantially bigger lead.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> how many amd guys will it take to correct before you realize that what you're posting are results from the built-in benchmark, where fury x edges out 1070. 1070 is actually winning in tests done during gameplay (the ones I posted). 1070FE wins by a hair while AIB 1070 gain substantially more lead.


Like I said before, the benchmark tests the GPU more, while the in-game test exposes AMD's DX11 CPU overhead. Nothing new here. I just don't want people to bother bringing it up, considering this game will get DX12 and somehow people will be surprised by the gains AMD gets.


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> how many amd guys will it take to correct before you realize that what you're posting are results from the built-in benchmark, where fury x edges out 1070. 1070 is actually winning in tests done during gameplay (the ones I posted). 1070FE wins by a hair while AIB 1070 gain substantially more lead.


Don't worry, it's not great for Nvidia either in many scenarios. Unlike Nvidia, AMD GPUs haven't shown negative DX12 scaling, except perhaps in GameWorks titles. The upcoming official DX12 support will probably close the remaining gap; we might even see better scaling from AMD in GPU-limited scenes as well.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Like I said before that the benchmarks test the GPU more while the ingame test exposes AMD DX11 CPU overhead. Nothing new here. I just do not want people to really bother bringing that up considering this game will get DX12 and somehow people will be surprised why AMD gets the gains it does.


Pretty much this.


----------



## Diffident

I'm wondering if they are also working on a Vulkan release, since according to SteamDB.info, Linux is listed as an OS type.
Quote:


> launch/9/config/oslist: linux
> launch/9/executable: DeusExMD.sh


----------



## sumitlian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Like I said before that the benchmarks test the GPU more while the ingame test exposes *AMD DX11 CPU overhead*. Nothing new here. I just do not want people to really bother bringing that up considering this game will get DX12 and somehow people will be surprised why AMD gets the gains it does.


Though it's purely my opinion, saying something like "AMD DX11 CPU overhead" seems inappropriate in 2016, since we all know by now that AMD's GPU architecture is highly parallel and that old DX11 is too serial a fit for it. A better expression would be "Microsoft DX11 overhead for AMD's GPU architecture".







Just saying.


----------



## Blameless

Quote:


> Originally Posted by *sumitlian*
> 
> Though it is purely my opinion but saying something like AMD DX11 CPU overhead seems inappropriate in 2016 since we all have known by now the AMD GPU architecture is too parallel and new generation and that old DX11 is too serial to be fit into that. Better expression would be Microsoft DX11 overhead for AMD GPU architecture.
> 
> 
> 
> 
> 
> 
> 
> Just saying.


Relatively poor DX11 performance is one thing. Too much DX11 overhead is another.

Higher end GCN parts are too parallel to be efficiently leveraged by DX11, but that doesn't explain or justify the DX11 overhead.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Relatively poor DX11 performance is one thing. Too much DX11 overhead is another.
> 
> Higher end GCN parts are too parallel to be efficiently leveraged by DX11, but that doesn't explain or justify the DX11 overhead.


So you're saying game developers should be more conservative with DX11 CPU overhead? Does it matter at this point? DX12 lets them get away with anything, and Nvidia has no problem. On consoles it's not really a problem either.


----------



## PontiacGTX

Quote:


> Originally Posted by *Blameless*
> 
> Relatively poor DX11 performance is one thing. Too much DX11 overhead is another.
> 
> Higher end GCN parts are too parallel to be efficiently leveraged by DX11, but that doesn't explain or justify the DX11 overhead.


Then that also means most developers aren't putting enough effort into optimizing their DX11 games with better multithreading.

Unlike Crytek, DICE, and Ubisoft, which improved their games and perform well with AMD GPUs.


----------



## Gunderman456

Quote:


> Originally Posted by *BradleyW*
> 
> I'm having some performance issues with Deus. It's stuttering frequently and the fps keeps randomly dropping, even if I'm standing still. The mouse is horrible. One minute it's responsive, the next it's lagging/delayed.
> 
> RTS (Latest)
> Win 10
> 290X
> High
> 1080p Ultrawide
> MSAA OFF
> 
> FPS around 35 to 45.


I'm playing with two 290s. I tested with the built-in benchmark and I get a min of 30fps and a max of 60fps, with an average of 45fps, on High with no MSAA.

I don't think CFX is working properly. We need a driver from AMD!

Anyway, I was having some screen flickering and I disabled the Steam overlay, which took care of it, but there is still flickering from light sources in the game.

Also, the last game patch incorporated a sliding x/y scale for the mouse. Try playing with that; my defaults are 55/35.

There are a lot of bugs and crashes. The devs need to patch and optimize some more, and AMD needs to release a dedicated driver. Until those things are done, the trouble will persist.

Edit:

After playing for some time, it has become apparent that disabling the Steam overlay did not help with the screen flickering issue.


----------



## Aussiejuggalo

Finished this last night on "give me deus ex". Only had 3 crashes: the first was in the train station cut scene (which has since been fixed), the second was about midway through the game (no idea what happened), and the third was because I alt-tabbed out with the Steam overlay up, which caused a lock-up. Had none of the other bugs or glitches people have been talking about (y'all sure your systems are stable?).

I'm a bit disappointed with how short the game is though. Even Invisible War was longer, and it was one of the shortest / most restricted games in the series. I guess this is what happens when games are ported from console: file sizes are massive, games are shorter. Pretty sad, really.

Also playing on my sig rig, ultra settings, no AA or motion blur, getting around 30-40fps. Mouse is 11/7; it was on 0 before the latest patch fixed acceleration. Waiting for DX12 before I do a playthrough on "I Never Asked For This" difficulty; for now I'll collect all the achievements.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So you saying game developers should be more conservative with CPU DX11 overhead?


I didn't say anything about game developers. I was talking about AMD's drivers, which for the same performance demand more from the CPU than NVIDIA's in DX11.

Of course, game developers are always welcome to reduce overhead wherever they can.


----------



## Tobiman

Quote:


> Originally Posted by *Blameless*
> 
> I didn't say anything about game developers. I was talking about AMD's drivers, which for the same performance demand more from the CPU than NVIDIA's in DX11.
> 
> Of course, game developers are always welcome to reduce overhead wherever they can.


You are actually very wrong here. What Nvidia does is in fact use more of the CPU for more performance. It does lead to better performance for Nvidia in single-threaded, CPU-heavy games, but at the cost of far greater CPU usage.


----------



## daviejams

Quote:


> Originally Posted by *Gunderman456*
> 
> I'm playing with two 290s. I tested with the built in benchmark and I get min 30fps and max 60fps with an average of 45fps on High and no MSAA.
> 
> I don't think CFX is working properly. We need a driver from AMD!
> 
> Anyway, I was having some screen flickering and I disabled the Steam overlay and that took care of it, but there is still flickering from light sources in the game.
> 
> Also the last game patch, incorporated a sliding scale - x/y for the mouse. Try playing with that. My defaults are 55/35.
> 
> There is a lot of bugs and crashes. The devs need to patch and optimize some more and AMD needs to release a dedicated driver. Before these things are done, trouble will persist.


That's about the framerate I get on a single 290X, so I don't think CrossFire is doing much for you! It does run better outside the benchmark, and the drops only really happen outdoors; indoors it's more or less 60fps all the time. Maybe the DX12 patch will fix all that.


----------



## jmcosta

Quote:


> Originally Posted by *BradleyW*
> 
> I'm having some performance issues with Deus. It's stuttering frequently and the fps keeps randomly dropping, even if I'm standing still. The mouse is horrible. One minute it's responsive, the next it's lagging/delayed.
> 
> RTS (Latest)
> Win 10
> 290X
> High
> 1080p Ultrawide
> MSAA OFF
> 
> FPS around 35 to 45.


What texture setting are you playing with?
High is for 2GB of VRAM, Very High 4GB, and Ultra 8GB; if your VRAM isn't large enough, it will stutter often.
The jerky movement can also be the low fps. Mine was doing the same at around 50fps, and I had to drop some settings to make it playable. The movement/mouse control is horrible below 60fps in this game.
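The VRAM rule of thumb above can be sketched as a tiny picker. The thresholds are this post's guideline, not official requirements, and the sub-2GB fallback is my assumption:

```python
def texture_preset(vram_gb: float) -> str:
    """Pick a texture preset from available VRAM, per the rough
    High ~2 GB / Very High ~4 GB / Ultra ~8 GB guideline."""
    if vram_gb >= 8:
        return "Ultra"
    if vram_gb >= 4:
        return "Very High"
    if vram_gb >= 2:
        return "High"
    return "Medium or lower"  # assumption: below 2 GB, drop further

print(texture_preset(4))  # a 4 GB card like a 290X -> "Very High"
```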


----------



## the9quad

I put everything up except textures, turned on triple buffering at 1080p 60Hz, and it runs pretty smooth at 60 for me. About once every 5 to 10 minutes it will hitch; nothing game-breaking, since it's a single-player game anyway. If I try to run 1440p it runs terribly despite showing 60fps.


----------



## AngryLobster

I think it's pretty sad that everyone in here is posting 1080p benches in 2016 as an indicator of GPU performance. What's even sadder is this thing developers do to make a game artificially GPU-demanding, where 1 or 2 settings at Very High drop performance by 30% or more with little to no perceivable IQ improvement. This game is a prime example of that, and it also looks nowhere near as good as its performance at maxed-out settings would indicate.

The game runs great @ 4K with settings set to High.


----------



## boredgunner

Quote:


> Originally Posted by *AngryLobster*
> 
> I think it's pretty sad that everyone in here is posting 1080p benches in 2016 as a indicator of performance for GPU's. *What's even more sad is this thing developers do to artificially make a GPU demanding game where 1 or 2 settings at Very High drop performance by 30% or more with little to no perceivable IQ improvement.* This game is a prime example of that and also looks no where near as good as it's performance at maxed out settings would indicate.
> 
> The game runs great @ 4K with settings set to high.


To clarify on the bolded part, contact hardening shadows is one such setting: On vs. High makes almost no difference. Shadow Quality, however, where High improves performance substantially, does look noticeably worse than the highest setting. I haven't played around with the other settings.


----------



## Klocek001

Is CHS like PCSS, or is it something else? I liked PCSS a lot in some games, to be honest, especially in ACIV:BF and Syndicate.


----------



## Woundingchaney

So I'm 24 hours in, and honestly I have really enjoyed the game. However, I have the dreaded train station crash scenario and am unable to finish the game until it is addressed. It's frustrating seeing game-breaking issues in AAA titles in 2016...


----------



## end0rphine

Quote:


> Originally Posted by *Woundingchaney*
> 
> So I'm 24 hours in and honestly I have really enjoyed the game. Though I have the dreaded train station crash scenario and I am unable to finish the game until it is addressed. Its frustrating seeing game breaking issues in AAA titles in 2016........


Doesn't the train station scenario take place at the beginning of the game?


----------



## Unkzilla

Quote:


> Originally Posted by *AngryLobster*
> 
> I think it's pretty sad that everyone in here is posting 1080p benches in 2016 as a indicator of performance for GPU's. What's even more sad is this thing developers do to artificially make a GPU demanding game where 1 or 2 settings at Very High drop performance by 30% or more with little to no perceivable IQ improvement. This game is a prime example of that and also looks no where near as good as it's performance at maxed out settings would indicate.
> 
> The game runs great @ 4K with settings set to high.


Which GPU do you have? I am a few hours into the game, and with my OC'd 1080 the framerate is all over the place on High: it dips into the high 30s at times in open areas. Framerate is 50+ in smaller spaces, even in combat, so that's a bit better. I haven't investigated whether I have a CPU bottleneck, but my CPU is at 4.5GHz, and I have had to run this game on Medium to get the minimum fps into the 50s.

Dropping my screen res to 3200x1800 or 2560x1440 seems to be the way to go; this is the first game I've had to do that in. Surprisingly, these resolutions still look quite good on a 4K screen. That said, I get better performance on High in The Witcher 3 / Crysis 3.


----------



## Woundingchaney

Quote:


> Originally Posted by *end0rphine*
> 
> Doesn't the train station scenario take place at the beginning of the game?


Yes it does, but that isn't the segment I am referring to. The game is based on "zones", and to get to these zones you have to use the subway (train station). Given there is a known bug that causes a crash when using the train station during the late game, the game is broken for many players. Eidos is reportedly investigating the issue, as it seems to be very widespread.


----------



## end0rphine

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yes it does, but that isn't the segment I am referring to. The game is based on "zones" and to get to these zones you have to use the subway (train station). So given there is a known bug that causes a crash when using the train station during late game, the game is broken for many players. Eidos is reportedly investigating the issue as it seems to be very wide spread.


Ah, I see. I periodically get crashes when transitioning via train, but so far it's only happened twice.

I've also experienced something else that brought my 'I never asked for this' run to a halt: while exploring some apartment, the game suddenly dropped to 5 fps with a mini-freeze every 2 seconds. Starting a new game fixed it.


----------



## Aussiejuggalo

I found a way to crash the game easily: doing a speed run, skipping through convos and cut scenes, etc., I crashed it 4 times in 10 mins.

One thing that is starting to bug me is the auto-run thing. For some reason, when I load a game he's auto-running, which gets really annoying during stealth. Another is "press space to continue": if you're in cover, he moves out of cover. That is by far the most annoying thing I've found so far.


----------



## BradleyW

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I'd found a way to crash the game easily, doing a speed run, skipping through convos and cut scenes etc, crashed it 4 times in 10 mins
> 
> 
> 
> 
> 
> 
> 
> .
> 
> One thing that is starting to bug me is the auto run thing, for some reason when I load a game he's auto running which gets really annoying during steath, another thing is the "press space to continue" if your in cover he moves cover that is by far the most annoying thing I've found so far.


I've had this same issue.

Has anyone experienced slight stutters or pauses when travelling through the streets?


----------



## Edge0fsanity

Quote:


> Originally Posted by *BradleyW*
> 
> I've had this same issue.
> 
> Has anyone experienced slight stutters or pauses when travelling through the streets?


Yes, I get the stuttering in the streets too.


----------



## EniGma1987

Quote:


> Originally Posted by *Tobiman*
> 
> You are actually very wrong here. What Nvidia does is in fact use more of the cpu for more performance. In reality, it does lead to better performance in single threaded cpu heavy games for Nvidia but at the cost of far greater CPU usage.


But Nvidia's performance doesn't drop off a cliff with lower-performance CPUs, so how could their performance advantage come from using more of the CPU, like you say?

Quote:


> Originally Posted by *Woundingchaney*
> 
> Yes it does, but that isn't the segment I am referring to. The game is based on "zones" and to get to these zones you have to use the subway (train station). So given there is a known bug that causes a crash when using the train station during late game, the game is broken for many players. Eidos is reportedly investigating the issue as it seems to be very wide spread.


I have noticed that the game often crashes on me when I take a train somewhere, or right afterwards when I check my map to get my bearings.


----------



## Gunderman456

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I'd found a way to crash the game easily, doing a speed run, skipping through convos and cut scenes etc, crashed it 4 times in 10 mins
> 
> 
> 
> 
> 
> 
> 
> .
> 
> One thing that is starting to bug me is the auto run thing, for some reason when I load a game he's auto running which gets really annoying during steath, another thing is the "press space to continue" if your in cover he moves cover that is by far the most annoying thing I've found so far.


Same happens to me; just make sure you are not in stealth mode (F) when you save, since he will run off on you when you reload.

As for the crashes, everyone is getting them, especially when taking the train. The devs also mentioned they are working on other ones, like the crash in the shooting gallery at the NSN, etc.

Stuttering in the street also occurs for me as I exit from a building/sewer, but just for a second or two, then it's gone.

Hopefully more patches this week to fix the bugs and optimize.


----------



## VeritronX

Quote:


> Originally Posted by *sumitlian*
> 
> Though it is purely my opinion but saying something like AMD DX11 CPU overhead seems inappropriate in 2016 since we all have known by now the AMD GPU architecture is too parallel and new generation and that old DX11 is too serial to be fit into that. Better expression would be Microsoft DX11 overhead for AMD GPU architecture.
> 
> 
> 
> 
> 
> 
> 
> Just saying.


Quote:


> Originally Posted by *Blameless*
> 
> Relatively poor DX11 performance is one thing. Too much DX11 overhead is another.
> 
> Higher end GCN parts are too parallel to be efficiently leveraged by DX11, but that doesn't explain or justify the DX11 overhead.


The reason for the difference in DX11 overhead is that Nvidia rewrote their drivers to be multithreaded starting in mid-2014, while AMD's drivers are still single-threaded. This actually makes AMD's GPUs slower on the more expensive Intel CPUs, where clock speed and single-core performance go down compared to the high-clocked quad cores. For reference, the first multithreaded driver from Nvidia was 337.50, which improved performance by up to ~25% in some games on a single GPU and up to ~35% in SLI (according to a PCPer test done at the time).
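As a toy illustration of why single-threaded draw-call submission hurts, here's a back-of-the-envelope model. This is purely a sketch; the call count and per-call cost are made up and have no relation to any real driver:

```python
def submit_time_ms(draw_calls: int, cost_per_call_us: float, driver_threads: int) -> float:
    """Idealized frame submission time: total per-call CPU cost spread
    evenly across the threads the driver can use (perfect scaling assumed)."""
    return draw_calls * cost_per_call_us / driver_threads / 1000.0

calls = 20_000                             # hypothetical draw calls per frame
print(submit_time_ms(calls, 2.0, 1))       # single-threaded driver: 40.0 ms
print(submit_time_ms(calls, 2.0, 4))       # driver spread over 4 threads: 10.0 ms
```

At 40ms of submission work per frame, a single driver thread alone caps you at 25fps regardless of the GPU, which is the shape of the CPU bottleneck being discussed; real drivers don't scale perfectly, so actual gains are smaller.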


----------



## BradleyW

Quote:


> Originally Posted by *VeritronX*
> 
> The reason for the difference in DX11 overhead is that nvidia re-wrote their drivers to be multi-threaded starting in mid 2014, while AMD's drivers are still single-threaded. This actually makes AMD's GPUs get slower on the more expensive intel cpu's as the clockspeed and single core performance goes down compared to the high clocked quad cores. For reference the first driver from nvidia that was multi-threaded was 337.50, which improved performance by up to ~25% in some games on a single gpu and up to ~35% in SLI (according to a pcper test done at the time)


It's true that Nvidia made changes to their driver for more efficient operation in some respects; however, AMD's poor single-threaded performance is down to the actual architecture of GCN. That's why driver improvements for CPU efficiency show minimal gains.


----------



## Aussiejuggalo

I haven't had any stuttering in the streets; it's more in loading scenes, like when the train loads the next map. I haven't had it crash during the train scene though, which is odd.

It's a bit weird that my sig rig plays this thing maxed out and I don't have performance issues, just the odd crash, and even those don't happen that much.


----------



## boredgunner

I tend to get stuttering just before a crash. So when I see stuttering I quicksave and brace for impact.


----------



## EniGma1987

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I haven't had any stuttering in the streets its more in loading scenes like the train when it loads the next map, haven't had it crash during the train scene though which is odd.
> 
> It's a bit weird that my sig rig plays this thing maxed out and I don't have performance issues I just have the odd crash, and even they don't happen that much
> 
> 
> 
> 
> 
> 
> 
> .


If I had to guess, I'd say it's all the people on Nvidia cards getting crashes because of the really sucky drivers Nvidia puts out.


----------



## Klocek001

372.70 is out, game-ready for Mankind Divided.


----------



## Ghoxt

Added or updated the following SLI profiles:


Deus Ex: Mankind Divided - added DirectX 11 SLI profile

*At 4K in my Sig Rig:*


----------



## BradleyW

Steam has just started downloading a 75.4MB patch on my end. This is the 3rd patch/update I've received since release.

Game build now shows v1.2.


----------



## Gunderman456

Quote:


> Originally Posted by *BradleyW*
> 
> Steam has just started downloading a 75.4mb patch on my end. This will be the 3rd patch/update I've received since release.
> 
> Game build now shows v1.2.


Let's hope for good results!


----------



## Bal3Wolf

Here are the patch notes for the last update; I got them off the Steam forums for the game. I'm loving it, and it plays pretty well with my two 7970s.

Quote:


> [30-08-2016] PC Patch notes for Deus Ex: Mankind Divided Patch build 524.15
> 
> We have just released another PC patch for Deus Ex: Mankind Divided, v1.2 build 524.15. This patch focusses on critical issues that were reported by users during the week after release.
> 
> This patch will be applied by Steam automatically when you next start the game. If your game does not update, please restart the Steam client.
> 
> The following fixes are in this patch:
> 
> • Fixed a crash during the subway loading scene.
> • Fixed issue with loading a saved game while in cover and pressing [SPACE], causing Jensen to vault over cover.
> • Fixed crash when leaving for Golem City.
> • Fixed issues where the UI was offset at some aspect ratios, such as 21:9.
> • Fixed issue where vibration was not working on gamepad.
> • Fixed an issue where some keyboard language settings could cause the game to not launch (some users had a workaround by creating a new user account or by changing their keyboard language setting).
> • Fixed various issues related to Breach.
> • Fixed issues related to Tobii EyeTracking.
> 
> Note about performance:
> We are seeing people reporting performance issues when playing the game on Very High/Ultra settings with MSAA set to 2x, 4x, or 8x. We would like to emphasize again that these options are very demanding.
> We recommend everyone that is running at recommended spec or higher to start with the High Preset and MSAA turned off, and then tweak the options to optimize your experience.
> 
> While we expect this patch to be an improvement for everyone, if you do have trouble with this patch and prefer to stay on older versions, we have made a Beta available on Steam, v1.0_build 524.6, v1.0_build 524.7 and v1.1_build 524.10, that can be used to switch back to previous versions.
> 
> We will keep monitoring for feedback and will release further patches as it seems required. We always welcome your feedback!


http://steamcommunity.com/app/337000/discussions/0/352792037329870248/


----------



## BradleyW

I respect the fact that they allow us to revert to previous builds, which I find very handy! If only every dev team would do this!


----------



## boredgunner

Still getting occasional crashes, seemingly caused by bringing up menus (the map menu is the worst offender), but fewer crashes than before, it seems. I'm just now up to the game's point of no return. As expected, it's shorter than Human Revolution and the original, with most of the game taking place in Prague.


----------



## BradleyW

I get the feeling I will complete this game before the DX12 patch lands.....

Edit: How do I fix this ultrawide error? The red is from when you take damage.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I get the feeling I will complete this game before the DX12 patch lands.....
> 
> Edit: How do I fix this ultrawide error? The red is from when you take damage.


Don't play Ultrawide.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Don't play Ultrawide.


Does anyone have a more helpful suggestion?


----------



## EniGma1987

Quote:


> Originally Posted by *BradleyW*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does anyone have a more helpful suggestion?


Report to developers and hope it gets fixed in a patch?


----------



## BradleyW

How do I stabilize my augmentation system? I can't seem to select an aug and deactivate it.

Edit: never mind, it's working now.


----------



## EastCoast

Any performance improvements on a single card yet?


----------



## BradleyW

Quote:


> Originally Posted by *EastCoast*
> 
> Any performance improvements on a single card yet?


I just tested the new 16.8.3 drivers (previously using 16.8.2) in an indoor test location and found no improvement. That's not to say there is no improvement in other areas of the game.


----------



## HaiderGill

Pretty much what I always do is give a game a year for the hotfixes/patches to roll out, and then play. It shouldn't be the case, but it is...


----------



## Aussiejuggalo

Quote:


> Originally Posted by *EastCoast*
> 
> Any performance improvements on a single card yet?


Finished the game before and after the 16.8.3 update, no improvement.

On a side note, getting the "Foxiest of the Hounds" achievement was kind of a pain; I didn't realise some choices set off alarms. Aside from that, good game, but far, far too short.


----------



## Silent Scone

I don't get that issue in ultrawide.


----------



## BradleyW

Quote:


> Originally Posted by *Silent Scone*
> 
> I don't get that issue in ultrawide.


I think it's an FOV issue. I have FOV at 116%.


----------



## jcde7ago

I lament beating this game before the DX12 patch could be released... yeah, I spent around ~25 hours playing it, but it felt much too short for me, story-wise at least. Not sure if I want to replay it before the DLC missions hit, unless DX12 offers some spectacular performance improvements and features.

Curious to see if the obscene microstuttering issues get addressed... I had to play with a single Titan X Maxwell, as having those in SLI caused ridiculous stuttering, and that stuttering is still present even after my switch to two Titan X Pascals, which is just stupid for a game with "official SLI support." Pretty much every other game I've played with SLI support in the last 2-3 years has been good-to-great, but Mankind Divided is downright awful with SLI.


----------



## BradleyW

I'm gutted. I did not meet the requirements to start the side quest "SM09 - In the Family" due to the events that unfolded in a previous side quest. Me sad....


----------



## boredgunner

Quote:


> Originally Posted by *BradleyW*
> 
> I'm gutted. I did not meet the requirements to start side quest "SM09 - In the family" due to the events which unfolded in a previous side quest. Me sad....


I had to look up that quest since I never got it, so same here. Not that I care since I plan more playthroughs in the future, once the game is a complete product.

I finished it today, took me about 40 hours. Similar length to Human Revolution without Missing Link, but that's because of more exploration. It has less main quest content than HR.


----------



## the9quad

Game is fantastic.


----------



## Silent Scone

Anyone else experienced the ambient music bug? The music just stops playing. Seems going into the sewers and coming out solves it, but on travelling to another part of Prague, it happens again. Kind of deal breaker.


----------



## BradleyW

I have a feeling we'll have to wait until Friday for the DX12 patch to land.


----------



## Nameless1988

On my rig the game runs smoother thanks to FreeSync. I suggest disabling contact hardening shadows and setting depth of field and volumetric lighting to low; these three settings introduce heavy stuttering.
I am waiting for the DX12 patch.


----------



## BradleyW

I also suggest disabling contact hardening shadows, because enabling it significantly reduces the shadow draw distance.

Please release DX12 tonight!


----------



## Lass3

Quote:


> Originally Posted by *the9quad*
> 
> Game is fantastic.


Uhm, how come user reviews are terrible for the most part?


----------



## Nameless1988

Quote:


> Originally Posted by *BradleyW*
> 
> I also suggest disabling contact shadows because it significantly reduces the shadow draw distance.
> 
> Please release DX12 tonight!


Are they releasing the DX12 patch today?


----------



## BradleyW

Quote:


> Originally Posted by *Nameless1988*
> 
> Today they release dx12 patch??


I have no idea. I'm just saying "please release the DX12 patch tonight".


----------



## Nameless1988

Quote:


> Originally Posted by *BradleyW*
> 
> I have no idea. I'm just saying "please release the DX12 patch tonight".


Ah, OK.


----------



## daviejams

Quote:


> Originally Posted by *BradleyW*
> 
> I have no idea. I'm just saying "please release the DX12 patch tonight".


I think they are going to.

Their community manager on Reddit said so...


----------



## Newbie2009

Quote:


> Originally Posted by *daviejams*
> 
> I think they are going to
> 
> Their community manager on reddit said so ...


If it was Vulkan it would be exciting. DX 12, not so much.


----------



## BradleyW

Quote:


> Originally Posted by *daviejams*
> 
> I think they are going to
> 
> Their community manager on reddit said so ...


Yeah it looks like Thursday!
https://www.reddit.com/r/Deusex/comments/51jp6a/so_where_is_directx12/


----------



## boredgunner

Not sure why people are expecting miracles from a DX12 patch bolted on top of a DX11 game. I'm calling it right now: a performance penalty for NVIDIA, and a 5-15% performance boost for AMD.


----------



## Nameless1988

I hope dx12 will bring more optimization, unlike ROTTR where dx12 = dx11.


----------



## boredgunner

Quote:


> Originally Posted by *Nameless1988*
> 
> I hope dx12 will bring more optimization, unlike ROTTR where dx12 = dx11.


I expect the same here since this game was not built from the ground up for DX12. Same for almost every other DX12 and Vulkan game, I think the only exception is Ashes of the Singularity.


----------



## Fancykiller65

Quote:


> Originally Posted by *boredgunner*
> 
> I expect the same here since this game was not built from the ground up for DX12. Same for almost every other DX12 and Vulkan game, I think the only exception is Ashes of the Singularity.


Doom is another exception.


----------



## EniGma1987

Quote:


> Originally Posted by *Lass3*
> 
> Uhm, how come user reviews are terrible for the most part?


IDK, everyone I know loves (or loved) the game.


----------



## Robenger

Quote:


> Originally Posted by *Lass3*
> 
> Uhm, how come user reviews are terrible for the most part?


From the Steam reviews I've read, it's because there is a microtransaction store in the game, which seems odd for a single-player game.


----------



## Nameless1988

Many reviews are negative because of the poor optimization. Some complain about boring missions and the storyline, but I don't know, because I'm waiting for the DX12 patch before starting the campaign.

What's certain is that the game currently runs poorly due to bad optimization.


----------



## Velathawen

Quote:


> Originally Posted by *Robenger*
> 
> From Steam reviews I've read is because there is a microtransaction store in the game, which seems odd as it's a single player game.


It's not solely because of the microtransactions. People are mad that the pre-order DLC was basically a bait and switch: after launch the developers went back and reclassified certain items as 'consumable', which means they can only be redeemed once, on any one save.

Personally I haven't used the DLC items and haven't found any need to spend on credits, biocells, or what not yet. Game is quite fun though it has plenty of rough edges and quirks still.

First week was pretty terrible for me performance wise but I've been getting really good mileage out of my 980Ti SLI since a fresh install with 372.54.


----------



## boredgunner

Quote:


> Originally Posted by *Lass3*
> 
> Uhm, how come user reviews are terrible for the most part?


Reason #1 - Instability.

Reason #2 - Optimization.

Reason #3 - Microtransactions.

All stupid things that are fairly easy to avoid. We have Square Enix to thank for this.


----------



## Silent Scone

Only gripe i have is the music stops playing sometimes in a section of Prague.


----------



## the9quad

Game is fantastic.
Quote:


> Originally Posted by *Lass3*
> 
> Uhm, how come user reviews are terrible for the most part?


And some aren't.

Some people like chocolate ice cream, some don't. Some people like stuff others don't.

To be honest though, it's way better than Human Revolution.

The microtransactions don't bother me, since they are completely unnecessary; if someone wants to waste money on unnecessary stuff, more power to them.
As far as performance goes, it is decent enough.


----------



## killerhz

Quote:


> Originally Posted by *BradleyW*
> 
> Yeah it looks like Thursday!
> https://www.reddit.com/r/Deusex/comments/51jp6a/so_where_is_directx12/


While I don't expect miracles, I will not buy this title until the DX12 patch ships with it. There is no excuse why a title with this much prestige shouldn't have DX12 as standard. $60 is too much for an incomplete title, or one that isn't what the developers told us it would be.


----------



## LoLomgbbq

If this game was struggling in DX11, DX12 isn't really going to make any difference.


----------



## boredgunner

Quote:


> Originally Posted by *killerhz*
> 
> While I don't expect miracles, I will not buy this title until the DX12 patch ships with it. There is no excuse why a title with this much prestige shouldn't have DX12 as standard. $60 is too much for an incomplete title, or one that isn't what the developers told us it would be.


In all fairness, it is a bit early to be designing games from the ground up in DX12 or Vulkan. Especially AAA games that are going to be played by everyone, with all kinds of hardware setups including older operating systems and GPUs not capable of DX12 or Vulkan.

There is no good excuse for the general poor optimization and instability though.


----------



## Silent Scone

Quote:


> Originally Posted by *LoLomgbbq*
> 
> If this was struggling with DX11, DX12 isnt going to make any difference really.


Quite a daft thing to say really. The only way you would know this well enough to comment is if you'd worked on both builds.


----------



## LoLomgbbq

Quote:


> Originally Posted by *Silent Scone*
> 
> *Quite a daft thing to say really*. The only way you would know this well enough to comment is if you'd worked on both builds.


*Deus Ex: Mankind Divided - DX12 patch is now available, performs poorly on NVIDIA's hardware*
Quote:


> However, things went downhill once we visited the game's city hubs. As we can see, there is a significant performance hit under DX12 (we're talking about a 30fps decrease in some cases).


Pardon?

This game was delayed, then the DX12 patch was delayed, and now the promised DX12 patch is still only a preview build.


----------



## Semel

DX12 is worse than DX11 performance-wise on my PC ([email protected], 16GB, AMD Fury, 3840 shaders, 1100/550). I tested in-game, not the benchmark. I tried loading different saves/scenes, and DX11 performed better in all of them at the same settings.

I guess I shouldn't be surprised; it's Nixxes after all (the RoTTR DX12 fiasco). Ironically, the Dawn Engine is heavily based on the Glacier 2 engine, the one used in Hitman, and Hitman had a great fps boost in DX12 on AMD...


----------



## Lass3

Quote:


> Originally Posted by *Semel*
> 
> DX12 is worse than DX11 performance-wise on my PC ([email protected], 16GB, amd fury 3840 1100/550 ). I tested in-game, not benchmark. I tried loading different saves\scenes and DX11 performed better in all of them using the same settings.
> 
> I guess I shouldn't be surprised. It's Nixxes after all (RoTTR DX12 fiasco). Ironically, dawn engine is heavily based on glacier 2 engine.. the one used in Hitman and Hitman had a great fps boost in dx12 on AMD....


DX12 fiasco? I'm getting higher minimum fps in RotTR using DX12, especially in Geothermal Valley, where it's around 25% higher.

The minimum fps in the DX11 benchmark is actually insanely low compared to the DX12 benchmark; there's stutter. No stutter in DX12 at all.

So Nvidia sponsored titles run better on Nvidia and AMD sponsored titles run better on AMD, what a surprise.


----------



## Semel

I've just tested the game using its benchmark at High preset, same as guru3d and my average fps was better in DX11. However, DX12 was less stuttery.


----------



## GorillaSceptre

No stopping Hawaii. The Fury X under DX12 isn't exactly a slouch either.

DX11 (benchmark chart in spoiler).

DX12 (benchmark chart in spoiler).

----------



## huzzug

Good to see the 780ti holding its own against Tonga. Never doubted that card of anything less.


----------



## GorillaSceptre

Quote:


> Originally Posted by *huzzug*
> 
> Good to see the 780ti holding its own against Tonga. Never doubted that card of anything less.


We have now entered the Twilight Zone..


----------



## boredgunner

Told you so.


----------



## rluker5

Quote:


> Originally Posted by *huzzug*
> 
> Good to see the 780ti holding its own against Tonga. Never doubted that card of anything less.


Hey, it's still good with DX11. But its clock is ticking. I'd better fire up that box fan in the door to my room and squeeze some more juice out of them while I can.


----------



## Sleazybigfoot

There are a couple of things I miss in Guru3D's benchmark:
1) A proper VRAM benchmark; not just GDDR5 cards, I'd like to see how the Fury (X) performs with 4GB of HBM.
2) Which graphics preset was selected during the FCAT latency test.
3) A DX12 FCAT latency test (I don't know if there is software that supports DX12 yet; I'd assume so).


----------



## huzzug

Quote:


> Originally Posted by *rluker5*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> Good to see the 780ti holding its own against Tonga. Never doubted that card of anything less.
> 
> 
> 
> Hey, it's still good with DX11. But its clock is ticking. I'd better fire up that box fan in the door to my room and squeeze some more juice out of them while I can.

It's undoubtedly as nice a card today as it was at release. It's just that the tech has moved on to better things, and only the cards designed with future tech in mind have staying power.


----------



## KyadCK

Quote:


> Originally Posted by *huzzug*
> 
> Good to see the 780ti holding its own against Tonga. Never doubted that card of anything less.


Wow that's a brutal comment. I will admit to chuckling.

Geeze, the 470 is beating the 1060, and the 480 sits between the 980 Ti and the 1070...

nVidia needs better DX12 drivers. AMD got an alright boost, but they're winning by so much because nVidia's performance tanked hard.

EDIT: AMD launched DX12 drivers for this game. Has nVidia done so yet?


----------



## Semel

Quote:


> nVidia needs better DX12 drivers.


Why do you think it's NVIDIA's fault, though? The game has been having technical issues since the moment it was released, especially that stuttering in Prague.
Quote:


> AMD got an alright boost


I haven't got any boost. DX11 is faster here ([email protected], 16GB, AMD Fury), latest drivers.


----------



## KyadCK

Quote:


> Originally Posted by *Semel*
> 
> Quote:
> 
> 
> 
> nVidia needs better DX12 drivers.
> 
> 
> 
> Why do you think it's nvidias fault though? The game has been having technical issues from the moment it was released.Especially dat stuttering in Prague.
> Quote:
> 
> 
> 
> AMD got an alright boost
> 
> 
> I haven't got any boost. DX11 is faster here. ([email protected] 16GB, amd fury) latest drivers.

By reading?

http://www.guru3d.com/articles-pages/deus-ex-mankind-divided-pc-graphics-performance-benchmark-review,1.html

Fury went from 72 in DX11 High 1080 to 77 in DX12 High 1080.

... And the GTX 1080 went from 88 to 75.

But yea, let's blame the game for the obviously lopsided results that match almost every other DX12 bench out there; AMD gets better, nVidia gets worse. That's why I asked if nVidia launched drivers for this yet.
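To put those Guru3D figures in relative terms, here is a quick sketch (the FPS numbers are just the ones quoted above; the card names are labels, nothing more):

```python
# DX11 -> DX12 average FPS at 1080p High, as quoted from Guru3D above
results = {
    "R9 Fury": (72, 77),
    "GTX 1080": (88, 75),
}

for card, (dx11, dx12) in results.items():
    # percent change relative to the DX11 baseline
    change = (dx12 - dx11) / dx11 * 100
    print(f"{card}: {dx11} -> {dx12} fps ({change:+.1f}%)")
    # R9 Fury: 72 -> 77 fps (+6.9%)
    # GTX 1080: 88 -> 75 fps (-14.8%)
```

So the Fury gains roughly 7% while the GTX 1080 loses almost 15%, which is why the standings flip so dramatically.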


----------



## rluker5

Quote:


> Originally Posted by *huzzug*
> 
> It's undoubtedly a nice card even today as it was at release. Just that the tech has moved on to better things with only those things having the staying power which thought about future tech.


Still good in the DX11 versions of games, and 3 years isn't bad for longevity; Hawaii just makes it look bad with how well it's holding up, and it will keep holding up for the foreseeable future, with the consoles and all.

Edit: That DX11 SLI scaling looks pretty good though.


----------



## fewness

The tide has turned. Now it's Nvidia's cards that won't perform to their potential, just like AMD's did badly in the DX11 years...


----------



## huzzug

Nvidia cards have always performed to their full potential; only AMD cards were underutilized. Nvidia cards, be it DX11 or DX12, perform at 100%, so to speak.


----------



## PlugSeven

Quote:


> Originally Posted by *KyadCK*
> 
> Wow that's a brutal comment. I will admit to chuckling.
> 
> Geeze, the 470 is beating the 1060, and 480 between the 980TI and 1070...
> 
> nVidia needs better DX12 drivers. AMD got an alright boost, but they're winning by so much because nVidia's performance tanked hard.
> 
> EDIT: AMD launched DX12 drivers for this game. Has nVidia done so yet?


You're NOT going to get better-than-DX11 performance from current Nvidia hardware in DX12. Drivers have nothing to do with it.


----------



## BradleyW

GCN is highly parallel, and thus better suited to low-level multithreaded APIs. With Nvidia's current architecture this isn't the case, and drivers won't help them much. Nvidia must develop an architecture similar to GCN in order to prevail in upcoming titles.


----------



## PlugSeven

Quote:


> Originally Posted by *fewness*
> 
> Tide has turned. Now it's Nvidia's cards won't perform to their potential, just like AMD did bad in DX11 years...


Current nvidia hardware has no potential in dx12.


----------



## fewness

Quote:


> Originally Posted by *PlugSeven*
> 
> Current nvidia hardware has no potential in dx12.


But doesn't this performance data say that DX11 extracts more of the raw hardware capability for Nvidia?


----------



## Marios145

The 290/390 see HUGE gains compared to the rest at 1080p.
First the 780 Ti, then the 980, now the 980 Ti... where does the madness stop?!


----------



## daviejams

Quote:


> Originally Posted by *Marios145*
> 
> 290/390 HUGE gains compared to the rest at 1080p
> First the 780ti, then the 980, now the 980ti.... where does the madness stop?!??!?!?!


390x is slightly faster than the 1070 in DX12 mode ...


----------



## Outcasst

My system lost 10FPS switching to DX12.


----------



## NightAntilli

The Fury beats a 1080 at 1080p... Dang.

nVidia likely loses performance under DX12 because their scheduler is not fully implemented in hardware like AMD's. Their software scheduler consumes CPU resources on top of the game's. So under DX12 the CPU overhead issue basically flips, although nVidia's issue under DX12 is smaller than AMD's was under DX11.


----------



## Woundingchaney

I guess I'm rather surprised that people actually believed Nvidia would benefit from a DX12 adaptation of this title. Whatever benefits a DX12 API could bring Nvidia are not going to be taken into account in an AMD-sponsored game. Given that this is an AMD-sponsored game, I would imagine their DX12 implementation is built around leveraging async compute, which would make sense. Async compute is of course only one facet, but it is an area where AMD benefits hands down in comparison to Nvidia due to architecture differences.

At 1080p this implementation is most assuredly going to hamper performance on Nvidia's high-end hardware. This should be relatively alleviated at higher resolutions, but Nvidia still wouldn't benefit. Driver support for the title in DX12 could make a difference, but I wouldn't imagine it would have much of an impact.

Quote:


> First the 780ti, then the 980, now the 980ti.... where does the madness stop?!??!?!?!


I'm not sure what you are talking about.
Quote:


> Current nvidia hardware has no potential in dx12.


This is incorrect. Leveraging async compute is not equivalent to DX12. Leveraging async compute nets benefits for AMD, but those benefits will not necessarily transfer to Nvidia (generally speaking this is true, particularly at lower resolutions) given the architecture differences.


----------



## NightAntilli

Did anyone else notice that the RX 480 is outperforming the 980 Ti under DX12? Taking each card's best score, the RX 480 under DX12 is just a single frame slower than the 980 Ti under DX11, at 1080p high settings.


----------



## Semel

The game keeps crashing at startup in DX12 on my 4GB AMD Fury if I have Ultra textures enabled. DX11 works fine with Ultra.


----------



## vloeibaarglas

If, after Nvidia/Eidos Montreal fix this 'bug' with DX12, Nvidia is still getting slaughtered, I'm filing a 90-day return-protection claim with my credit card company for my GTX 1070.

Lack of true DX12 support is a deal breaker.


----------



## Semel

Quote:


> Originally Posted by *vloeibaarglas*
> 
> After Nvidia/Eidos Montreal fixes this 'bug' with DX12, if Nvidia is still getting slaughtered, I'm filing a 90 days return protection with my credit card company on my GTX 1070.
> 
> Lack of true DX12 is a deal breaker.


You're going to get a refund because of a few broken games? The 1070 is a very solid card, especially when OCed.


----------



## NightAntilli

Quote:


> Originally Posted by *Semel*
> 
> The game keeps crashing at startup in DX12 on my 4GB amd fury card if I have ultra textures enabled. DX11 works fine with ultra.


Report it to AMD.


----------



## Woundingchaney

Quote:


> Originally Posted by *vloeibaarglas*
> 
> After Nvidia/Eidos Montreal fixes this 'bug' with DX12, if Nvidia is still getting slaughtered, I'm filing a 90 days return protection with my credit card company on my GTX 1070.
> 
> Lack of true DX12 is a deal breaker.


I don't think you understand "DX12".

Here is some information on Nvidia's async compute solution:

http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/9


----------



## Robenger

Quote:


> Originally Posted by *Woundingchaney*
> 
> I dont think you understand "DX12".
> 
> Here is a bit of information given Nvidia Async Compute solution
> 
> http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/9


Async compute is arguably one of the most important DX12 features to come out, and it will only continue to be utilized more than it is now.


----------



## looniam

Quote:


> Originally Posted by *vloeibaarglas*
> 
> After Nvidia/Eidos Montreal fixes this 'bug' with DX12, if Nvidia is still getting slaughtered, I'm filing a 90 days return protection with my credit card company on my GTX 1070.
> 
> Lack of true DX12 is a deal breaker.


Start now, because it will not change.


----------



## NightAntilli

Taking the best score for each (generally DX12 for AMD, DX11 for nVidia), this is the performance for *1080p high settings*, based on Guru3D:

*High end cards;*
GTX 1080, 88 FPS
R9 Fury X, 85 FPS
R9 Fury, 77 FPS ($310 currently, just FYI)
R9 Nano, 75 FPS
390X, 74 FPS
GTX 1070, 73 FPS
980 TI, 68 FPS

*Mid range cards*
RX 480, 67 FPS
R9 390, 62 FPS
RX 470, 56 FPS
GTX 1060 6GB, 54 FPS
GTX 980, 53 FPS
R9 380X, 50 FPS
GTX 970, 49 FPS

*Summary;*
RX 470 faster than GTX 1060 6GB and GTX 980
RX 480 only one frame slower than GTX 980 Ti
Fury X only 3 frames slower than GTX 1080
Fury faster than GTX 1070
390X faster than GTX 1070
380X faster than GTX 970

Same as before, taking the best score for each (generally DX12 for AMD, DX11 for nVidia), this is the performance for *1440p high settings* rather than 1080p:

*High end cards;*
GTX 1080, 59 FPS
R9 Fury X, 56 FPS
R9 Fury, 50 FPS ($310 currently, just FYI)
R9 Nano, 49 FPS
390X, 49 FPS
GTX 1070, 49 FPS
980 TI, 45 FPS

*Mid range cards*
RX 480, 43 FPS
R9 390, 42 FPS
RX 470, 37 FPS
GTX 1060 6GB, 35 FPS
GTX 980, 35 FPS
R9 380X, 32 FPS
GTX 970, 31 FPS

*Summary;*
Almost same as 1080p really...
RX 470 still faster than GTX 1060 6GB and GTX 980
RX 480 is now two frames slower than GTX 980 Ti compared to 1 frame slower at 1080p.
Fury X is still only 3 frames slower than GTX 1080
Fury still faster than GTX 1070
390X and Nano are now equally fast to a GTX 1070. They were faster at 1080p.
380X still faster than GTX 970

*At 4K high settings:*
Nothing is playable at high settings, but basically a Fury X is still 3 frames behind a GTX 1080 (28 vs 31), The Fury and Nano now equal a GTX 1070 (all 25 fps). The 390X drops to 980 Ti level with the RX 480 still on the heels of the 980 Ti.
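If anyone wants to re-sort the 1080p list or check the gaps claimed in the summary, it drops straight into a small script (the numbers are simply the ones transcribed above; nothing new):

```python
# 1080p High best-API average FPS, transcribed from the list above
fps_1080p = {
    "GTX 1080": 88, "R9 Fury X": 85, "R9 Fury": 77, "R9 Nano": 75,
    "390X": 74, "GTX 1070": 73, "980 Ti": 68, "RX 480": 67,
    "R9 390": 62, "RX 470": 56, "GTX 1060 6GB": 54, "GTX 980": 53,
    "R9 380X": 50, "GTX 970": 49,
}

# rank the cards fastest to slowest
for card, fps in sorted(fps_1080p.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{fps:3d} fps  {card}")

# e.g. the "RX 480 only one frame slower than GTX 980 Ti" claim:
print(fps_1080p["980 Ti"] - fps_1080p["RX 480"])  # 1
```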


----------



## Woundingchaney

Quote:


> Originally Posted by *Robenger*
> 
> Async Compute is arguably one of the most important DX12 features to come out and will only continue to utilized even more than it is now.


There is still async compute support in Nvidia's architecture, though admittedly there are complications in leveraging it. It's also important to note that you are casting a long timeline for its relevance. This is also an AMD-sponsored title...


----------



## KyadCK

Quote:


> Originally Posted by *Woundingchaney*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Robenger*
> 
> Async Compute is arguably one of the most important DX12 features to come out and will only continue to utilized even more than it is now.
> 
> 
> 
> There is still async compute support in Nvidia architecture, though admittedly there are complications leveraging it. It's also important to note that you are casting a long timeline for its relevance. This is also an AMD sponsored title.......

He means Async Shaders.


----------



## johnblake

Warning: some AMD fanboys may experience some pain after seeing the video and the benchmarks.

https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/

https://www.youtube.com/watch?v=pAKCOcSLViw


----------



## Semel

Quote:


> Originally Posted by *NightAntilli*
> 
> Report it to AMD.


Why would I report it to AMD? The same thing happens on Nvidia GPUs.


----------



## Defoler

The game seems to be heavily tuned for AMD hardware, which seems to be the main reason Nvidia is taking such a big performance hit.
This is one of the issues with DX12: the developer has to do all the work, and drivers can't fix anything anymore.

Hopefully Eidos will work with Nvidia on this. If not (if AMD puts their foot down on it), we will see no fix or improvement.

Also, the game kept crashing for me in DX12. It seems buggy as hell; twice it froze and reloaded back in DX11.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Defoler*
> 
> The game seems to be heavily mapped to AMD hardware, which seems to be the main reason why nvidia are getting such a big hit in performance.
> This is one of the issues with DX12. The developer is the one having to do all the work, and drivers can't fix anything anymore.
> 
> Hopefully EIDOS will work with nvidia on this. If not (if AMD puts their foot down on it), we will see no fix or improvement.
> 
> Also, the game kept crashing for me in DX12. Seems to be buggy as well. Twice it also froze and reloaded back as DX11.


We don't know that for sure. AMD GPUs right now are like a straight line; you can't really favor them. Nvidia uses a different rendering method, which might not work as intended in all games.


----------



## Defoler

Quote:


> Originally Posted by *johnblake*
> 
> Warning some AMD fanboys may witness some pain after seeing the video and the benchmarks.
> 
> https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/
> 
> https://www.youtube.com/watch?v=pAKCOcSLViw


So basically, with the right CPU you can get better results with Nvidia than with AMD? Does that mean Nvidia is doing more in software than AMD? This might be interesting.


----------



## Defoler

Quote:


> Originally Posted by *ZealotKi11er*
> 
> We do not know for sure. AMD GPUs right now are like a straight line. You can't favor them. Nvidia uses the different rendering method which might not work as intended in all games.


Don't forget that the game was a collaboration with AMD, meaning AMD had a good say in API usage and in what to run and how.


----------



## johnblake

Quote:


> Originally Posted by *Defoler*
> 
> The game seems to be heavily mapped to AMD hardware, which seems to be the main reason why nvidia are getting such a big hit in performance.
> This is one of the issues with DX12. The developer is the one having to do all the work, and drivers can't fix anything anymore.
> 
> Hopefully EIDOS will work with nvidia on this. If not (if AMD puts their foot down on it), we will see no fix or improvement.
> 
> Also, the game kept crashing for me in DX12. Seems to be buggy as hell. Twice it also froze and reloaded back as DX11.


No. When you play the game you will see exactly these kinds of results:
https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/

But if you are only interested in built-in benchmarks, then you will see Guru3D-style results.


----------



## SlackerITGuy

Quote:


> Originally Posted by *johnblake*
> 
> Warning some AMD fanboys may witness some pain after seeing the video and the benchmarks.
> 
> https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/
> 
> https://www.youtube.com/watch?v=pAKCOcSLViw


Makes no sense.

Guru3D is reporting the contrary.


----------



## GorillaSceptre

Well, it's Nixxes again; they are clearly incapable of doing a stellar job with the new APIs.

The ComputerBase results are definitely odd. Maybe they mixed the results up? I don't get how reviewers can have such massive swings in results, even when using identical systems. Whenever I compare results with other forum members, we're always within a few frames of each other.

As for the AMD "bias", I think some of you are giving AMD too much credit. I doubt they have a lot to do with anything. The recent trend (which can be seen in neutral titles like K.I) probably has much more to do with the fact that AMD has GCN in both of the leading consoles. And because very few porting studios give a damn about PC, and the main studio is usually too busy to do it themselves, we end up with titles like these, which will naturally favor AMD.


----------



## Semel

I'm sorry, but ComputerBase's DX11 Fury benchmark in Prague is pure bull****, which is kind of strange because they are generally pretty reliable. I was getting much higher average fps there at the settings they tested.


----------



## johnblake

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Makes no sense.
> 
> Guru3D is reporting the contrary.


No, Guru3D's numbers are from the built-in benchmark, which is totally different from actual gameplay.

Sites like PCGamesHardware, PurePC, ComputerBase, PCLab, and SweClockers always run gameplay benchmarks, which test everything.


----------



## johnblake

Are some AMD fans really that biased? Do they really think that DX12 at high settings in the built-in benchmark, not even gameplay, represents DX12?


----------



## Glottis

Quote:


> Originally Posted by *johnblake*
> 
> No Guru3d benchmarks are of in bulid benchmark ,which are totally different from actual gameplay.
> 
> Sites like PChardware, PurePC, Computerbase,PClab, and sweatclockers always test in gameplay benchmark ,which test everything.


You are 100% correct, and it's perfectly shown in this video: the RX 480 dominates the GTX 1060 in the benchmark, but in actual gameplay both cards perform neck and neck.


----------



## KyadCK

Quote:


> Originally Posted by *johnblake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlackerITGuy*
> 
> Makes no sense.
> 
> Guru3D is reporting the contrary.
> 
> 
> 
> No Guru3d benchmarks are of in bulid benchmark ,which are totally different from actual gameplay.
> 
> Sites like PChardware, PurePC, Computerbase,PClab, and sweatclockers always test in gameplay benchmark ,which test everything.

And yet, you have a user right here telling you from personal experience that your praised ComputerBase is full of it.
Quote:


> Originally Posted by *Semel*
> 
> I'm sorry but dx11 fury computerbase benchmark in Prague is pure bull**** which is kinda strange coz generally they are pretty reliable.. I was getting much higher average fps there at the settings they tested it.


Which is the problem with non-benchmarks. It's a live environment that can change per play-through.


----------



## Semel

Quote:


> Originally Posted by *KyadCK*
> 
> Which is the problem with non-benchmarks.


Yup. Unfortunately, built-in benchmarks quite often do not represent the average fps you get in games. However, as you said, in-game testing is a dynamic live environment, which makes it harder to properly compare different GPUs, CPUs, etc. So there is that.

I haven't tested DX12 much in-game, but it did seem to run slower than DX11 (same as RoTTR). Actually, DX11 was a bit faster than DX12 even in the benchmark.


----------



## johnblake

Quote:


> Originally Posted by *Irrefutabl3*
> 
> First off this old and on DX11, DX12 was just implemented today.


There is no benefit even on the RX 480 with DX12.
DX11 vs DX12 on an RX 480 in gameplay:
https://www.youtube.com/watch?v=pAKCOcSLViw


----------



## Glottis

Quote:


> Originally Posted by *Irrefutabl3*
> 
> First off this old and on DX11, DX12 was just implemented today.


So what? DX12 doesn't add any new graphical effects, so you pick whichever API is fastest for you; that doesn't mean you have to automatically assume DX12 is best (which BF1 proves, btw). Also, the RX 480 has next to no performance advantage in DX12 mode, so this video is as legit today as it was 2 weeks ago.


----------



## Forceman

Quote:


> Originally Posted by *Glottis*
> 
> so what? DX12 doesn't have any new graphical effects. so you pick whatever API is fastest for you. that doesn't mean you have to automatically assume DX12 is best (which is proven in BF1 btw). also, RX480 has next to no performance advantage in DX12 mode. so this video is completely legit today as it was 2 weeks ago.


That's an important point. If there is no difference in graphics, just use whichever API is faster for your card. It's great for AMD cards that they get a performance boost, but a performance loss for Nvidia is kind of irrelevant if they can just keep using DX11.


----------



## ku4eto

Quote:


> Originally Posted by *johnblake*
> 
> Warning some AMD fanboys may witness some pain after seeing the video and the benchmarks.
> 
> https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/
> 
> https://www.youtube.com/watch?v=pAKCOcSLViw


Lol, no. Actually, nVidia fanboys will cringe. The percentage of performance loss is almost double what AMD's is on that i3.


----------



## NightAntilli

Quote:


> Originally Posted by *Defoler*
> 
> *The game seems to be heavily mapped to AMD hardware, which seems to be the main reason why nvidia are getting such a big hit in performance.*
> *This is one of the issues with DX12.* The developer is the one having to do all the work, and drivers can't fix anything anymore.
> 
> Hopefully EIDOS will work with nvidia on this. If not (if AMD puts their foot down on it), we will see no fix or improvement.
> 
> Also, the game kept crashing for me in DX12. Seems to be buggy as hell. Twice it also froze and reloaded back as DX11.


Except nVidia is not really losing significant performance... Unlike AMD under GameWorks titles...

1080p DX11 vs DX12, based on Guru3D
Titan X: 74 vs 73
GTX 1080: 59 vs 57
GTX 1070: 48 vs 49 (actually gains performance)
980 Ti: 45 vs 43
GTX 980: 35 vs 34
GTX 1060: 35 vs 35
GTX 970: 31 vs 30

Well within the margin of error.
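Those deltas are easy to sanity-check. A quick sketch using just the Guru3D 1080p figures quoted above (the numbers are those averages, nothing more):

```python
# DX11 vs DX12 average FPS at 1080p (Guru3D figures quoted above).
results = {
    "Titan X": (74, 73),
    "GTX 1080": (59, 57),
    "GTX 1070": (48, 49),
    "980 Ti": (45, 43),
    "GTX 980": (35, 34),
    "GTX 1060": (35, 35),
    "GTX 970": (31, 30),
}

# Percent change going from DX11 to DX12; every card lands within ~4-5%,
# i.e. inside a typical run-to-run margin of error.
for card, (dx11, dx12) in results.items():
    delta = 100.0 * (dx12 - dx11) / dx11
    print(f"{card}: {delta:+.1f}%")
```

Only the 1070 moves in the positive direction, and the worst regression (the 980 Ti) is still under 5%.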

As for why the computerbase benchmarks differ from Guru3D... Guru3D uses High, ComputerBase uses Max settings... Maybe that's why?


----------



## Robenger

Quote:


> Originally Posted by *ku4eto*
> 
> Lol, no. Actually nVidia fanboys will cringe. The % of performance loss is almost double of what AMD has on that i3.


ssshhhh, as you can see john lives in his own reality. Please do not disturb him.


----------



## johnblake

Another video to prove my point. In gameplay, DX11>>>Dx12 by far in this game. Stuttering is all over the place for RX 480 and GTX 1060 on DX12.

https://www.youtube.com/watch?v=TtguubFmCwo


----------



## vloeibaarglas

Is Nvidia's mickey-mouse, software-based async scheduler causing this lack of performance increase?


----------



## fewness

this is truly YMMV…


----------



## NightAntilli

Hm... ComputerBase claims;

_The graphics details are always maximized. Only the textures are set one step back to "Very High", and the *contact hardening shadows are turned off.*_

So, Ultra settings for CB, with contact hardening shadows off, while Guru3D has contact hardening shadows on Ultra.

Let's take the Fury X as an example...
Guru3D DX11 High settings 1080p: 78 FPS
Guru3D DX11 High settings 1440p: 53 FPS

Guru3D DX12 High settings 1080p: 85 FPS
Guru3D DX12 High settings 1440p: 56 FPS

Guru3D DX11 Ultra settings 1080p: 60 FPS
Guru3D DX11 Ultra settings 1440p: 41 FPS

CB DX11 Ultra settings 1080p: 67.9 FPS
CB DX11 Ultra settings 1440p: 46.1 FPS

CB DX12 Ultra settings 1080p: 59.1 FPS
CB DX12 Ultra settings 1440p: 43.6 FPS

We have no Ultra quality numbers from Guru3D for DX12. But looking at the jump from High to Ultra under DX11, we have an 18 fps drop at 1080p and a 12 fps drop at 1440p, which is a 23% drop in fps at both resolutions. Assuming a similar drop under DX12, we should be getting 65 fps at 1080p and 43 fps at 1440p Ultra... It would be superior to Guru3D's DX11 benchmarks. But then again, this is not too far off from what CB has reported for their DX12 framerate...
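The extrapolation above can be reproduced in a few lines. A minimal sketch; the assumption (which is the whole point of the estimate) is that the ~23% High-to-Ultra drop measured under DX11 carries over unchanged to DX12:

```python
# Guru3D Fury X averages quoted above.
dx11_high = {"1080p": 78, "1440p": 53}   # DX11, High preset
dx11_ultra = {"1080p": 60, "1440p": 41}  # DX11, Ultra preset
dx12_high = {"1080p": 85, "1440p": 56}   # DX12, High preset

for res in ("1080p", "1440p"):
    # Fractional drop going High -> Ultra under DX11 (~23% at both resolutions).
    drop = 1 - dx11_ultra[res] / dx11_high[res]
    # Assume the same drop applies under DX12.
    predicted = round(dx12_high[res] * (1 - drop))
    print(f"{res}: predicted DX12 Ultra = {predicted} fps")
# prints 65 fps at 1080p and 43 fps at 1440p
```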


----------



## Mahigan

Yet another title which shows AMD gaining under the new APIs and nVIDIA losing performance (or stagnant in terms of performance). The only outlier is the poorly optimized Tomb Raider...

but yeah... I was lying about it all.


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> Another video to prove my point. In gameplay, DX11>>>Dx12 by far in this game. Stuttering is all over the place for RX 480 and GTX 1060 on DX12.
> 
> https://www.youtube.com/watch?v=TtguubFmCwo


Umm...

you need to get your eyes checked mate.


----------



## kx11

It runs really well with DX12 @ 1440p for me: Ultra settings + 2x MSAA = 55 to 87 fps, average 65 fps.

At 4K with lower settings and no AA at all it drops into the 30s and 40s, max 64 fps.

But there's stuttering.


----------



## johnblake

Quote:


> Originally Posted by *Mahigan*
> 
> Yet another title which shows AMD gaining under the new APIs and nVIDIA losing performance (or stagnant in terms of performance). The only outlier is the poorly optimized Tomb Raider...
> 
> but yeah... I was lying about it all.


Yes, another game which is a turd and gives nothing to Square Enix; yes, another Gaming Evolved title which is a disaster on PC; and yes, this is another one.

Please visit the nearest doctor for an eye checkup, then watch the video again, and I mean the full video, not AMD-rigged benchmarks.
https://www.youtube.com/watch?v=TtguubFmCwo


----------



## Silent Scone

Anything below 60 is terrible as far as I'm concerned


----------



## looniam

Every time I think of leaving this forum... some of you give me hope.


----------



## Robenger

Quote:


> Originally Posted by *huzzug*
> 
> Isn't that the limit at which you turn your monitor from portrait to landscape. Or
> 
> You turn around and show your back to pc gaming for ever. Or
> 
> You move your drink to the left hand & free your right hand for other important stuffs


It actually requires dual monitors otherwise you're only screwing yourself.


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> Yes, another game which is a turd and gives nothing to Square Enix; yes, another Gaming Evolved title which is a disaster on PC; and yes, this is another one.
> 
> Please visit the nearest doctor for an eye checkup, then watch the video again, and I mean the full video, not AMD-rigged benchmarks.
> https://www.youtube.com/watch?v=TtguubFmCwo


So Guru3D are now rigged benchmarks?

http://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,5.html

Riiiiiight









I see no stuttering for the RX 480 in that video. I also do not see any stuttering in the frame times captured by Guru3D.


----------



## huzzug

Quote:


> Originally Posted by *Robenger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> Isn't that the limit at which you turn your monitor from portrait to landscape. Or
> 
> You turn around and show your back to pc gaming for ever. Or
> 
> You move your drink to the left hand & free your right hand for other important stuffs
> 
> 
> 
> It actually requires dual monitors otherwise you're only screwing yourself.

So that's what mGPU system users experience before game drivers


----------



## killerhz

Quote:


> Originally Posted by *boredgunner*
> 
> In all fairness, it is a bit early to be designing games from the ground up in DX12 or Vulkan. Especially AAA games that are going to be played by everyone, with all kinds of hardware setups including older operating systems and GPUs not capable of DX12 or Vulkan.
> 
> There is no good excuse for the general poor optimization and instability though.


Yeah, guess you're right. I agree with your last statement now, after reading your response rather than mine...


----------



## PontiacGTX

No review includes first-generation GCN GPUs; probably the devs aren't enabling those cards for DX12, like in Rise of the Tomb Raider and Total War: Warhammer.
Quote:


> Originally Posted by *KyadCK*
> 
> Wow that's a brutal comment. I will admit to chuckling.
> 
> Geeze, the 470 is beating the 1060, and 480 between the 980TI and 1070...
> 
> nVidia needs better DX12 drivers. AMD got an alright boost, but they're winning by so much because nVidia's performance tanked hard.
> 
> EDIT: AMD launched DX12 drivers for this game. Has nVidia done so yet?


In ComputerBase's test the RX 480 loses performance with DX12, while Guru3D shows an improvement on High quality, and another site shows the RX 480 under DX11 sitting between the 390 and the Fury X.
Quote:


> Originally Posted by *Marios145*
> 
> 290/390 HUGE gains compared to the rest at 1080p


For some reason R9 Fury / Nano gains are smaller


----------



## fewness

Hey all, please help! My DX12 is still greyed out after updating. Is the game trying to protect me from downgrade disappointment?


----------



## Mahigan

Quote:


> Originally Posted by *PontiacGTX*
> 
> No review includes first-generation GCN GPUs; probably the devs aren't enabling those cards for DX12, like in Rise of the Tomb Raider and Total War: Warhammer.
> In ComputerBase's test the RX 480 loses performance with DX12, while Guru3D shows an improvement on High quality
> For some reason R9 Fury / Nano gains are smaller


Yep... GCN2 has much higher CPU overhead than both GCN3 and GCN4. This, of course, leads to higher DX12 gains for GCN2.


----------



## Olivon

Quote:


> Originally Posted by *Mahigan*
> 
> Yet another title which shows AMD gaining under the new APIs and nVIDIA losing performance (or stagnant in terms of performance). The only outlier is the poorly optimized Tomb Raider...
> 
> but yeah... I was lying about it all.


Guru3D uses a non-playable benchmark and ComputerBase uses a normal gaming scene.
The difference between push-button reviewers and real professionals.
Quote:


> The test scene in Deus Ex: Mankind Divided is set in Prague. Prague is the largest city in the game, and the player travels there often. Overclockers explicitly decided against the built-in benchmark, *because it does not correspond to a real game scene*


Quote:


> DirectX 12 in Deus Ex: Mankind Divided is, after the first patch, a huge disappointment on systems with Intel CPUs. That the real strength of DirectX 12, significantly reducing the CPU limit, is of all things the major weakness in this stealth game is a farce. *Another critical point in this context is that the built-in benchmark seems to work far better under DirectX 12 than Prague, by far the largest region in Deus Ex: Mankind Divided*.


And regarding the Randomdude video, check the 4:18 mark.

DX12 brings no new effects for the moment, so it's all about performance; the problem for AMD is that nVidia has superior performance and doesn't really need DX12 as much as AMD does.
DX12 is not a game changer for the moment, and I believe it will take time for it to become really relevant.


----------



## PontiacGTX

Quote:


> Originally Posted by *Mahigan*
> 
> Yep... GCN2 has much higher CPU overhead than both GCN3 and GCN4. This, of course, leads to higher DX12 gains for GCN2.


Any reason why it differs? I mean, Tonga/Fury X didn't change much architecturally, except for some FP16 compute support.


----------



## daviejams

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> Hey all, please help! My DX12 is still greyed out after updating. Is the game trying to protect me from downgrade disappointment?


Try opting into the beta on Steam.


----------



## kx11

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> Hey all, please help! My DX12 is still greyed out after updating. Is the game trying to protect me from downgrade disappointment?


Go into the game's properties in Steam, click the Betas tab, then choose DX12.


----------



## fewness

OK. + rep for both of you!


----------



## octiny

Wow, and that's with the 16.8.2 drivers.

The new 16.9.1 drivers that were released today gave me an additional 5-10% increase across all DX11/DX12 games @ 4K.


----------



## BradleyW

*Deus Ex: Mankind Divided v1.4 build 545.4*

*Configuration:*



Spoiler: Warning: Spoiler!



3930K 4.5Ghz HT
4x4GB DDR3 2400MHz
AMD 290X 4GB
Win 10 64-bit
RSC 16.9.1



*Settings:*



Spoiler: Warning: Spoiler!








*Results:*

*D3D11 (Location 1)*


Spoiler: Warning: Spoiler!






*D3D12 (Location 1)*


Spoiler: Warning: Spoiler!






*D3D11 (Location 2)*


Spoiler: Warning: Spoiler!






*D3D12 (Location 2)*


Spoiler: Warning: Spoiler!






*D3D11 (Location 3)*


Spoiler: Warning: Spoiler!






*D3D12 (Location 3)*


Spoiler: Warning: Spoiler!






*D3D11 (Location 4)*


Spoiler: Warning: Spoiler!






*D3D12 (Location 4)*


Spoiler: Warning: Spoiler!







*Conclusion:*

D3D12 has given a fair boost, however there are two reasons why I will continue to play in D3D11 mode!
1) Loading times have increased from 10 seconds to 2 minutes with DX12.
2) The game stutters very heavily, to the point where it's unplayable as Jensen moves through the streets of Prague with DX12.
VRAM usage was around 3500MB, so that's not an issue.


----------



## GorillaSceptre

@BradleyW

Great post as usual.









Rep+


----------



## LongtimeLurker

Quote:


> Originally Posted by *Mahigan*
> 
> Yet another title which shows AMD gaining under the new APIs and nVIDIA losing performance (or stagnant in terms of performance). The only outlier is the poorly optimized Tomb Raider...
> 
> but yeah... I was lying about it all.


Umm...no:



http://wccftech.com/nvidia-amd-deus-mankind-divided-dx12-benchmarks-performs-poorly-geforce/
Quote:


> There is a known bug that causes some very high end cards to regress relative to DirectX 11. This bug is being addressed by the development team.
> 
> So for the meantime we recommend all Nvidia users stick to DirectX 11 to avoid any performance loss


Well there you go...


----------



## Mahigan

Quote:


> Originally Posted by *LongtimeLurker*
> 
> Umm...no:
> 
> 
> 
> http://wccftech.com/nvidia-amd-deus-mankind-divided-dx12-benchmarks-performs-poorly-geforce/
> Well there you go...


Ah yeah.... the same "issue" which plagues nearly all DX12 titles eh?

I guess we have to wait for an nVIDIA Async driver for this one as well?


----------



## LongtimeLurker

Quote:


> Originally Posted by *Mahigan*
> 
> Ah yeah.... the same "issue" which plagues nearly all DX12 titles eh?
> 
> I guess we have to wait for an nVIDIA Async driver for this one as well?


It's not an Nvidia issue. It's the developers of the game's fault (EIDOS).

Considering this is an AMD Gaming Evolved title, you know full well they have not gone out of their way to optimize it for Nvidia cards. I'm surprised they are even going to fix it.

I'm not saying Nvidia doesn't do the same thing with their GameWorks titles, they do.

But to use this as an example of why AMD cards are better...your bias is showing.


----------



## FLCLimax

lmao.


----------



## NightAntilli

People still think GameWorks and Gaming Evolved are equivalent in terms of locking the competitor out...


----------



## Mahigan

Quote:


> Originally Posted by *LongtimeLurker*
> 
> It's not an Nvidia issue. It's the developers of the game's fault (EIDOS).
> 
> Considering this is an AMD Gaming Evolved title, you know full well they have not gone out of their way to optimize it for Nvidia cards. I'm surprised they are even going to fix it.
> 
> I'm not saying Nvidia doesn't do the same thing with their GameWorks titles, they do.
> 
> But to use this as an example of why AMD cards are better...your bias is showing.


Ahh... so we're going to re-hash claims that have already been debunked again?

Ironically... there is no bias showing on my part here. nVIDIA worked with the developers for Deus Ex. This was stated numerous times during the various developer's conferences. Deus Ex uses GPUOpen, the code for any features found in the game is thus available to nVIDIA in order to optimize for their own architectures (and this was done btw).

What AMD have is a marketing agreement with EIDOS/Square Enix for this title.

The difference with gameworks is that the code is not available to AMD. The code is strictly optimized for nVIDIA architectures under Gameworks.

We have gone through this several times and developers have discussed this issue at great lengths. Your refusal to accept this information does not show "my bias" but rather yours mate.

When AMD work with developers, their first task is to encourage multi-threaded game engines. We know why this is the case. AMD has not invested in a multi-threaded DX11 driver, as a result, AMDs GCN 1,2 and 3 architectures hammer the primary CPU thread hard. If a game engine runs its complex simulations on that same thread (as gameworks titles do) then you're going to end up with bad performance for AMD architecture in lower resolutions. BradleyW has covered this in great length.

The Fury-X is actually a great GPU. Hardware wise... it is a superb GPU. AMDs DX11 driver, however, lacks the ability to ensure that the GPU is properly fed.


----------



## LongtimeLurker

Quote:


> Originally Posted by *Mahigan*
> 
> Ahh... so we're going to re-hash claims that have already been debunked again?
> 
> Ironically... there is no bias showing on my part here. nVIDIA worked with the developers for Deus Ex. This was stated numerous times during the various developer's conferences. Deus Ex uses GPUOpen, the code for any features found in the game is thus available to nVIDIA in order to optimize for their own architectures (and this was done btw).
> 
> What AMD have is a marketing agreement with EIDOS/Square Enix for this title.
> 
> The difference with gameworks is that the code is not available to AMD. The code is strictly optimized for nVIDIA architectures under Gameworks.
> 
> We have gone through this several times and developers have discussed this issue at great lengths. Your refusal to accept this information does not show "my bias" but rather yours mate.
> 
> When AMD work with developers, their first task is to encourage multi-threaded game engines. We know why this is the case. AMD has not invested in a multi-threaded DX11 driver, as a result, AMDs GCN 1,2 and 3 architectures hammer the primary CPU thread hard. If a game engine runs its complex simulations on that same thread (as gameworks titles do) then you're going to end up with bad performance for AMD architecture in lower resolutions. BradleyW has covered this in great length.
> 
> The Fury-X is actually a great GPU. Hardware wise... it is a superb GPU. AMDs DX11 driver, however, lacks the ability to ensure that the GPU is properly fed.


And yet mysteriously, there's a bug that causes Nvidia to lose 30 fps in some instances. THIRTY. Think about that number for a minute.



The Fury X is a great GPU. It was overpriced, but now at the $350 mark they're a good buy. I am picking up an RX 480 tomorrow after I get paid because I want more than 4GB (and it's the only one I can afford right now). Going to be eating Top Ramen for a week to get it, but it's worth it to me to upgrade from my HD 6950.


----------



## Nameless1988

Quote:


> Originally Posted by *johnblake*
> 
> Another video to prove my point. In gameplay, DX11>>>Dx12 by far in this game. Stuttering is all over the place for RX 480 and GTX 1060 on DX12.
> 
> https://www.youtube.com/watch?v=TtguubFmCwo


Yes! I have stuttering with DX12 too, but the DX12 benchmark runs about 5 fps (average) better than the DX11 benchmark.

For sure they will solve the problem in the next updates.


----------



## BradleyW

Quote:


> Originally Posted by *Nameless1988*
> 
> Yes! I have stuttering with DX12 too, but the DX12 benchmark runs about 5 fps (average) better than the DX11 benchmark.
> For sure they will solve the problem in the next updates.


How bad is your stutter in D3D12? My stutter is pretty much terrible when running through streets. DX11 feels a lot smoother in this respect.


----------



## Nameless1988

Yes, heavy stuttering. With DX11 the game runs smooth and rarely stutters (I have FreeSync, which helps very much).
Another 3-4 weeks of patches and we will play it smooth.


----------



## ZealotKi11er

This guy is getting much lower DX12 performance on both AMD and Nvidia.


----------



## BradleyW

Quote:


> Originally Posted by *Nameless1988*
> 
> Yes, heavy stuttering. With DX11 the game runs smooth and rarely stutters (I have FreeSync, which helps very much).
> Another 3-4 weeks of patches and we will play it smooth.


Well I'm glad I am not the only one.









I wonder if RX480 users have the same stutter?


----------



## Rndomuser

Well... I would really like to try out the DX12 mode, but unfortunately the game simply crashes after downloading the "dx12_preview" beta patch on Steam, selecting the "DirectX 12" option in the launcher and trying to run the game. This is using the latest Win10 372.70 drivers on my GTX 980. Seems like Nixxes Software is not doing a good job optimizing the PC port of this console game... Or maybe it's Eidos' fault. In any case, guess I'll just wait for the final version.


----------



## end0rphine

Quote:


> Originally Posted by *Rndomuser*
> 
> Well... I would really like to try out the DX12 mode, but unfortunately the game simply crashes after downloading the "dx12_preview" beta patch on Steam, selecting the "DirectX 12" option in the launcher and trying to run the game. This is using the latest Win10 372.70 drivers on my GTX 980. Seems like Nixxes Software is not doing a good job optimizing the PC port of this console game... Or maybe it's Eidos' fault. In any case, guess I'll just wait for the final version.


Same thing happens to me on my AMD 290. Select DX12 in options --> Start game --> Instant crash. This is using the latest September Catalyst drivers.


----------



## Nameless1988

Guys, be patient; this is a DX12 beta patch. Wait another 2-3 weeks for further updates to get proper DX12 performance and overall better performance in both DX11 and DX12.


----------



## johnblake

Quote:


> Originally Posted by *Mahigan*
> 
> So Guru3D are now rigged benchmarks?
> 
> http://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,5.html
> 
> Riiiiiight
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see no stuttering for the RX 480 in that video. I also do not see any stuttering in the frame times captured by Guru3D.


I have never seen such an AMD-biased person as you. Did you know that he benchmarked on High settings and used the built-in benchmark, which has nothing, not even 0%, to do with gameplay? Did you know that?


----------



## Aussiejuggalo

Just tried DX12 with the latest AMD drivers; got me about 10-15 more frames, but dear god, the cap whine from my 290 is deafening







.

Will be doing a full DX12 run through eventually. Already finished the game twice.


----------



## BradleyW

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Just tried DX12 with the latest AMD drivers; got me about 10-15 more frames, but dear god, the cap whine from my 290 is deafening
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Will be doing a full DX12 run through eventually. Already finished the game twice.


Do you not see any stuttering in DX12 mode when running through the streets?


----------



## gamervivek

Quote:


> Originally Posted by *johnblake*
> 
> Warning some AMD fanboys may witness some pain after seeing the video and the benchmarks.
> 
> https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/
> 
> https://www.youtube.com/watch?v=pAKCOcSLViw


The video shows ~10% improvement for dx12 at the very start and looks to be GPU limited for the rest of the time. CB results look unusual.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *BradleyW*
> 
> Do you not see any stuttering in DX12 mode when running through the streets?


Haven't tested properly yet, just loaded my last save at the end of the game. Will test properly in the next few days.


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> I have never seen such an AMD-biased person as you. Did you know that he benchmarked on High settings and used the built-in benchmark, which has nothing, not even 0%, to do with gameplay? Did you know that?


I've been playing the game, as have many users here; everyone is getting better frames under DX12. Some folks are getting stuttering, others are not.

Those results from CB look like BS to me.

This isn't being biased... it's the damn truth... even Joker was getting better frames under DX12.

If anyone is denying reality and being partisan like crazy it's you mate.

Let's try this... Forum folks... raise your hands...

1. Aside from stuttering... who is getting worse frames per second under DX12 vs DX11?

2. And to those getting stuttering... is the game installed on an HDD or SSD?

I am not getting any stuttering in Prague, but many users have reported this issue. I am also getting a healthy FPS boost. Same behavior I experienced with BF1.

EDIT: Game just crashed on me twice in a row while testing Prague. I also noticed that my RAM usage and Graphics memory usage has skyrocketed. The game is definitely swapping data from system memory and my SSD. This is likely the culprit for some folks stuttering (as has been the case in the past on some systems with other DX12 titles such as Rise of the Tomb Raider.). It's in beta so to be expected though.


----------



## johnblake

Quote:


> Originally Posted by *Mahigan*
> 
> I've been playing the game, as have many users here; everyone is getting better frames under DX12. Some folks are getting stuttering, others are not.
> 
> Those results from CB look like BS to me.
> 
> This isn't being biased... it's the damn truth... even Joker was getting better frames under DX12.
> 
> If anyone is denying reality and being partisan like crazy it's you mate.
> 
> Let's try this... aside from stuttering... who is getting worse frames per second under DX12 vs DX11?
> 
> And to those getting stuttering... is the game installed on an HDD or SSD?
> 
> I am not getting any stuttering in prague, but many users have reported this issue. I am also getting a healthy FPS boost as well. Same behavior I experienced with BF1.
> 
> EDIT: Game just crashed on me twice in a row while testing Prague. I also noticed that my RAM usage and Graphics memory usage has skyrocketed. The game is definitely swapping data from system memory and my SSD. This is likely the culprit for some folks stuttering (as has been the case in the past on some systems with other DX12 titles such as Rise of the Tomb Raider.). It's in beta so to be expected though.


Yes everyone.

https://www.youtube.com/watch?v=pAKCOcSLViw
https://www.youtube.com/watch?v=TtguubFmCwo
https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/

Please watch Joker's video one more time.
https://www.youtube.com/watch?v=TtguubFmCwo


----------



## BiG StroOnZ

Guru3D Benchmarks are up for DX12

Looking really good for the red team:







http://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,9.html


----------



## Tgrove

Rip 980 ti


----------



## Olivon

Quote:


> Originally Posted by *Mahigan*
> 
> Those results from CB look like BS to me.


There are plenty of examples showing the game runs badly in Prague on AMD DX12 (i.e. real gaming life, not a crappy integrated benchmark).
The Randomdude videos show a massive stutter that is not present in DX11, and not on NV DX12 either.
You've spammed this forum with false information all the time, and I see you're still at it. Just watch the Randomdude videos and stop accusing good journalists who do their work, unlike push-button reviewers like Guru3D who don't help real gamers at all by using this non-representative integrated benchmark.



I understand that these kinds of results are not really good for the DX12 AMD marketing evangelist that you are, but it's just the truth and you've got to deal with it. Stop the denial and respect real reviewers, not just push-button ones.


----------



## Silent Scone

Not to mention it's a beta anyway. Anyone with an AMD card getting excited over the game's performance at the moment as a comparative is reassuring nobody but themselves, lol. I've turned it off for the time being, until the next driver or build update.


----------



## Mahigan

Quote:


> Originally Posted by *Olivon*
> 
> There are plenty of examples showing the game runs badly in Prague on AMD DX12 (i.e. real gaming life, not a crappy integrated benchmark).
> The Randomdude videos show a massive stutter that is not present in DX11, and not on NV DX12 either.
> You've spammed this forum with false information all the time, and I see you're still at it. Just watch the Randomdude videos and stop accusing good journalists who do their work, unlike push-button reviewers like Guru3D who don't help real gamers at all by using this non-representative integrated benchmark.


You're being disingenuous bud. I haven't spammed anything. I've barely posted about Deus Ex (check my post history).

The stutter doesn't affect the FPS. Bradley is getting the stutter while getting better FPS under DX12. This seems to be a complaint from many folks right now (as I mentioned in my last post). Not everyone is having this issue. Some folks are getting better FPS under DX12 with their GTX 1080s while others are getting lower FPS on the same map. This points to another issue in the system or maybe in the software configuration being used. We have yet to pinpoint the issue.

I had two game crashes in a row so far. FPS-wise I am getting anywhere from 7 to 10 FPS higher under DX12. The odd part is that I am not getting the stuttering that Bradley and some others have mentioned. I am thinking that it may have to do with the AMD drivers, and more specifically the way resources are being paged into system RAM. All AMD 4GB GPUs (R9 290 and up) utilize AMD's memory optimizations. AMD have dedicated two engineers to the task of VRAM optimizations. AMD aren't reducing VRAM consumption; what they are doing is managing which resources are placed in VRAM and which resources are paged out to system RAM. AMD are also using a shader cache which utilizes your SSD/HDD. One of these two features could be causing issues with certain configurations, which is why not everyone is getting the issues. What Bradley did was check the VRAM usage, and it showed 3.5GB, but that 3.5GB is only what is being used in the VRAM frame buffer; if you look at your system RAM usage, you will see that it is much higher under DX12 than DX11 with 4GB AMD GPUs.

Two things can affect this: latency and bandwidth, as well as the amount of available system RAM. Another thing is your pagefile. I have disabled my Windows pagefile altogether; I have 32GB of RAM, therefore I do not need one. If you have a system pagefile, then a portion of the data will reside on your SSD/HDD, which will cause streaming issues when the AMD driver requests a transfer from system RAM to VRAM. This causes a stutter when the transfer is initiated.

We don't all configure our systems the same way which could explain some of the issues we're seeing (for AMD and nVIDIA). Some folks with GTX 1080s are getting really good performance, and no stuttering, while others are getting stuttering. Why? Probably differences in the hardware and software configurations between the two systems (that's an educated guess).










Oh and BradleyW's Post here: http://www.overclock.net/t/1610954/amd-radeon-crimson-16-9-1-8th-sep-2016/20#post_25502936

He has shots comparing DX11 and DX12. Notice something? His DX12 performance is consistently higher than his DX11 performance. So, as you can see, CB's results are not indicative of what we're experiencing (myself or him as well as many others who have posted here thus far).


----------



## Olivon

Quote:


> Originally Posted by *Mahigan*
> 
> The stutter doesn't affect the FPS.






4:15
The RX 480 shows a 31.8 fps low and this doesn't occur in DX11




The problem doesn't occur with the 1060 too.
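A dip like that is easier to reason about in frame times, which is what stutter actually is. A small sketch; the 31.8 fps figure is the low quoted above, and the 60 fps baseline is just a point of comparison:

```python
def frame_time_ms(fps: float) -> float:
    """Average time per frame, in milliseconds, for a given FPS figure."""
    return 1000.0 / fps

# The reported RX 480 low: a single frame taking roughly twice as long...
print(f"{frame_time_ms(31.8):.1f} ms")  # prints 31.4 ms
# ...as a frame at a steady 60 fps, which is exactly what reads as a stutter.
print(f"{frame_time_ms(60):.1f} ms")    # prints 16.7 ms
```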


----------



## Mahigan

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> 
> 
> 4:15
> The RX 480 shows a 31.8 fps low and this doesn't occur in DX11
> 
> 
> 
> 
> The problem doesn't occur with the 1060 too.


It doesn't for us, the folks posting our results here. I can't speak for some random dude on YouTube whom I do not know. And I couldn't help but notice that the RX 480 is consistently performing better than the GTX 1060 in your first video?

Also, notice how the RX 480 is using ~400MB more of System RAM.







That's because the feature I mentioned in my previous post is enabled even on 8GB VRAM graphics cards. HardOCP ran into the same thing when benching ROTTR on an R9 390X and Fury X back in February 2016.


Quote:


> With the AMD Radeon R9 390X we only ever saw the dynamic VRAM hit 230MB usage; it was averaging 200MB through every resolution and AA setting. In terms of the dedicated VRAM it strangely did not consume anywhere near what the GTX 980 Ti or TITAN X did. Possibly due to AMD having more advanced texture compression algorithms being taken advantage of? At most we saw it rise up to 4.2GB of usage at 4K. Turning on SSAA hardly made it rise at all. We know SSAA was working though, the performance certainly showed it. We really cannot explain the VRAM behavior of the AMD Radeon R9 390X.
> 
> The AMD Radeon R9 Fury X VRAM behavior does make sense though if you look toward the dynamic VRAM. It seems the onboard dedicated VRAM was mostly pegged at or near its 4GB of capacity. Then it seems the video card is able to shift its memory load out to system RAM, by as much as almost 4GB at 4K with 4X SSAA. If you combine the dynamic VRAM plus the on board 4GB of VRAM the numbers come out to equal numbers much higher than the AMD Radeon R9 390X and closer to what the GeForce GTX 980 Ti achieved in terms of dedicated VRAM.


Source: http://www.hardocp.com/article/2016/02/29/rise_tomb_raider_graphics_features_performance/13#.V9Jc05grIwE

That's what I am talking about. You latched onto the first sentence and didn't even touch the rest of the post. I wonder why...


----------



## Olivon

Quote:


> Originally Posted by *Mahigan*
> 
> That's what I am talking about. You latched onto the first sentence and didn't even touch the rest of the post. I wonder why...


Maybe because I don't care about your unfair propaganda, prefer to show facts, and don't want to comment on unprovable blabla.


----------



## Mahigan

Quote:


> Originally Posted by *Olivon*
> 
> Maybe because *I don't care about your unfair propaganda and prefer to show facts* and don't want to comment on unprovable blabla.


HardOCP:
Quote:


> VRAM Usage
> 
> VRAM utilization was one of the more interesting aspects of Rise of the Tomb Raider. It appears to be very different depending on what GPU you are running. There is no question this game can consume large amounts of VRAM at the right settings. With the GeForce GTX 980 Ti we maxed out its 6GB of VRAM just at 1440p with No AA and maximum settings. The TITAN X revealed that it was actually using up to 7GB of VRAM for that setting. In fact, when we pushed the TITAN X up to 4K with SSAA it used up to almost 10GB of VRAM.
> 
> However, for the AMD Radeon R9 390X utilization was a bit odd when you first see it, never exceeding 4.2GB and remaining "flat" with "Very High" textures and SSAA. We did see the proper decrease in VRAM using lower settings, but the behavior was odd indeed. This didn't seem to negatively impact the video card however. The VRAM is simply managed differently with the Radeon R9 300 series.
> 
> The AMD Radeon R9 Fury X kind of backs that statement up since it was able to allocate dynamic VRAM for extra VRAM past its 4GB of dedicated VRAM capacity. We saw up to a 4GB utilization of dynamic VRAM. That allowed the Fury X to keep its 4GB of dedicated VRAM maxed out and then use system RAM for extra storage. *In our testing, this did not appear to negatively impact performance. At least we didn't notice anything in terms of choppy framerates or "micro-stutter."* The Fury X seems to be using the dynamic VRAM as a cache rather than a direct pool of instant VRAM. This would make sense since it did not cause a performance drain and obviously system RAM is a lot slower than local HBM on the Fury X. *If you remember a good while ago that AMD was making claims to this effect, but this is the first time we have actually been able to show results in real world gaming. It is awesome to see some actual validation of these statements a year later. This is what AMD said about this in June of 2015*.
> 
> Note that HBM and GDDR5 memory sized can't be directly compared. Think of it like comparing an SSD's capacity to a mechanical hard drive's capacity. As long as both capacities are sufficient to hold local data sets, much higher performance can be achieved with HBM, and AMD is hand tuning games to ensure that 4GB will not hold back Fiji's performance. Note that the graphics driver controls memory allocation, so its incorrect to assume that Game X needs Memory Y. Memory compression, buffer allocations, and caching architectures all impact a game's memory footprint, and we are tuning to ensure 4GB will always be sufficient for 4K gaming. Main point being that HBM can be thought of as a giant embedded cache, and is not directly comparable to GDDR5 sizes.
> 
> Now specifically that statement backs up "4K gaming" and we will give AMD (and NVDIA for that matter) a pass at this moment as neither produce "4K gaming" GPUs. The important statement here is this, "AMD is hand tuning games to ensure that 4GB will not hold back Fiji's performance." We have said over and over again that this statement by AMD did not ring true in terms of needed HBM capacity, and this is the actually the first time we have seen AMD's statement make sense to us in real world gaming with Triple A titles. So kudos to AMD in being able to show us that this statement has come to fruition finally. The downside we see to this statement is AMD will have to "hand tune" games in order to make this work. What games will get hand tuned in the future? That said, AMD seems to have done an excellent job hand tuning Rise of the Tomb Raider for its HBM architecture.


Source: http://www.hardocp.com/article/2016/02/29/rise_tomb_raider_graphics_features_performance/14#.V9Jd3pgrIwE

Anandtech:
Quote:


> Throwing an extra wrench into things is that PCs have more going on than just console games. PC gamers buying high-end cards like the R9 Fury X will be running at resolutions greater than 1080p and likely with higher quality settings than the console equivalent, driving up the VRAM requirements. The Windows Desktop Window Manager responsible for rendering and compositing the different windows together in 3D space consumes its own VRAM as well. So the current PC situation pushes VRAM requirements higher still.
> 
> The reality of the situation is that AMD knows where they stand. 4GB is the most they can equip Fiji with, so it's what they will have to make-do with until HBM2 comes along with greater densities. In the meantime the marketing side of AMD needs to convince potential buyers that 4GB is enough, and the technical side of AMD needs to come up with other solutions to help mitigate the problem.
> 
> On the latter point, while AMD can't do anything about the amount of VRAM they have, they can and are working on doing a better job of using it. AMD has been rather straightforward in admitting that up until now they've never seriously dedicated resources to VRAM management on their cards, as they've always had enough VRAM that they have never considered it an issue. Until Fiji there was always enough VRAM.
> 
> Which is why for Fiji, AMD tells us they have dedicated two engineers to the task of VRAM optimizations. To be clear here, there's little AMD can to do reduce VRAM consumption, but what they can do is better manage what resources are placed in VRAM and what resources are paged out to system RAM. Even this optimization can't completely resolve the 4GB issue, but it can help up to a point. So long as game isn't actively trying to use all 4GB of resources at once, then intelligent paging can help ensure that only the resources that are actively in use reside in VRAM and therefore are immediately available to the GPU when requested.
> 
> As for the overall utility of this kind of optimization, it's going to depend on a number of factors, including the OS, the game's own resource management, and ultimately the real working set needs of a game. The situation AMD faces right now is one where they have to simultaneously fight an OS/driver paradigm that wastes memory, and the games that will be running on their GPUs traditionally treat VRAM like it's going out of style. The limitations of DirectX 11/WDDM 1.x prevent full reuse of certain types of assets by developers, and all the while it's extremely common for games to claim much (if not all) available VRAM for their own use with the intent of ensuring they have enough VRAM for future use, and otherwise caching as many resources as possible for better performance.
> 
> The good news here is that the current situation leaves overhead that AMD can optimize around. AMD has been creating both generic and game-specific memory optimizations in order to better manage VRAM usage and what resources are held in local VRAM versus paging out to system memory. By controlling duplicate resources and clamping down on overzealous caching by games, it is possible to get more mileage out of the 4GB of VRAM AMD has.


Source: http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/7
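The "intelligent paging" Anandtech describes boils down to a residency policy: keep the actively used resources in VRAM and spill the least recently used ones out to system RAM. A toy sketch of that idea (purely illustrative, with made-up resource names; not AMD's actual driver logic):

```python
from collections import OrderedDict

class VramPager:
    """Toy model of an LRU residency policy: a fixed-size VRAM pool where the
    least recently used resources get paged out to system RAM when space runs out."""

    def __init__(self, vram_capacity_mb):
        self.capacity = vram_capacity_mb
        self.vram = OrderedDict()   # name -> size in MB, most recently used at the end
        self.system_ram = {}        # resources currently paged out

    def touch(self, name, size_mb):
        """Request a resource for rendering, evicting LRU residents if needed."""
        if name in self.vram:
            self.vram.move_to_end(name)     # already resident: just mark as MRU
            return "hit"
        self.system_ram.pop(name, None)     # promote it back if it was paged out
        while self.vram and sum(self.vram.values()) + size_mb > self.capacity:
            lru_name, lru_size = self.vram.popitem(last=False)  # evict the LRU item
            self.system_ram[lru_name] = lru_size
        self.vram[name] = size_mb
        return "miss"

pager = VramPager(vram_capacity_mb=4096)    # e.g. a 4GB R9 290 / Fury class card
pager.touch("prague_textures", 3000)
pager.touch("shadow_maps", 800)
pager.touch("npc_textures", 500)            # over budget: "prague_textures" pages out
```

As long as the game's working set at any instant fits in 4GB, a scheme like this keeps the hot resources local, which is exactly the behavior HardOCP observed on the Fury X.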

It ain't propaganda bud








That info I posted was all fact. Whether that is the issue or not, though, is merely an educated guess. I mean, both titles are from Square Enix and both seem to suffer from similar issues.

And guess what? HardOCP did not get stuttering, but many users did in ROTTR due to this feature. The difference? System configurations, software and hardware. See here: http://bfy.tw/7c8i


----------



## PlugSeven

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Guru3D Benchmarks are up for DX12
> 
> Looking really good for the red team:
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,9.html


You can see why the Titan X came out in a hurry and why they're pushing hard with Volta. There are only so many times you can bench older DX11 titles and be shown to be faster with any sort of relevance.


----------



## johnblake

Quote:


> Originally Posted by *PlugSeven*
> 
> You can see why the Titan X came out in a hurry and why they're pushing hard with Volta. There are only so many times you can bench older DX11 titles and be shown to be faster with any sort of relevance.


Yes it could not have come sooner.
https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/


----------



## octiny

Runs great for me in DX12, MUCH bigger gains versus my 1080 SLI system









I expect more gains to be had as things get optimized on the AMD side.


----------



## Defoler

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Makes no sense.
> 
> Guru3D is reporting the contrary.


TBH, Guru3D also tested on the 5960X, which is completely different from the 6700K.
I still wonder about the implications of a different CPU on the game and on DX12.

If, for example, the game leans more heavily on driver software for nVidia than for AMD, how does a higher-IPC, higher-clocked CPU with fewer cores run the game?


----------



## BradleyW

I am seeing a boost as high as 12 FPS and as low as 6 FPS in D3D12 mode on Ultra - 2560x1080. *However the stuttering/lag when running through the streets is terrible!*


----------



## zealord

R9 390X on par with GTX 1080 @ FullHD and WQHD DX12?

lol


----------



## jonathan123456789

I can't access the game at all in directx 12, it tries to load and crashes every time.....? Anyone else getting this?


----------



## BradleyW

Quote:


> Originally Posted by *jonathan123456789*
> 
> I can't access the game at all in directx 12, it tries to load and crashes every time.....? Anyone else getting this?


Your sig rig says you are using Windows 7. If true, you need Windows 10 for DirectX 12 to operate.
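If you're not sure what your machine actually reports, one quick sanity check (an illustrative sketch, not an official tool) is whether the OS can even load the D3D12 runtime library:

```python
import ctypes

def has_d3d12_runtime():
    """Return True if d3d12.dll can be loaded (the DX12 runtime ships with Windows 10+)."""
    try:
        ctypes.WinDLL("d3d12")
        return True
    except (OSError, AttributeError):
        # OSError: Windows without the D3D12 runtime (e.g. Windows 7/8)
        # AttributeError: not running on Windows at all
        return False

print("DX12 runtime available:", has_d3d12_runtime())
```

If this prints False, the game crashing in DX12 mode is expected; the renderer has no runtime to initialize against.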


----------



## jonathan123456789

My bad... I'm using Windows 10, I just haven't updated my sig. I'll sort it now!


----------



## Dygaza

Quote:


> Originally Posted by *jonathan123456789*
> 
> I can't access the game at all in directx 12, it tries to load and crashes every time.....? Anyone else getting this?


I had the same problem. The game always crashed at the point where AMD's Gaming Evolved splash would normally show up. For me the cause was Afterburner's OSD. If I made sure the OSD was disabled during startup, it would work fine.

Check that if you have afterburner running, or any other 3rd party overlays.


----------



## GorillaSceptre

^
Yeah, check that first.


----------



## jonathan123456789

I'm using EVGA Precision; as far as I can see, the OSD setting is not activated. Any other ideas? It works fine in DirectX 11.


----------



## Nameless1988

I am able to run the game in DX12 with Afterburner's OSD overlay; it has never crashed on my rig.


----------



## Semel

Quote:


> Originally Posted by *jonathan123456789*
> 
> I can't access the game at all in directx 12, it tries to load and crashes every time.....? Anyone else getting this?


Reduce texture quality to very high. It always crashed on my rig when I had it set to ultra.


----------



## Dygaza

Here is what Unwinder said on Guru3d afterburner section:
Quote:


> If I were you I'd check twice because I can reproduce it as soon as Plays.tv service from AMD Gaming Evolved bundle is running.


http://forums.guru3d.com/showthread.php?t=409693&page=3

Read the section, though, and check whether you have the Gaming Evolved client active, or the Plays.tv app from the Raptr bundle.


----------



## jonathan123456789

Quote:


> Originally Posted by *Semel*
> 
> Reduce texture quality to very high. It always crashed on my rig when I had it set to ultra.


Tried tinkering with the settings; it doesn't appear to matter. It just goes to a black screen with the "is not responding" message.


----------



## Glottis

Quote:


> Originally Posted by *Olivon*
> 
> There are plenty of examples that show the game runs badly in Prague on AMD DX12 (i.e. real gaming life, not a crappy integrated benchmark).
> The Randomdude videos show a massive stutter that is not present in DX11, but also not present on NV DX12.
> You spammed this forum with false information all the time and I see that you're still at it. Just look at the Randomdude videos and stop accusing good journalists who do their work, unlike Guru3D, who don't help real gamers at all by just pushing a button on this non-representative integrated benchmark.
> 
> 
> 
> I understand that this kind of result is not really good for the DX12 AMD marketing evangelist you are, but it's just the truth and you've got to deal with it. Stop the denial and respect real reviewers, not just push-button ones.


Excellent. This also confirms DigitalFoundry's video showing that gameplay performance is completely different from benchmark performance.


----------



## LongtimeLurker

What's interesting to me is you have posters coming in here posting benchmarks and saying comments like:

_"Looking really good for the red team"_

and
_
"You can see why the Titan X came out in a hurry and why they're pushing hard with Volta"_

Completely IGNORING the fact that there's a known BUG affecting NVidia cards at the moment, causing up to a 30 FPS DECREASE in performance.

People just have blinders on, I guess? They just choose to completely ignore this when commenting? Weird...


----------



## NightAntilli

The bug is nVidia hardware xD Just kidding...

Obviously DX12 is not stable yet. Developers wanted direct access to and control over memory allocation, etc., but it seems that, in practice, they're not capable of handling it (yet).


----------



## KyadCK

Quote:


> Originally Posted by *LongtimeLurker*
> 
> What's interesting to me is you have posters coming in here posting benchmarks and saying comments like:
> 
> _"Looking really good for the red team"_
> 
> and
> _
> "You can see why the Titan X came out in a hurry and why they're pushing hard with Volta"_
> 
> Completely IGNORING the fact that there's a known BUG affecting NVidia cards at the moment, causing an up to 30 FPS DECREASE in performance.
> 
> 
> 
> People just have blinders on I guess? Just choose to completely ignore this when commenting? Weird...


You can compare nVidia's DX11 to AMD's DX12 if you like. Unless you think their DX11 is also "bugged".


----------



## HaiderGill

Quote:


> Originally Posted by *LongtimeLurker*
> 
> What's interesting to me is you have posters coming in here posting benchmarks and saying comments like:
> 
> _"Looking really good for the red team"_
> 
> and
> _
> "You can see why the Titan X came out in a hurry and why they're pushing hard with Volta"_
> 
> Completely IGNORING the fact that there's a known BUG affecting NVidia cards at the moment, causing an up to 30 FPS DECREASE in performance.
> 
> 
> 
> People just have blinders on I guess? Just choose to completely ignore this when commenting? Weird...


I wish Intel would buy out Imagination Technologies and start building x64 PowerVR APUs and cards...

I'm also wondering why the RX 480 AIB cards aren't coming out, and where AMD's GTX 1070 competitor is. All they need to do is give the RX 480 a 35% speed boost to compete...


----------



## LongtimeLurker

Quote:


> Originally Posted by *KyadCK*
> 
> You can compare nVidia's DX11 to AMD's DX12 if you like. Unless you think their DX11 is also "bugged".


The built-in game benchmark is incorrect, the code is buggy, the DX12 code branch is a disaster, and the 1000-series cards handle DX12 a lot better than the 900-series (I almost bought a 1060).

I wouldn't put stock in any sort of benchmarking for this game just yet.

I really don't get this red-vs-green thing going on here. I am picking up an RX 480 today after work; however, if I could afford it, I would be buying a GTX 1080. Both companies have appealing options at different price points. For me, the RX 480 wins hands down against the GTX 1060 because of DX12, the memory amount, and the cost.

But just because I'm buying AMD doesn't mean I don't also understand that the 1080 is a step (or two, depending on the game) above a Fury X...

Why can't people just be honest and admit that? Instead of "Oh well, NVidia's hardware is not built for DX12, bla bla bla", yet in actual working, completed games, I see the 1080 at the top EVERY TIME.

Not even talking about the Titan Pascal (which I also agree is ridiculously priced; I would never buy one even if I could afford it)...


----------



## kx11

my 1440p gameplay

https://www.youtube.com/watch?v=fjmyS8I9ogM


----------



## jonathan123456789

I found out what was preventing the game from launching in DirectX 12: it was Kaspersky Internet Security 2016. If I completely disable it, I can get the game to launch. I'm not sure how to configure it to work without disabling it, though; I tried adding the game to the trusted app category, but that doesn't work. Any ideas? I don't want to completely disable Kaspersky every time I want to play Deus Ex!


----------



## KyadCK

Quote:


> Originally Posted by *LongtimeLurker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> You can compare nVidia's DX11 to AMD's DX12 if you like. Unless you think their DX11 is also "bugged".
> 
> 
> 
> The built-in game benchmark is incorrect, the code is buggy, the DX12 code branch is a disaster, and the 1000-series cards handle DX12 a lot better than the 900-series (I almost bought a 1060).
> 
> I wouldn't put stock in any sort of benchmarking for this game just yet.
> 
> I really don't get this red-vs-green thing going on here. I am picking up an RX 480 today after work, however if I could afford it, I would be buying a GTX 1080. Both companies have appealing options at different price points. For me, the RX 480 wins hands down against the GTX 1060 because of DX 12 and the memory amount and the cost.
> 
> But because I'm buying AMD doesn't mean I don't also understand the 1080 is a step above (or 2, depending on the game) a Fury X....
> 
> Why can't people just be honest and admit that? Instead of "Oh well NVidia's hardware is not built for DX 12, bla bla bla", yet in actual working completed games, I see the 1080 at the top EVERY TIME.
> 
> Not even talking about the Titan Pascal (which I also agree is ridiculously priced, would never buy one even if I could afford it)...

Yea don't care about much of any of that.

DX12 is not typically much (if at all) better for nVidia than DX11. You have the option to use both. Therefore you can compare DX11 nVidia and DX12 AMD, which is best of both sides. DX12's "bug" is irrelevant.


----------



## kx11

Quote:


> Originally Posted by *jonathan123456789*
> 
> I found out what was preventing the game from launching in directx 12, it was kaspersky internet security 2016. If i completely disable it, i can get it to launch. I'm not sure how to configure it to work without disabling it though, i tried adding it to the trusted app category but that doesn't work, any ideas? I don't want to completely disable kaspersky every time i want to play deus ex!


The last time I used Kaspersky was back in 2011, I think.

Who uses that anymore? Maybe big companies.


----------



## LongtimeLurker

Quote:


> Originally Posted by *KyadCK*
> 
> Yea don't care about much of any of that.
> 
> DX12 is not typically much (if at all) better for nVidia than DX11. You have the option to use both. Therefore you can compare DX11 nVidia and DX12 AMD, which is best of both sides. DX12's "bug" is irrelevant.


Right. Exactly! That's exactly the point I was trying to make (although doing a poor job of it). People are throwing up DX12 charts saying "look how much better AMD is!". Yes, that's true in this game, but the qualifier is that you're looking at broken DX12 on Nvidia. So you can't use a straight DX12 comparison chart here.

What we really need is what you just said: a chart showing the best from each company, THEN you can draw your conclusions. And preferably on a game that's not buggy, with a broken and stuttering DX12 implementation, lol.


----------



## KyadCK

Quote:


> Originally Posted by *LongtimeLurker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> Yea don't care about much of any of that.
> 
> DX12 is not typically much (if at all) better for nVidia than DX11. You have the option to use both. Therefore you can compare DX11 nVidia and DX12 AMD, which is best of both sides. DX12's "bug" is irrelevant.
> 
> 
> 
> Right. Exactly! That's exactly the point I was trying to make (although doing a poor job of it). People are throwing up DX12 charts saying "look how much better AMD is!". Yes, that's true in this game, but the qualifier is that you're looking at broken DX12 on Nvidia.
> 
> What we really need is what you just said: a chart showing the best from each company, THEN you can draw your conclusions. And preferably on a game that's not buggy, with a broken and stuttering DX12 implementation, lol.

The thing is, this will eventually go the way it did for DX11/DX9. nVidia still needs to fix it, and while the bug isn't relevant now, that may not hold true in the future. Knowing all the data is important, not just "the best".


----------



## jonathan123456789

Quote:


> Originally Posted by *kx11*
> 
> The last time I used Kaspersky was back in 2011, I think.
> 
> Who uses that anymore? Maybe big companies.


What do you use instead?

Does anyone know how to configure it so it doesn't crash the game? I'm guessing someone will patch it in time?


----------



## Pro3ootector

I wanted to write that Polaris crushes the 1070, but yeah, DX11 is fairly well optimized for both.
Let's wait for the non-beta patch. Only thing is, damn... AMD really is kicking ass in 3D graphics these days; they're on a friggin' rampage.


----------



## kx11

Quote:


> Originally Posted by *jonathan123456789*
> 
> What do you use instead?
> 
> Does anyone know how to configure it so it doesn't crash the game? I'm guessing someone will patch it in time?


Nothing, just the basic Windows Defender stuff.

Also Malwarebytes when something goes really bad, but then I remove that app afterwards.


----------



## Mahigan

Quote:


> Originally Posted by *LongtimeLurker*
> 
> Right. Exactly! That's exactly the point I was trying to make (although doing a poor job of it). People are throwing up DX12 charts saying "look how much better AMD is!". Yes, that's true in this game, but the qualifier is that you're looking at broken DX12 on Nvidia. So you can't use a straight DX12 comparison chart here.
> 
> What we really need is what you just said: a chart showing the best from each company, THEN you can draw your conclusions. And preferably on a game that's not buggy, with a broken and stuttering DX12 implementation, lol.


That comment from the devs does not specify nVIDIA; it says that some issues occur on high-end GPUs. We see this with both nVIDIA and AMD GPUs right now. It doesn't affect everyone, though; oddly enough, some folks in various forums are getting better performance with nVIDIA under DX12 than DX11.

This patch is in BETA, but as AMD and nVIDIA mentioned in their joint GDC presentation, porting DX11 to DX12 leads to issues. You either fully commit to DX12 or you do not attempt it.


----------



## pengs

Quote:


> Originally Posted by *LongtimeLurker*
> 
> What's interesting to me is you have posters coming in here posting benchmarks and saying comments like:
> 
> _"Looking really good for the red team"_
> 
> and
> _
> "You can see why the Titan X came out in a hurry and why they're pushing hard with Volta"_
> 
> Completely IGNORING the fact that there's a known BUG affecting NVidia cards at the moment, causing an up to 30 FPS DECREASE in performance.


The bug is called DX12... hu hu huhu hu...

hu


----------



## Nameless1988

IMO, bolting DX12 onto a game this problematic is a pain. The optimization is so bad that they should be focusing on fixing the DX11 problems instead of introducing DX12 stuff. It's absurd!


----------



## black96ws6

Yeah, it is kind of interesting: by moving the responsibility for the implementation away from the GPU manufacturers and onto the devs themselves, I think we're going to get implementations that are all over the map!

Luckily, DX11 is not going anywhere. You have to have Windows 10 to use DX12 anyway, and look at how many people have not upgraded to Windows 10, so DX11 will be around for quite a while.

By the time DX11 is no longer relevant, we'll be looking at cards well past Vega and Volta...


----------



## boredgunner

Quote:


> Originally Posted by *Nameless1988*
> 
> IMO, moving from dx11 to dx12 in a game so problematic is a pain. There is so bad optimization that they may focuse about dx11 optimization and fix the problems, instead of introducing dx12 stuff. IMO It's absurd!


Games need to be designed from the ground up in Vulkan or even DX12, and then ported to DX11 or OpenGL afterwards. All problems solved!


----------



## BiG StroOnZ

Quote:


> Originally Posted by *PlugSeven*
> 
> You can see why the Titan X came out in a hurry and why they're pushing hard with Volta. There are only so many times you can bench older DX11 titles and be shown to be faster with any sort of relevance.


There's really no other way around it: DX12 and Vulkan are and will be the future of gaming. How many years it will take for them to finally take off and be fully used by many game developers is still debatable, but we are already seeing small glimpses of AMD's future in the dGPU sector. I don't think NVIDIA is going to just sit on their hands and do nothing about their DX12/Vulkan performance, but I do think that regardless of how much progress NVIDIA ends up making, AMD will still have great performance guaranteed in those APIs. Even if that doesn't mean total AMD domination like some might want to imagine, it might at least mean neck-and-neck performance between both camps' cards, which is what you want and need in the marketplace.

Which means this nonsense GPU market we see today is going to be nonexistent. The incredibly inflated pricing, the lack of availability, basically no competition between the two manufacturers: this will all be a thing of the past. AMD will be able to release cards two years from now, when Vulkan and DX12 are relevant, and be immediately competitive with NVIDIA.

Which is why I've been having trouble picking out my next GPU upgrade: long-term investment says AMD, but short-term investment says NVIDIA. I'm still waiting to see what AMD has to offer in the $350-400 range; basically, whether AMD has a card that lands around $350 that is faster than a 1070 but slower than a 1080. Hoping that is what the 490 will be. Then I want to see full Fury/Nano replacements that start at 1080 performance and slowly make their way to Pascal Titan X performance.


----------



## boredgunner

^ Yeah, I'm really interested in seeing how NVIDIA will try to adapt, because they will have to. Time is running out for them; DX12 and Vulkan are moving in faster than new GPU chips are. They're going to need some serious changes in architecture.


----------



## tpi2007

Quote:


> Originally Posted by *boredgunner*
> 
> ^ Yeah, I'm really interested in seeing how NVIDIA will try to adapt, because they will have to. Time is running out for them; DX12 and Vulkan are moving in faster than new GPU chips are. They're going to need some serious changes in architecture.


That's what Volta is supposed to be for. If it is, then they will be on time; if it isn't, then they are going to start having a perception problem right up front, and then a practical problem as the months go by. They have a year of goodwill, counting from now, while most games released are still built from the ground up in DX11. Volta is rumoured to come early in the year, though, so it shouldn't be a problem.


----------



## Bem69

Computerbase's benchmark is fake. My R9 390 at 1080p scores:
Benchmark: DX11 avg 47.6 FPS, DX12 avg 51.4 FPS
Prague: DX11 50 FPS, DX12 57 FPS
DX12 improves performance on AMD cards.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *boredgunner*
> 
> ^ Yeah, I'm really interested in seeing how NVIDIA will try to adapt, because they will have to. Time is running out for them; DX12 and Vulkan are moving in faster than new GPU chips are. They're going to need some serious changes in architecture.


Yes, I think this is primarily why we are seeing those rumors about Volta launching in 2017. NVIDIA might already have the groundwork laid to compete in DX12/Vulkan, and that is probably why Volta is coming earlier than expected:

http://wccftech.com/nvidia-roadmap-2017-volta-gpu/

That could be as early as July 2017, which would mean basically a replication of this year's summer, with never-ending Volta cards launching instead of Pascal cards.


----------



## Woundingchaney

Quote:


> Originally Posted by *Bem69*
> 
> Computerbase's benchmark is fake. My R9 390 at 1080p scores:
> Benchmark: DX11 avg 47.6 FPS, DX12 avg 51.4 FPS
> Prague: DX11 50 FPS, DX12 57 FPS
> DX12 improves performance on AMD cards.


Are you using the in-game benchmark or actual gameplay (it seems you are showing both, but I wasn't sure)? Your results very much mimic what we have seen using the in-game benchmark rather than gameplay. Unfortunately, one of the downsides of having an in-game benchmark is that reviewers have a tendency to use it considerably more often than gameplay benchmarks in scenarios such as this. It's also considerably easier to tailor driver performance to canned benchmarks (which is something we have seen both companies do in the past). User reviews are all over the place, particularly given the "stuttering" people are experiencing. I don't doubt that AMD would/should benefit more than nVidia under such a DX12 implementation (honestly, Pascal should be showing a performance increase too), but these canned benchmark results are suspect.

One of the only professional reviewers I have seen use actual gameplay is:

https://www.computerbase.de/2016-09/deus-ex-mankind-divided-dx12-benchmark/2/


----------



## johnblake

Contact Hardening Shadows (AMD GPUOpen) costs about 20% of performance on Nvidia while doing next to nothing visually, which is why Pchardware.de's and Computerbase's results are vastly different from other sites'. See post 436:
https://forums.anandtech.com/threads/deus-ex-mankind-dividied-system-specs-revealed.2482832/page-18

I know, it's Nvidia's fault that AMD's features didn't work, and it's also Nvidia's fault that their own features don't work on AMD. Nvidia is the evil criminal here.


----------



## ku4eto

Quote:


> Originally Posted by *johnblake*
> 
> Contact Hardening Shadows (AMD GPUOpen) costs about 20% of performance on Nvidia while doing next to nothing visually, which is why Pchardware.de's and Computerbase's results are vastly different from other sites'. See post 436:
> https://forums.anandtech.com/threads/deus-ex-mankind-dividied-system-specs-revealed.2482832/page-18
> 
> I know, it's Nvidia's fault that AMD's features didn't work, and it's also Nvidia's fault that their own features don't work on AMD. Nvidia is the evil criminal here.


You know, Contact Hardening Shadows existed in GTA 5 as well, and I don't see any issues with nVidia performance there.


----------



## Mahigan

Quote:


> Originally Posted by *Bem69*
> 
> Computerbase's benchmark is fake. My R9 390 at 1080p scores:
> benchmark: DX11 avg 47.6, DX12 avg 51.4
> Prague: DX11 50 fps, DX12 57 fps
> DX12 improves performance on AMD cards.


Yep.

Every single one of us is getting better FPS. Many are getting stuttering, though.


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> Contact Hardening Shadows (AMD GPUOpen) costs about 20% of performance on Nvidia while doing next to nothing visually, which is why Pchardware.de's and Computerbase's results are vastly different from other sites'. See post 436:
> https://forums.anandtech.com/threads/deus-ex-mankind-dividied-system-specs-revealed.2482832/page-18
> 
> I know, it's Nvidia's fault that AMD's features didn't work, and it's also Nvidia's fault that their own features don't work on AMD. Nvidia is the evil criminal here.


http://docs.nvidia.com/gameworks/content/gameworkslibrary/graphicssamples/d3d_samples/d3dsoftshadowssample.htm

Riiiiiiiight.

GPUOpen means that the math being used can be optimized for both architectures.

Contact Hardening Shadows is similar to Percentage Closer Soft Shadows (PCSS) except it is vendor agnostic. So yeah... it does do something.

I also find this odd... is this you?


----------



## Defoler

Quote:


> Originally Posted by *johnblake*
> 
> Contact Hardening Shadows (AMD GPUOpen) costs about 20% of performance on Nvidia while doing next to nothing visually, which is why Pchardware.de's and Computerbase's results are vastly different from other sites'. See post 436:
> https://forums.anandtech.com/threads/deus-ex-mankind-dividied-system-specs-revealed.2482832/page-18
> 
> I know, it's Nvidia's fault that AMD's features didn't work, and it's also Nvidia's fault that their own features don't work on AMD. Nvidia is the evil criminal here.


Makes sense.
The usual OCN double standard...
Quote:


> Originally Posted by *Mahigan*
> 
> http://docs.nvidia.com/gameworks/content/gameworkslibrary/graphicssamples/d3d_samples/d3dsoftshadowssample.htm
> 
> Riiiiiiiight.
> 
> GPUOpen means that the math being used can be optimized for both architectures.


In DX12, the one who needs to optimise it is the developer. Nvidia can't optimise for it, since it's fully controlled by the developer and integrated into their engine.
That means if the developer uses it as-is for AMD but doesn't adjust it for Nvidia, it isn't optimised, and it can't be optimised once it's in use.

This is not DX11. Very little is now controlled by the drivers.

And since the game is an AMD collaboration... guess what will happen.
Quote:


> Originally Posted by *ku4eto*
> 
> You know, the Contact Hardening Shadows existed in GTA 5 as well. And i don't see any issues with nVidia performance there.


Because in GTA 5 there is also PCSS, which runs better on nvidia; in Deus Ex there is only CHS.
If Deus Ex had also implemented PCSS like GTA 5, the performance hit wouldn't have been similar.


----------



## Mahigan

Quote:


> Originally Posted by *Defoler*
> 
> Makes sense.
> The usual OCN double standard...
> In DX12, the one who needs to optimise it is the developer. Nvidia can't optimise for it, since it's fully controlled by the developer and integrated into their engine.
> That means if the developer uses it as-is for AMD but doesn't adjust it for Nvidia, it isn't optimised, and it can't be optimised once it's in use.
> 
> This is not DX11. Very little is now controlled by the drivers.
> 
> And since the game is an AMD collaboration... guess what will happen.


False...

http://www.overclock3d.net/news/gpu_displays/deus_ex_mankind_divided_will_use_both_nvidia_and_amd_technologies/1
http://www.tweaktown.com/news/53064/deus-ex-mankind-divideds-latest-screenshots-look-great/index.html
Quote:


> It was revealed at GDC that Deus Ex: Mankind Divided will be making use of both Nvidia and AMD technologies, with the game using PureHair, a technology based on AMD's TressFX, as well as Nvidia's PhysX and APEX middleware.
> 
> Deus Ex is one of the most highly anticipated games coming to PC this year, the game having already been delayed to August 23rd, with the developers stating that they needed the delay "to meet those expectations".
> 
> It has already been confirmed that Deus Ex: Mankind Divided will run at 30FPS on consoles but with an unlocked framerate on PC, with the PC version of the game running on DirectX 12. At this time a DirectX 11 version of the game is likely, but unconfirmed.


Quote:


> Deus Ex: Mankind Divided will have support for both NVIDIA and AMD technologies, with AMD's PureHair technology included, as well as NVIDIA's PhysX and APEX technologies. The game will support Asynchronous Compute for the DX12 side of things, so we should expect AMD to have a good advantage in Deus Ex: Mankind Divided. The game will also feature a built-in benchmark, so expect us to roll Mankind Divided into our GPU benchmarks as soon as it's released.
> 
> Read more: http://www.tweaktown.com/news/53064/deus-ex-mankind-divideds-latest-screenshots-look-great/index.html


You should probably watch more developer conferences. Both AMD and nVIDIA collaborated on optimizing Deus Ex. AMD has a marketing agreement though.


----------



## Defoler

Quote:


> Originally Posted by *Mahigan*
> 
> False...
> 
> http://www.overclock3d.net/news/gpu_displays/deus_ex_mankind_divided_will_use_both_nvidia_and_amd_technologies/1
> You should probably watch more developer conferences. Both AMD and nVIDIA collaborated on optimizing Deus Ex. AMD has a marketing agreement though.


When the settings only activate CHS, it is not false.

Also, the game using PhysX and APEX (which are open sourced) does not mean they are collaborating with Nvidia.
Yet they state that they are fully collaborating with AMD, and not just on marketing as you make it sound...

Meaning you are intentionally reading into it what is not written, in order to prove a point which isn't there.
Also, OC3D is not a developer conference, which I'm sure you haven't watched either; you're just following up on what OC3D said (and assumed), not what the developer said.

Overall you are just grasping at half-broken straws with those arguments.


----------



## johnblake

Quote:


> Originally Posted by *Defoler*
> 
> When the settings only activate CHS, it is not false.
> 
> Also, the game using PhysX and APEX (which are open sourced) does not mean they are collaborating with Nvidia.
> Yet they state that they are fully collaborating with AMD, and not just on marketing as you make it sound...
> 
> Meaning you are intentionally reading into it what is not written, in order to prove a point which isn't there.
> Also, OC3D is not a developer conference, which I'm sure you haven't watched either; you're just following up on what OC3D said (and assumed), not what the developer said.
> 
> Overall you are just grasping at half-broken straws with those arguments.


Exactly..


----------



## Mahigan

Quote:


> Originally Posted by *Defoler*
> 
> Because in GTA 5 there is also PCSS, which runs better on nvidia; in Deus Ex there is only CHS.
> If Deus Ex had also implemented PCSS like GTA 5, the performance hit wouldn't have been similar.


Also false..


Source: http://www.hardocp.com/article/2015/04/20/grand_theft_auto_v_video_card_performance_preview/6#.V9PD8ZgrIwE

Both cause an 11-14% performance hit. PCSS actually causes a larger hit than CHS on both AMD and nVIDIA hardware.

OCN doesn't have a double standard with regards to this...









Oh... and both hit AMD harder than nVIDIA.

Of course.. unless you want to now claim that HardOCP is AMD biased (like Johnny claimed Guru3D is)...


----------



## Mahigan

Quote:


> Originally Posted by *Defoler*
> 
> When the settings only activate CHS, it is not false.
> 
> Also, the game using PhysX and APEX (which are open sourced) does not mean they are collaborating with Nvidia.
> Yet they state that they are fully collaborating with AMD, and not just on marketing as you make it sound...
> 
> Meaning you are intentionally reading into it what is not written, in order to prove a point which isn't there.
> Also, OC3D is not a developer conference, which I'm sure you haven't watched either; you're just following up on what OC3D said (and assumed), not what the developer said.
> 
> Overall you are just grasping at half-broken straws with those arguments.


Nope... I've just pretty much shot down all of your arguments with facts. Anyone reading our exchanges, unless they're partisan, will clearly see that you and Johnny are grasping at straws here.

Oh.. and the full collaboration is for Async Compute under DX12. It is the one added feature that truly benefits AMD hardware in this title. This is ironic because Johnny (Sontin) came all the way here from Anandtech to attack me over the arguments he lost there. Arguments I made with regards to Async Compute and, in particular, how much the feature truly benefits AMD hardware and, if used to a large extent, can negatively impact nVIDIA hardware.

So far, other than Time Spy (which isn't a game), every single implementation of Async Compute has done exactly what I claimed it would do.


----------



## johnblake

Quote:


> Originally Posted by *Mahigan*
> 
> Nope... I've just pretty much shot down all of your arguments with facts. Anyone reading our exchanges, unless they're partisan, will clearly see that you and Johnny are grasping at straws here.
> 
> Oh.. and the full collaboration is for Async Compute under DX12. It is the one added feature that truly benefits AMD hardware in this title. This is ironic because Johnny (Sontin) came all the way here from Anandtech to attack me over the arguments he lost there. Arguments I made with regards to Async Compute and, in particular, how much the feature truly benefits AMD hardware and, if used to a large extent, can negatively impact nVIDIA hardware.
> 
> So far, other than Time Spy (which isn't a game), every single implementation of Async Compute has done exactly what I claimed it would do.


Yes, I'm a shill. Here's another gameplay benchmark, in which AMD suffers more stuttering and hitching compared to Nvidia in DX12 in Deus Ex. Now go ahead and call that wrong too, say he's paid by Nvidia, or tell other stories by posting the in-game built-in benchmark.
http://techreport.com/review/30639/examining-early-directx-12-performance-in-deus-ex-mankind-divided/3


----------



## GorillaSceptre

Almost tempted to pick up Deus Ex to test myself... I'd rather wait until a patch is out though, as I have a feeling that no matter what results I post, I'll be labeled a "biased person" anyway.









At this point @Mahigan I'm pretty sure they're not even bothering to read the arguments in your posts..

It doesn't help that id Software are the only ones to have had an outstanding release on the new APIs... Until devs get accustomed to DX12 and Vulkan, the general rhetoric of "DX11 good, DX12 bad" will persist. Can't say I really blame them either... Nixxes need to get their act together, or preferably, studios need to stop using port houses altogether.


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> Yes, I'm a shill. Here's another gameplay benchmark, in which AMD suffers more stuttering and hitching compared to Nvidia in DX12 in Deus Ex. Now go ahead and call that wrong too, say he's paid by Nvidia, or tell other stories by posting the in-game built-in benchmark.
> http://techreport.com/review/30639/examining-early-directx-12-performance-in-deus-ex-mankind-divided/3


And I have not denied that many people are experiencing stuttering... I've mentioned it about a dozen times or so thus far...









Someone get this guy some reading glasses or something.


----------



## johnblake

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Almost tempted to pick up Deus Ex to test myself... I'd rather wait until a patch is out though, as I have a feeling that no matter what results I post, I'll be labeled a "biased person" anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At this point @Mahigan I'm pretty sure they're not even bothering to read the arguments in your posts..
> 
> It doesn't help that id Software are the only ones to have had an outstanding release on the new APIs... Until devs get accustomed to DX12 and Vulkan, the general rhetoric of "DX11 good, DX12 bad" will persist. Can't say I really blame them either... Nixxes need to get their act together, or preferably, studios need to stop using port houses altogether.


I do not understand why Nixxes even bothered to release a preview version of DX12 that is broken to its core. They should have done a lot of internal testing and QA before releasing this. It's like an alpha version of DX12.


----------



## Newbie2009

Quote:


> Originally Posted by *johnblake*
> 
> I do not understand why Nixxes even bothered to release a preview version of DX12 that is broken to its core. They should have done a lot of internal testing and QA before releasing this. It's like an alpha version of DX12.


Cheaper to have the public do it for them.


----------



## GorillaSceptre

They did the same with ROTR; they seem to want to get it out as fast as possible for another bullet point on a marketing slide, instead of making it a meaningful performance improvement for consumers.

The ROTR patch turned out alright, so let's see. So far id and Oxide have embarrassed the rest of them; the new APIs rest on the shoulders of developers, so I think a lot of them take it personally too.

Still wish CDPR would release a DX12/Vulkan build for TW3... but rather than do a half measure, they probably decided against it.

Oh well, for now id are the kings of the mountain; it's a pretty unique situation for studios to show off their technical prowess.


----------



## Power Drill

I just don't get it: why would any developer go with DX12 over Vulkan? With Vulkan you could reach all the gamers in the world. Is there some bonus money involved from Microsoft's side? Or are the dev tools so much better on the DX side that most devs think it's a no-brainer to go that way?


----------



## johnblake

Quote:


> Originally Posted by *Power Drill*
> 
> I just don't get it: why would any developer go with DX12 over Vulkan? With Vulkan you could reach all the gamers in the world. Is there some bonus money involved from Microsoft's side? Or are the dev tools so much better on the DX side that most devs think it's a no-brainer to go that way?


Windows and MS. They can't even try to ditch DX if they're porting a game to Windows.


----------



## Power Drill

Quote:


> Originally Posted by *johnblake*
> 
> Quote:
> 
> > Originally Posted by *Power Drill*
> > 
> > I just don't get it: why would any developer go with DX12 over Vulkan? With Vulkan you could reach all the gamers in the world. Is there some bonus money involved from Microsoft's side? Or are the dev tools so much better on the DX side that most devs think it's a no-brainer to go that way?
> 
> Windows and MS. They can't even try to ditch DX if they're porting a game to Windows.

What do you mean? Vulkan works on every Windows version, not just Win10.


----------



## Woundingchaney

Quote:


> Originally Posted by *johnblake*
> 
> Windows and MS. They can't even try to ditch DX if they're porting a game to Windows.


DX is not a necessity for Windows development, but it is extremely common, so engineers are very familiar with it, and it has a history of not only excellent support but excellent tool sets as well.

These things become buzzwords in forums and the like, but from a development standpoint, familiarity and support go a long way. There are obvious reasons why developers/publishers continue to use DX over other alternatives (reasons that don't need a tin foil hat).


----------



## black96ws6

The sad thing is game developers don't know how to properly code for the APIs. This is a little old but bears repeating:
Quote:


> *"Nearly every game ships broken. We're talking major AAA titles from vendors who are everyday names in the industry. In some cases, we're talking about blatant violations of API rules - one D3D9 game never even called BeginFrame/EndFrame.* Some are mistakes or oversights - one shipped bad shaders that heavily impacted performance on NV drivers. *These things were day-to-day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it.* There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games - up to and including total replacement of the shipping shaders with custom versions by the driver team. *Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go."*


And commenting on trying to optimize for multi-GPUs:
Quote:


> "You cannot begin to conceive of the number of failure cases that are involved until you see them in person. I suspect that more than half of the total software effort within the IHVs is dedicated strictly to making multi-GPU setups work with existing games. (And I don't even know what the hardware side looks like.) If you've ever tried to independently build an app that uses multi GPU - especially if you tried to do it in OpenGL - you may have discovered this insane rabbit hole. There is ONE fast path, and it's the narrowest path of all."


Quote:


> It remains to be seen whether developers will be able to code properly for these new APIs!


I think we already know the answer to this last one... in most cases, the answer is no.

http://www.dsogaming.com/news/ex-nvidia-driver-developer-on-why-every-triple-a-games-ship-broken-multi-gpus
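The per-game driver hack mechanism that quote describes can be pictured as a profile lookup keyed on the executable name. This is a toy illustration only: the executable names and workaround flags below are invented, and real drivers ship binary profile databases rather than anything like this dict.

```python
# Invented per-game workaround profiles, keyed by executable name.
GAME_PROFILES = {
    "brokengame.exe": {"replace_shaders": True, "flush_every_frame": True},
    "wellbehaved.exe": {},
}

def active_workarounds(exe_name):
    # Return the set of workarounds the driver would toggle on for this game.
    profile = GAME_PROFILES.get(exe_name, {})
    return {name for name, enabled in profile.items() if enabled}

print(active_workarounds("brokengame.exe"))   # both hacks toggled on
print(active_workarounds("wellbehaved.exe"))  # no hacks needed
```

That lookup-and-patch step is what a "game ready" driver release is largely adding: new entries for a new executable.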


----------



## sumitlian

Quote:


> Originally Posted by *johnblake*
> 
> Windows and MS. They can't even try to ditch DX if they're porting a game to Windows.


LOL


----------



## Assirra

Quote:


> Originally Posted by *Power Drill*
> 
> I just don't get it, why would any developer go DX12 over vulkan? That way you could reach all the gamers in the world. Is there some bonus money involved from microsoft side? Or dev tools so much better in DX side that most devs think it's no brainer to go that way?


Probably because it's a lot easier for developers to work with something they have been using in some form for over a decade than with something completely new that has to be learned from scratch.


----------



## Olivon

Quote:


> Originally Posted by *johnblake*
> 
> Yes, I'm a shill. Here's another gameplay benchmark, in which AMD suffers more stuttering and hitching compared to Nvidia in DX12 in Deus Ex. Now go ahead and call that wrong too, say he's paid by Nvidia, or tell other stories by posting the in-game built-in benchmark.
> http://techreport.com/review/30639/examining-early-directx-12-performance-in-deus-ex-mankind-divided/3


Quote:


> So that's a thing. Switching over to DXMD's DirectX 12 renderer doesn't improve performance on any of our cards, and it actually makes life much worse for the Radeons. The R9 Fury X turns in an average FPS result that might make you think its performance is on par with the GTX 1070 once again, but don't be fooled: that card's 99th-percentile frame time number is no better than even the GTX 1060's. Playing DXMD on the Fury X and RX 480 was a hitchy, stuttery experience, and our frame-time plots confirm that impression.


Quote:


> It's also clear that it's too early to call a winner between the green and red teams for DirectX 12 performance in this beta build of Deus Ex, even if AMD seems to feel confident in doing so. The Radeon cards we tested perform poorly in our latency-sensitive frame-time metrics in DX12 mode, meaning that the Fury X's hitchy gameplay stands in stark contrast to its respectable average-FPS result. Even if Nvidia isn't shouting from the rooftops about Pascal's performance in DXMD's DX12 mode right now, the green team has some kind of smoothness advantage despite the game's beta tag. To be fair, we used different settings than AMD did while gathering its performance numbers, but we don't feel like the choices we made would be much different than those the average enthusiast would have with this hardware.


Fair analysis by TechReport. Thanks for sharing.
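TechReport's distinction between average FPS and 99th-percentile frame times is easy to illustrate with made-up numbers (a sketch with invented traces, not their data): two one-second runs can post nearly identical average FPS while one of them stutters badly.

```python
def avg_fps(frame_times_ms):
    # Average FPS = frames rendered / total elapsed time in seconds.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def p99_frame_time(frame_times_ms):
    # 99th-percentile frame time: 99% of frames finished at least this fast.
    ordered = sorted(frame_times_ms)
    return ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]

smooth = [16.7] * 60                  # steady ~60 fps, no hitches
stuttery = [12.0] * 57 + [100.0] * 3  # mostly fast frames plus three 100 ms hitches

for name, trace in (("smooth", smooth), ("stuttery", stuttery)):
    print(f"{name}: {avg_fps(trace):.1f} avg FPS, "
          f"{p99_frame_time(trace):.1f} ms 99th-percentile frame time")
```

The "stuttery" trace actually posts the higher average FPS, yet its 99th-percentile frame time is six times worse, which is exactly the Fury X pattern TechReport describes.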


----------



## BradleyW

I get the same experience as found by Tech Report.


----------



## GorillaSceptre

Quote:


> Originally Posted by *BradleyW*
> 
> I get the same experience as found by Tech Report.


I don't have the game to test..

What's your Windows power plan set to? If you have time can you set it to balanced and report back?


----------



## Mahigan

Quote:


> Originally Posted by *johnblake*
> 
> I do not understand why Nixxes even bothered to release a preview version of DX12 that is broken to its core. They should have done a lot of internal testing and QA before releasing this. It's like an alpha version of DX12.


It is a beta; the only thing they got working was the internal benchmark. They are calling it a DX12 preview (hence why it is installed via a beta patch).


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> They did the same with ROTR, they seem to want to get it out as fast as possible for another bullet-point on a marketing slide, instead of making it a meaningful performance improvement for consumers.
> 
> The ROTR patch turned out alright, so lets see. So far id and Oxide have embarrassed the rest of them, the new API's rest on the shoulders of developers so i think a lot of them take it personally too.
> 
> Still wish CDPR would release a DX12/Vulkan build for TW3.. But instead of doing a half-measure they probably decided against it.
> 
> Oh well, for now id are the kings of the mountain, it's a pretty unique situation for studios to show off their technical prowess.


I would kill for a Witcher 3 DX12/Vulkan build. I could finally see CFX working. They really have to test DX12 somewhere, since their engine is going to be used for CP77, and that has to be DX12 at the very least.


----------



## killerhz

Well, I finally broke down and used CDKeys to get the game with the DX12 patch. To my surprise, my rig is showing its age... an i7-4790K and a 980 Ti Classified, getting as low as 12 fps and a max of 24 fps under DX12 using Ultra settings. Not a good hacking experience, so I'm having issues with the keypads, LOL! Will keep trying, but I think my PC is too old...


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would kill for a Witcher 3 DX12/Vulkan build. I could finally see CFX working. They really have to test DX12 somewhere, since their engine is going to be used for CP77, and that has to be DX12 at the very least.


I think it was on your recommendation I picked it up, and it turned out to be one of the best games I've ever played: over 70 hours so far.







I thought TW2 was great, but TW3 is on a whole other level.. Who knows, maybe they'll get to Vulkan/DX12 at some point.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I think it was on your recommendation I picked it up, and it turned out to be one of the best games I've ever played: over 70 hours so far.
> 
> 
> 
> 
> 
> 
> 
> I thought TW2 was great, but TW3 is on a whole other level.. Who knows, maybe they'll get to Vulkan/DX12 at some point.


I am still thinking of going back and playing Witcher II, and even the first one. It's crazy to see how much CDPR has improved with each game. You never see that kind of progression in today's games.


----------



## Woundingchaney

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am still thinking of going back and playing Witcher II, and even the first one. It's crazy to see how much CDPR has improved with each game. You never see that kind of progression in today's games.


There are reasons for that though. It's a trilogy that spans 8-9 years and has seen a considerable budget and labor increase with each sequel. Generally speaking major development house titles don't see this large of time spans between titles and/or this much budget differences.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Woundingchaney*
> 
> There are reasons for that though. It's a trilogy that spans 8-9 years and has seen a considerable budget increase with each sequel. Generally speaking major development house titles don't see this large of time spans between titles and/or this much budget differences.


Just look at the latest game right now, Deus Ex. It's not much better than Human Revolution.


----------



## boredgunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just look at the latest game right now, Deus Ex. It's not much better than Human Revolution.


The Witcher games had far more room for improvement than both Deus Ex and Human Revolution, which are less flawed than almost every other game I've played (I can't say the same for Mankind Divided though). The ultimate Deus Ex game would just be a full remake of the original with mechanics like this one and some new/reworked dialogue.


----------



## ZealotKi11er

Quote:


> Originally Posted by *boredgunner*
> 
> The Witcher games had far more room for improvement than both Deus Ex and Human Revolution, which are less flawed than almost every other game I've played (I can't say the same for Mankind Divided though). The ultimate Deus Ex game would just be a full remake of the original with mechanics like this one and some new/reworked dialogue.


Probably right about Witcher I. It came out around the same time as Crysis 1, but they looked generations apart. TW2 was very graphically demanding when it came out and looked good for its time; at least that's what I remember people saying back in the day.


----------



## Shogon

Quote:


> tests like these demonstrate why one simply can't take average FPS numbers at face value when measuring graphics-card performance


----------



## Defoler

Quote:


> Originally Posted by *Mahigan*
> 
> Also false..
> 
> 
> Source: http://www.hardocp.com/article/2015/04/20/grand_theft_auto_v_video_card_performance_preview/6#.V9PD8ZgrIwE
> 
> Both cause an 11-14% performance hit. PCSS actually causes a larger hit than CHS on both AMD and nVIDIA hardware.
> 
> OCN doesn't have a double standard with regards to this...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh... and both hit AMD harder than nVIDIA.
> 
> Of course.. unless you want to now claim that HardOCP is AMD biased (like Johnny claimed Guru3D is)...


Learn to read. You are, again, grasping at straws, as usual it seems.

CHS on nvidia seems to cause a 20% performance hit; PCSS on nvidia causes an 11-14% hit. That is not the same. That was what I meant.

Stop trying to invent things in your mind and claim false/true because it fits your agenda.
Quote:


> Originally Posted by *Mahigan*
> 
> Nope... I've just pretty much shot down all of your arguments with facts. Anyone reading our exchanges, unless they're partisan, will clearly see that you and Johnny are grasping at straws here.
> 
> Oh.. and the full collaboration is for Async Compute under DX12. It is the one added feature that truly benefits AMD hardware in this title. This is ironic because Johnny (Sontin) came all the way here from Anandtech to attack me over the arguments he lost there. Arguments I made with regards to Async Compute and, in particular, how much the feature truly benefits AMD hardware and, if used to a large extent, can negatively impact nVIDIA hardware.
> 
> So far, other than Time Spy (which isn't a game), every single implementation of Async Compute has done exactly what I claimed it would do.


I'm not johnny; I don't speak for johnny.

The fact that you're trying to attack him by attacking me, I find kind of funny.

Who said anything about async compute? You're the only one who brought it up. I find it funny that now that your CHS/PCSS argument has fallen apart, you're moving on to the next safe thing.
Again, without proof.


----------



## sviru

Is VXAO available?


----------



## Pantsu

Quote:


> Originally Posted by *sviru*
> 
> VXAO? available?


Nvidia Gameworks feature in an AMD sponsored title? What do you think?


----------



## johnblake

Quote:


> Originally Posted by *Defoler*
> 
> Learn to read. You are, again, grasping at straws, as usual it seems.
> 
> CHS on nvidia seems to cause a 20% performance hit; PCSS on nvidia causes an 11-14% hit. That is not the same. That was what I meant.
> 
> Stop trying to invent things in your mind and claim false/true because it fits your agenda.
> I'm not johnny; I don't speak for johnny.
> 
> The fact that you're trying to attack him by attacking me, I find kind of funny.
> 
> Who said anything about async compute? You're the only one who brought it up. I find it funny that now that your CHS/PCSS argument has fallen apart, you're moving on to the next safe thing.
> Again, without proof.


He can go much lower than this for AMD. Check out his history of BS posts on Anandtech and you will be damn surprised at the kinds of assumptions and doomsday predictions he made for Nvidia.


----------



## Mahigan

Quote:


> Originally Posted by *Defoler*
> 
> Learn to read. You are, again, grasping at straws, as usual it seems.
> 
> CHS on nvidia seems to cause a 20% performance hit; PCSS on nvidia causes an 11-14% hit. That is not the same. That was what I meant.
> 
> Stop trying to invent things in your mind and claim false/true because it fits your agenda.
> I'm not johnny; I don't speak for johnny.
> 
> The fact that you're trying to attack him by attacking me, I find kind of funny.
> 
> Who said anything about async compute? You're the only one who brought it up. I find it funny that now that your CHS/PCSS argument has fallen apart, you're moving on to the next safe thing.
> Again, without proof.


Grand Theft Auto has both CHS and PCSS. You claimed that CHS incurred a larger performance penalty than PCSS under GTA. This is false.
Quote:


> This graph couldn't be any more clear, *Rockstar's Softest setting is the fastest setting* while *AMD CHS is slower*, and *NVIDIA PCSS is the slowest*.
> 
> *Between Rockstar's Softest shadow setting and NVIDIA PCSS the difference is a noticeable 11% difference*. This is on the GTX 980 as well, the video card you'd think would be the most efficient at NVIDIA PCSS mode. That's a fairly large difference in performance that could make a set of settings playable or not at a given resolution.


Quote:


> *The exact same pattern is true on the AMD Radeon R9 290X*. *Rockstar's Softest shadow option is the fastest*, *then AMD CHS* and *then NVIDIA PCSS as the slowest*. The difference is a slightly larger 14% on the R9 290X between Rockstar's Softest setting and NVIDIA PCSS. The drop is also greater between Softest shadows and AMD CHS.
> 
> It is clear *AMD CHS and NVIDIA PCSS are not "free" soft shadow options*, these *cost performance*, and *NVIDIA PCSS more than AMD CHS*.


http://www.hardocp.com/article/2015/04/20/grand_theft_auto_v_video_card_performance_preview/6#.V9UUFJgrIwF

PCSS incurred a larger performance penalty than CHS under GTA.

What you're now saying is that under Deus Ex, a hypothetical implementation of PCSS would somehow (magically) incur less of a performance hit than the current CHS implementation. You have zero evidence for this claim. The one piece of evidence you attempted to use to justify this statement (GTA) does not actually prove your statement (quite the contrary).

I'm not the one inventing things mate, you're the one making an unsubstantiated claim and tying it to an example (GTA) which does not prove your claim.


----------



## Silent Scone

When all is said and done, I enjoyed this in DX11 and it ran flawlessly for me. Makes these arguments seem extremely petty, especially coming from the perspective of trying to defend lesser performing products, no matter what the circumstance.


----------



## Klocek001

Two questions:
1. How long is this game? I wanna pick it up, but after playing TW3 with two expansions as my first ever RPG I don't wanna pay the same money for a 30h one.
2. Is async free yet, or is it still held hostage, tied to a chair in JHH's cellar?


----------



## Woundingchaney

Quote:


> Originally Posted by *Klocek001*
> 
> two questions:
> 1. how long is this game ? I wanna pick it up but after playing tw3 with two expansions as my first ever rpg I don't wanna pay the same money for a 30h one.
> 2. is the async free yet or is it still kept hostage tied to a chair in JHH's cellar ?


I don't know much about #2, but yes, even doing just about all the side missions and whatnot, the game is about 30 hours.


----------



## johnblake

Quote:


> Originally Posted by *Klocek001*
> 
> two questions:
> 1. how long is this game ? I wanna pick it up but after playing tw3 with two expansions as my first ever rpg I don't wanna pay the same money for a 30h one.
> 2. is the async free yet or is it still kept hostage tied to a chair in JHH's cellar ?


The main story is very short, most likely around 8 hours.


----------



## Olivon

Quote:


> Originally Posted by *Klocek001*
> 
> two questions:
> 1. how long is this game ? I wanna pick it up but after playing tw3 with two expansions as my first ever rpg I don't wanna pay the same money for a 30h one.


I played 42h to finish the game and complete all side quests and PoIs as a pacifist (no kills at all) on the "Give me Deus Ex" difficulty mode.
The game is really good, with a great story and dialogue; I loved it. But yes, the game is a little short, and the AI deserved better treatment.
I would definitely recommend this game, especially if you like the Deus Ex franchise.


----------



## Klocek001

Quote:


> Originally Posted by *Woundingchaney*
> 
> I don't know much at #2, but yes even with doing just about all the side missions and what not the game is about 30 hours.


Quote:


> Originally Posted by *johnblake*
> 
> Main story is very short and most likely 8 hr.


Quote:


> Originally Posted by *Olivon*
> 
> I played 42h to end the game and complete all side quests and PoI in pacifist (no kill at all) and "Give me Deus Ex" difficulty mode.
> Game is really good, great story and dialogues, loved it. But yes, the game is a little short and AI deserved a better treatment though.
> But I will definitely recommend this game, especially if you like the Deus Ex franchise.


So, short but still good enough to be worth it?


----------



## Edge0fsanity

Quote:


> Originally Posted by *Klocek001*
> 
> so, short but still good enough to be worth it ?


I liked the game a lot, but it was too short for $60. I got 30 hours out of it doing all side missions plus the main story. It really needed another 15 hours of gameplay, most of it on the main story, to feel like a complete game. If you need a new game to play right now, go for it. Otherwise, wait until it goes on sale for $30 at some point in the next 6 months; it would be well worth the money at that price.


----------



## boredgunner

Quote:


> Originally Posted by *Klocek001*
> 
> so, short but still good enough to be worth it ?


I think it's worth it, at least once it's stable. It took me around 40 hours as well; there's so much side quest content, and it's all worth it. No menial, repetitive side quests here; some of the game's absolute best story moments are in the side quests. Mankind Divided has some of the very best dialogue and world building I've found in gaming. Overall the story isn't as good as HR or the original, which have more exposition, are more coherent, flow better, and have a grander plot, along with deeper characters in HR's case. But this game fits right in with the franchise and, mouse acceleration aside, it has some of the most fun gameplay ever. But damn that mouse acceleration...


----------



## Klocek001

Quote:


> Originally Posted by *boredgunner*
> 
> I think it's worth it, at least once stable. It took me around 40 hours as well, so much side quest content and it's all worth it. No menial, repetitive side quests here; some of its absolute best story moments are in the side quests. Mankind Divided has some of the very best dialogue and world building I've found in gaming. Overall the story isn't as good as HR or the original, which have more exposition and are more coherent and flow better and have a more grand plot at the same time, along with deeper characters in HR, but this game fits right in with the franchise and mouse acceleration aside it has some of the most fun gameplay ever. But damn that mouse acceleration...


Good post, good post. How is it gonna run on a 2 GHz GTX 1080 @ 1440p? I'd like at least 80+ fps @ high details; Witcher 3 runs above 100 fps avg. @ 1440p high settings on my rig.


----------



## boredgunner

Quote:


> Originally Posted by *Klocek001*
> 
> Good post, good post. How is it gonna run on 2GHz GTX 1080 @1440p ? I'd like at least +80 fps @ high details, Witcher 3 runs above 100 fps avg. @1440p high settings on my rig.


It just so happens I also have a 2 GHz GTX 1080 and play at 1440p. Let me check how High settings run. I ran the game mostly maxed out, minus MSAA of course, which is worthless here and in most other modern games, and with Contact Hardening Shadows set to On instead of Ultra. 55-70 FPS is what I got on average.

- EDIT: Yeah, forget about 80+ FPS on High. 60-70, not much different from the better looking settings I run. Most areas in Prague that were in the low 50s are now at 60, and the areas at 60 FPS are now at 70 FPS, but getting more than 70 FPS just doesn't seem to happen.


----------



## NightAntilli

Ah, I remember the times when a game without multiplayer would last about 8 hours and that was considered a good length...

Nowadays, 30 hours is short... Wow.


----------



## Assirra

Quote:


> Originally Posted by *NightAntilli*
> 
> Ah. I remember the times where a game without multiplayer would last about 8 hours and was considered a good length...
> 
> Nowadays, 30 hours is short... Wow.


That entirely depends on the genre though.
RPGs have had a standard of being long since forever. Imagine if Baldur's Gate was like 8 hours.
Nobody (decent) complained about the short length of the latest Wolfenstein.


----------



## Gunderman456

I finished the game two days ago. I just checked Steam and it said 89 hours played.

That is because I'm a completionist. I did not rush the quests, but knocked out/killed everyone and checked every room, duct, basement and sewer. I left no stone unturned.

Yes, I believe it was a short game; I think they need to at least double the content for the next Deus Ex.

I think the two main problems with the game were:

THE MAP DESIGN: I figured this one out quickly, and could foretell where every shortcut, duct, etc. was going to be, which left me wanting and kind of killed the tension.

THE LACK OF AUGMENTATION REQUIREMENTS: By the time I was ready to use all the cool toys and superpowers, the game was done. I think that tank with the detonator should have been the midpoint of the game, where you feel you have to use the extra special augmentations; before that you were just fighting pawns the whole game through. I paused at the door, purchased and lined up my special augmentations for the first time, and after the cutscene I realized I had a "surprise for him": he was dead and I never got to use anything. The game needed to at least push you to use your augmentations more, but that never happened. That scenario would have given people pause to consider the augmentations, and if the game had continued from there, more people would have enjoyed them even more.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NightAntilli*
> 
> Ah. I remember the times where a game without multiplayer would last about 8 hours and was considered a good length...
> 
> Nowadays, 30 hours is short... Wow.


It is short. For example, Rise of the Tomb Raider was short for me. I finished the game in 19 hours doing 100%. I tried the first game and did that in 13 hours.


----------



## NightAntilli

Quote:


> Originally Posted by *Assirra*
> 
> That entirely depends on the genre tough.
> RPG's have a standard of being long since forever. Imagine if Baldur's gate was like 8 hours.
> Nobody (decent) complained about the short length of the latest wolfenstein.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> It is short. For example Rise of Tomb Raider was short for me. I finished the game for 19 hours doing 100%. Tried the first game and did that in 13 hours.


It's always quite interesting... A movie is done in 2 hours on average. We pay maybe $10 - $15 for a ticket to watch a movie of that length. That's $5 - $7.50 an hour.
For a game, which generally has a lot more work behind it, we expect a $60 price for, let's say, ~45 hours, which is $1.33 an hour. An 8 hour game for $60 would be the equivalent of the movie price.
A 30 hour game, at the movie price, should cost $150...
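The comparison above is just back-of-the-envelope arithmetic; here it is as a quick sketch (all prices are the rough, purely illustrative figures from the post, not real pricing data):

```python
# Rough cost-per-hour comparison between a movie ticket and a game,
# using the illustrative figures from the post above.

def cost_per_hour(price, hours):
    """Dollars paid per hour of entertainment."""
    return price / hours

movie = cost_per_hour(10.0, 2)      # $10 ticket, ~2 hour movie
game_45h = cost_per_hour(60.0, 45)  # $60 game, ~45 hours of content
game_30h = cost_per_hour(60.0, 30)  # $60 game, ~30 hours of content

print(f"movie:    ${movie:.2f}/hour")     # $5.00/hour
print(f"45h game: ${game_45h:.2f}/hour")  # $1.33/hour
print(f"30h game: ${game_30h:.2f}/hour")  # $2.00/hour

# At the movie's $5/hour rate, a 30 hour game "should" cost:
print(f"30h at movie rate: ${5.0 * 30:.0f}")  # $150
```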

In any case, I have no problem with the length of the game.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NightAntilli*
> 
> It's always quite interesting... A movie is done in 2 hours on average. We pay maybe $10 - $15 tickets to watch a movie of such a length. That's $5 - $7.50 an hour.
> A game that generally has a lot more work that needs to be done, we expect a $60 price for let's say ~45 hours, which is $1.33 an hour. An 8 hour game for $60 would be the equivalent of the movie price.
> A 30 hour game, if we take the movie price, it would should cost $150...
> 
> In any case, I have no problem with the length of the game.


They are not the same. Movies are super rich experiences; with games you create experiences. If a game is only like 10 hours, then 3-4 movies can easily tell that story for less.


----------



## NightAntilli

You look at it from the perspective of how much story can be presented, but I was arguing from the production perspective, which is why I stated that games have a lot more work that needs to be done.

A basic movie needs:
Writers
Actors
Camera crew
Sound crew
Composer
Special effects crew
Editors

You can make a full movie with only those.

A basic game needs all that (except maybe the camera crew) and, additionally:
Programmers
Concept artists
Level designers
Modelers
Animators
Game testers
Debuggers

More people need to be paid, and yet we expect to pay a much lower price. And if we take the story part of a game seriously, it's even harder: combining a good story with interactivity is very hard to do, which is why game stories sucked for such a long time (and often still do). The other difference is the fame gap between movie actors/directors and game creators; game creators are not equally recognized for the work they do, even when they do more.


----------



## GorillaSceptre

There's a reason film budgets generally dwarf the ones found in gaming. From a "production perspective" it's not even close; the amount of planning, work, etc. that goes into just one scene of a big budget movie is insane.

Ever heard of CGI? Most films these days need the same personnel that make games. Even AI is used in movies, like the Orcs in Lord of the Rings, for example. Not to mention projects like Avatar.

Take a look at this: https://stephenfollows.com/how-many-people-work-on-a-hollywood-film/

It might give you a new perspective on just how many people are needed to bring these 2 hour experiences to life.

Iron Man 3 had 3,300 crew members.

I think I know what you're trying to say though; I don't judge a game's value based on its length. These days I somewhat prefer shorter, more linear games with very polished, high-production set pieces and story. Open-world monsters have to be really special to keep me playing for dozens of hours, like TW3.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> There's a reason film budgets generally dwarf the ones found in gaming.. From a "production perspective" it's not even close. The amount of planning, work, etc., that goes into just one scene in a big budget movie is insane..
> 
> Ever heard of CGI? Most films these days need the same personal that make games. Even AI is used in movies, like the Orks in Lord of the Rings for example. Not to mention projects like Avatar..
> 
> Take a look at this: https://stephenfollows.com/how-many-people-work-on-a-hollywood-film/
> 
> Might give you a new perspective on just how many people are needed to bring these 2 hour experiences to life.
> 
> Iron Man 3 had 3300 crew members..
> 
> I think i know what you're trying to say though, i don't judge a games value based on it's length. These days i somewhat prefer shorter more linear games with very polished/high production set pieces and story. Open-world monsters have to be really special to keep me playing for dozens of hours, like TW3.


You will not get many games like that. 8-10 hours is never good unless it's a masterpiece, and even then, if a game is good it should be longer. I do not want a game that always leaves me wanting more.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You will not get much games like that. 8-10 hours is never good unless its a masterpiece. Even so if a game is good it should be longer. I do not want a game where I am always wanting for more.


To each their own, I guess. Personally, I'd take titles like TR and Bioshock over 90% of the open-world games that can take a hundred-plus hours to finish.

I'm not necessarily saying 8 hours is a good length, just that a 30hr game isn't automatically worse than a 100hr one; value is subjective.

A short game that I thought was great is more valuable to me than a long one I thought was merely good.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> To each their own i guess. Personally, I'd take titles like TR and Bioshock over 90% of the open-world games that could take even a hundred + hours to finish.
> 
> I'm not necessarily saying 8 hours is a good length, just that a 30hr game isn't automatically worse than a 100hr one, value is subjective.
> 
> A short game that i thought was great is more valuable to me than a long one i thought was good.


No, but if a game like RotTR takes me 19 hours to do everything there is to do, and the main story is only 10 hours, then it sucks. RotTR is semi open world.


----------



## boredgunner

Quote:


> Originally Posted by *NightAntilli*
> 
> A basic game needs to have all that (except maybe camera crew) and additionally;
> Programmers
> Concept artists
> Level designers
> Modelers
> Animators
> Game testers
> Debuggers
> 
> More people need to be paid, and yet we expect to pay a much lower price for it. And if we take the story part seriously in the game, it's even harder. Combining a good story with interactivity is very hard to do, which is why stories of games sucked for such a long time (and often still do). The only difference is the fame between movie actors/directors and game creators. Game creators are not equally recognized for the work they do, even when they do more.


For AAA games like this one, yeah, but developers can take on multiple roles, which is seen a lot in smaller studios: lots of people playing the role of debuggers, modelers also doing the animation work and/or concept art, programmers also doing level design, etc.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> To each their own i guess. Personally, I'd take titles like TR and Bioshock over 90% of the open-world games that could take even a hundred + hours to finish.
> 
> I'm not necessarily saying 8 hours is a good length, just that a 30hr game isn't automatically worse than a 100hr one, value is subjective.
> 
> A short game that i thought was great is more valuable to me than a long one i thought was good.


I agree, one can't just say "8 hours is too short." It depends entirely on the type of game, and there is an infinite amount of game variety since it's an art form after all. Not every 8 hour game will leave you wanting more. You would not want _Anna: Extended Edition_ to be that long or longer; it takes place essentially in just a house/sawmill, yet it is an exquisite game. You wouldn't want _The Vanishing of Ethan Carter_ or _Cryostasis: Sleep of Reason_ (an outstanding game) to be that long.

Deus Ex: Mankind Divided is a good length for what it is: nearly as long as Human Revolution and the original, despite only having 1.5 hubs and mostly being set in Prague (which is still a downside and not very Deus Ex like).


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No but if a game like RoTR takes me 19 Hours to do everything there is to do and main story is only 10 hours than it sucks. RoTR is semi open world.



And... you are clearly the outlier in the example you keep using. Most people are not doing RotTR in 19 hours at 100% completion. I'd say 99.9999999999999999% of people are not doing it that fast. Maybe you should pick another example?

@GorillaSceptre
Some games now take $20-50 million, and even $140 million, to make, with launch budgets of $200-500 million. So they are clearly starting to get into summer blockbuster movie territory.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No but if a game like RoTR takes me 19 Hours to do everything there is to do and main story is only 10 hours than it sucks. RoTR is semi open world.


It sucks to _you_. I thought RotTR was fantastic, but if you always base a game's value on how long it is, then I could just say, "Dragon Age: Inquisition is the better game because it's longer," while ignoring the storytelling, the incredible set pieces in RotTR, the production value and visuals, etc.

Quote:


> Originally Posted by *boredgunner*
> 
> I agree, one can't just say "8 hours is too short." It depends entirely on the type of game and there are is an infinite amount of game variety since it's an art form after all. Not every 8 hour game will leave you wanting for more. You would not want _Anna: Extended Edition_ to be that long or longer; it takes place only in a house/sawmill essentially, yet it is an exquisite game. You wouldn't want _The Vanishing of Ethan Carter_ or _Cryostasis: Sleep of Reason_ (an outstanding game) to be that long.
> 
> Deus Ex: Mankind Divided is a good length for what it is. Nearly as long as Human Revolution and the original despite only having 1.5 hubs and mostly being set in Prague (which is still a downside and not very Deus Ex like).


Yup.

I can appreciate (especially when I was younger and didn't have the disposable income I do now) the bang for the buck you can get in sheer hours played with some games, but as I've gotten older I don't have the time to play as much as I did. So now I _value_ the quality of things over the quantity of them.

Ironic how when we're kids (most of us) we have all the time in the world to play games but can't afford all of them, and then when we're older we can afford them but don't have the time...


----------



## boredgunner

Quote:


> Originally Posted by *the9quad*
> 
> and.... you are clearly the outlier in the example you keep using. Most people are not doing ROTRR in 19 hours at 100% completion. I'd say 99.9999999999999999% of people are not doing it that fast. Maybe you should pick another example?


Also, Rise of the Tomb Raider is "semi open world"? I know Tomb Raider 2013 was linear as hell, and I didn't think RotTR was any different. Maybe it is, but I have to say that "open world" is one of the most misused terms in gaming.


----------



## the9quad

Just a little info about RotTR: a little over 4% of players have completed it 100%, and the average time for those users is 21 hours. So yeah, 19 hours for 100% completion is outside the norm by a wide, wide margin.

I feel 20 hours for a game like that is plenty of value, btw, and by that point they have usually outstayed their welcome for me (looks like I am not alone, since that is about the average time spent on the game).


----------



## Olivon

Quote:


> Today, two and a half years later, there are about ten DirectX 12 games available, and more have been announced. Too little? Looking at what has been delivered, the clear answer is no.


Quote:


> DirectX 12 is so far a tragedy
> 
> What the games released so far deliver is, on average, disappointing. In many cases DirectX 11 remains the faster API, and in most cases the significantly smoother one.


Quote:


> In discussions with manufacturers and developers I hear two stories. On the one hand, that DirectX 12 was released too early, without being complete. On the other hand, that programming correctly "past the driver" is really difficult; there is a lack of experience, and more time would have been needed.


Quote:


> In order to gain experience with the API, developers should by all means experiment. But those attempts should be limited to internal builds, not publicly advertised with theoretical advantages.


http://www.computerbase.de/2016-09/kommentar-directx-12-es-hakt/

So far, the DX12 path is quite chaotic and not an easy road to take.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Just a little info about ROTRR, a little over 4% people have completed it 100%. the average time for 100% of the users is 21 hours. So yeah 19 hours for 100% completion is a tad outside the norm by a wide wide margin.
> 
> I feel 20 hours for a game like that is plenty of value btw, and by that time they have usually outstayed their welcome for me. (looks like I am not alone since that is about he average time spent on the game).


RotTR is one of my favorite games, but it was too short for me. I had to play the first game again to get my TR fix. It's just that after playing Witcher 3 everything seems short. I like to play a game over the span of a month; Witcher 3 did exactly that. I played it 4 times in 1 year.


----------



## Jay-Sharp

Having played through this game on a GTX 1080, I found it to be mostly CPU limited, and more so in the DX12 beta patch. My 1080 sits at 60% utilization in the majority of the game under DX12, and this is on a 3770K at 4.5 GHz. Yet it was impossible to get a 60 fps minimum. Observing CPU utilization shows that the DX12 implementation is poorly threaded. Serious benchmarks (not using the pre-canned demo) verify this. Also, there is no validity to the theory that async compute has any bearing on this game's performance for NVIDIA: the 1080 scaled appropriately with settings where the CPU was not a bottleneck. On the CPU side, however, the DX12 implementation in this game doesn't use more than 4 threads. This is the first DX12 port to show these symptoms.

It interests me that people are trying to classify this as an AMD win. Stuttering is terrible, CPU distribution is worse in DX12, and frametimes are all over the place. GCN has many merits, but nothing about this beta is worth celebrating.
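Why a 4-thread cap hurts a 3770K (4 cores / 8 threads) can be sketched with Amdahl's law. The parallel fraction below is a made-up illustrative number, not a measurement from the game:

```python
# Amdahl's law: if only a fraction p of the frame's CPU work is spread
# across worker threads, total speedup is capped no matter how many
# cores you add.

def speedup(parallel_fraction, threads):
    """Theoretical speedup over a single thread (Amdahl's law)."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / threads)

# Hypothetical: 75% of per-frame CPU work is parallelizable.
p = 0.75
print(speedup(p, 4))  # ~2.29x with 4 threads
print(speedup(p, 8))  # ~2.91x with 8 threads -- diminishing returns

# If the engine hard-caps at 4 worker threads, the remaining logical
# CPUs of an 8-thread chip contribute nothing to the render loop,
# which matches the symptom of a GPU sitting at 60% utilization.
```

The serial fraction (1 - p) also bounds the best case: with p = 0.75, no thread count can ever exceed a 4x speedup.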


----------



## PhantomTaco

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Having played through this game on a Gtx 1080, I found it to be mostly cpu limited, and more so in the dx12 beta patch. My 1080 sits at 60% utilization in the majority of the game during dx12, and this is on a 3770k at 4.5 ghz. Yet it was impossible to get a 60 fps minimum. Observing cpu utilization shows that the dx12 implementation is poorly threaded. Serious benchmarks ( not using the pre-canned demo) verify this. Also,there is no validity to the theory that async compute has any bearing on this games performance for NVidia. The 1080 scaled appropriately with settings where the cpu was not a bottleneck. However, on the cpu side, the dx12 implementation in this game doesn't use more than 4 threads. This is the first dx12 port to show these symptoms.
> 
> It interests me that people are trying to classify this as an AMD win. Stuttering is terrible, cpu distribution is worse in dx12, and frametimes are all over the place. GCN has many merits, but nothing about this beta is worth celebration.


If you could post a time lapse of the graphs, or otherwise capture a gaming session like they do in reviews, I'd love to see that. Thanks for the info.


----------



## Jay-Sharp

First two are DX12. Note the max CPU usage on MSI. (screenshots)

Now DX11. Much higher GPU usage when the CPU is allowed to scale. (screenshots)

First time using pics, sorry for the format.


----------



## Xuper

What's your GPU?


----------



## Jay-Sharp

Quote:


> Originally Posted by *Xuper*
> 
> What's your GPU?


1080fe


----------



## moustang

I think most people are forgetting a basic fact.

DX12 is Windows 10 ONLY. For a game to be designed from the ground up around DX12, that game would have to be designed as a Windows 10 exclusive.

If a game is designed to work on Windows 7 and/or 8, then it is NOT designed around DX12. It can't be; those OSes lack several core features of DX12. So any game designed to run on those older versions of Windows will be designed around DX11, and any DX12 support is simply hacked into the existing game engine. Some developers will do a better job of hacking DX12 support in, and some game engines will respond to DX12 better than others. But make no mistake: if the game supports DX11 and older versions of Windows at all, then it's designed around DX11, and surprise surprise, it often runs better in the API it was designed for.


----------



## Jay-Sharp

Quote:


> Originally Posted by *moustang*
> 
> I think most people are forgetting a basic fact.
> 
> DX12 is Windows 10 ONLY. For a game to be designed from the ground up around DX12, that game would have to be designed as a Windows 10 exclusive.
> 
> If a game is designed to work on Windows 7 and/or 8 then it is NOT designed around DX12. It can't be, those OSs lack several core features of DX12. So any game designed to run on those older versions of Windows will be designed around DX11, and any DX12 support is simply hacked in to their existing game engine. Some developers will do a better job of hacking the DX12 support in, some game engines will respond to DX12 support better than others. But make no mistake, if the game supports DX11 and older versions of Windows at all then it's designed around DX11, and surprise surprise, it often will run better in the API it was designed for.


At the very least, CPU threading should be better. Remember, draw-call bottlenecking was the primary concern with DX11.
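The draw-call argument can be put in a toy cost model: under DX11-style submission, per-call driver overhead piles up on one thread, while a DX12-style renderer can record command lists on several threads, with a cheaper per-call cost, and pay a submit cost once. All the microsecond figures below are made up for illustration; they are not real API timings:

```python
# Toy model of CPU-side frame time spent submitting draw calls.
# All costs are made-up microsecond figures, for illustration only.

DRAW_CALLS = 10_000

def dx11_style(per_call_us=5.0):
    # One thread issues every call; driver overhead is serial.
    return DRAW_CALLS * per_call_us

def dx12_style(per_call_us=1.0, threads=4, submit_us=200.0):
    # Command lists recorded in parallel across `threads`, with a
    # cheaper per-call cost, plus one final queue submission.
    return (DRAW_CALLS * per_call_us) / threads + submit_us

print(f"DX11-style:     {dx11_style():.0f} us/frame")  # 50000 us
print(f"DX12-style:     {dx12_style():.0f} us/frame")  # 2700 us

# The win shrinks drastically if the engine still records everything
# on one thread -- which is the symptom described in this thread:
print(f"DX12, 1 thread: {dx12_style(threads=1):.0f} us/frame")  # 10200 us
```

The point of the model: DX12's gains come from both the lower per-call cost and the parallel recording, so a port that keeps a single recording thread leaves most of the benefit on the table.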


----------



## boredgunner

Quote:


> Originally Posted by *moustang*
> 
> I think most people are forgetting a basic fact.
> 
> DX12 is Windows 10 ONLY. For a game to be designed from the ground up around DX12, that game would have to be designed as a Windows 10 exclusive.


All the more reason to use Vulkan. Especially AAA games such as this; Eidos and Square Enix don't need Microsoft's "support." Design engine/game around Vulkan, shoehorn in DX11 or OGL toward the end for people with very old GPUs.


----------



## Klocek001

So far all DX12 implementations we've seen are garbage compared to how fantastic Vulkan runs in DOOM.


----------



## ToTheSun!

Quote:


> Originally Posted by *Klocek001*
> 
> So far all DX12 implementations we've seen are garbage compared to how fantastic Vulkan runs in DOOM.


That's probably because id has always been about open standards and no-compromise game development, and they have decades of experience with OGL, which probably helped. I mean, if the future is anything like how well optimized and buttery smooth Doom's Vulkan path is, we have nothing to worry about.


----------



## moustang

Quote:


> Originally Posted by *Jay-Sharp*
> 
> At the very least CPU threading should be better. Remember, drawcall bottlenecking was the primary concern with dx11.


CPU threading by itself, with no consideration for how that affects the rest of the game engine? You'll see minimal improvement at best. It could even hurt you by creating stalls in other areas, since your timing would be all off with an otherwise DX11-based renderer. Your entire rendering design would have to change to take full advantage of a drastic change in CPU threading.


----------



## Jay-Sharp

Quote:


> Originally Posted by *moustang*
> 
> CPU threading by itself, with no consideration made to how that effects the rest of the game engine? You'll see minimal improvement.


Well, Mankind Divided, for example, is well threaded in DX11 on GCN (which has notoriously poor driver overhead when rendering is not decoupled from game logic), so if draw calls were not a major concern, it would stand to reason that Nixxes needed to further thread the game logic to see any uplift.

Which makes DX12 pointless if that's all that held the engine back. But this is based on a beta, and the cost of chasing compute gains for AMD architectures may have been deemed "worth it".


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Well, Mankind Divided for example is well threaded in Dx11 on Gcn(which has notoriously poor driver overhead when rendering is not decoupled from game logic), so if drawcalls were not a major concern, it would stand to reason that nixxes needed to further thread game logic to see any uplift.
> 
> Which makes dx12 pointless if that's all that held the engine back. But this is based on a beta, and the cost of chasing compute gains for AMD architectures may have been deemed "worth it".


You do see an increase with DX12 over DX11 though; in fact I see a ~25% increase in framerate at 1440p. But you also see more uneven frametimes (not as bad as CFX DX11 though, which is a freakin' nightmare).

CFX with 290Xs is really awful, all but useless (see frametimes below).
DX12 gives ~25% more FPS than DX11, but DX11 is slightly smoother.

*Settings for DX11:* (screenshots)

*Settings for DX12:* (screenshots)

*Frametimes*

*Frametimes DX11 (1 GPU)* (graph)

*Frametimes DX12 (1 GPU)* (graph)

*Frametimes DX11 (2 GPUs)* (graph)

*Framerates:*

*DX11 (1 GPU)* (graph)

*DX12 (1 GPU)* (graph)

*DX11 (2 GPUs)* (graph)


----------



## Jay-Sharp

The trade-off being less smoothness, which negates the gains. Also, those settings are obviously overwhelming the 290X; the minimums look harsh.


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> The trade-off being less smoothness. Which negates the gains. Also, those settings are obviously overwhelming the 290x. Minimums look harsh.


Well, yeah, the settings are too high; that is not how I play it. But if you are going to test the benefits of DX11 vs DX12, that is how you show the gains.

Also, those spikes are fairly fast, so they don't show up as stutter (at those settings anyway); others, though, are having longer spikes with noticeable stutter.

The whole point is, DX12 is reducing the overhead, but this DXMD beta needs work.


----------



## Jay-Sharp

Your minimum frame rate is highest with 2 GPUs, which shows that your experience is currently GPU-bound.

At 720p, low settings, DX12, does the 290X get a minimum of 60? I expect GCN to gain performance, but I'm interested in the CPU side.


----------



## moustang

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Well, Mankind Divided for example is well threaded in Dx11 on Gcn(which has notoriously poor driver overhead when rendering is not decoupled from game logic), so if drawcalls were not a major concern, it would stand to reason that nixxes needed to further thread game logic to see any uplift.
> 
> Which makes dx12 pointless if that's all that held the engine back. But this is based on a beta, and the cost of chasing compute gains for AMD architectures may have been deemed "worth it".


Did it ever occur to you that the reason AMD sees such an improvement with DX12 is simply because their DX11 drivers SUCK?

You know how AMD has had steady performance improvements on their old GPUs over the years? How even old cards like the 7950 can perform reasonably well in new games with new drivers? That's because AMD's DX11 drivers are absolutely horrible. Their hardware hasn't changed; all of these wonderful DX11 performance increases have been pure software. As far as the hardware goes, the performance was always there, and it's always been their DX11 drivers holding them back.


----------



## Jay-Sharp

Yes, the drivers have held the architecture back for years. Overhead is still an issue in Vulkan and DX12 too, but at least there AMD gets rendering done on multiple threads.

Honestly, GCN doesn't interest me. It's a lot of hardware being thrown at the task of rendering. Performance is predictable and the software side is hit or miss. It's not an elegant architecture, to say the least.


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Your minimum frame rate is highest with 2 GPU, which shows that your experiences are currently GPU bound.
> 
> At 720p, low settings, dx12, does the 290x get a minimum of 60? I expect Gcn to gain performance, but I'm interested in CPU side.


Anyone playing this game is going to be GPU-bound. No one is playing it at those settings, so I don't see the point of testing at a resolution and settings no one would be using.


----------



## Kuivamaa

Quote:


> Originally Posted by *the9quad*
> 
> Anyone playing this game is going to be GPU bound. No one is playing it at those settings. Don't see the point of testing it at a res/settings no one would be using.


True. I pretty much rushed to upgrade from my 290X to a Nano (running it at a steady 1050MHz) because Hawaii was dropping into the 40s (1440p, mostly ultra).


----------



## daviejams

Quote:


> Originally Posted by *the9quad*
> 
> Anyone playing this game is going to be GPU bound. No one is playing it at those settings. Don't see the point of testing it at a res/settings no one would be using.


It's a very CPU-bound game in DX11.

My Skylake i5 actually shows 100% usage in Afterburner, and when it hits those points the game stops for a split second. It's only happened a couple of times, and near the end, but still.

I've not tried DX12 mode much but that is the kind of thing that it should eliminate


----------



## the9quad

Quote:


> Originally Posted by *daviejams*
> 
> It's a very CPU bound game in DX11
> 
> My skylake i5 actually shows 100% usage on afterburner , when it hits those points the game actually stops for a split second. It's only happened a couple of times and near the end but still
> 
> I've not tried DX12 mode much but that is the kind of thing that it should eliminate


Are you playing at 640x480 with low settings?


----------



## daviejams

Quote:


> Originally Posted by *the9quad*
> 
> Are you playing 640x480 with low settings?


1440P at high.

Playing this game is the hardest my CPU has ever worked


----------



## boredgunner

Something interesting I noticed: on my system at 2560 x 1440, High (with Contact Hardening Shadows set to On) barely runs better than Ultra with CHS set to On. 60-70 FPS on the former, 50-70 FPS on the latter.


----------



## Jay-Sharp

Quote:


> Originally Posted by *the9quad*
> 
> Are you playing 640x480 with low settings?


You underestimate the CPU bottleneck. Regardless of resolution, the CPU will dictate your minimums at reasonable settings.

Which is why I asked for a 720p benchmark. If you can't get 60 fps minimums there, you can't, period. 1080p should be sufficient to test.


----------



## Jay-Sharp

Quote:


> Originally Posted by *daviejams*
> 
> It's a very CPU bound game in DX11
> 
> My skylake i5 actually shows 100% usage on afterburner , when it hits those points the game actually stops for a split second. It's only happened a couple of times and near the end but still
> 
> I've not tried DX12 mode much but that is the kind of thing that it should eliminate


DX11 mode uses up to 70% of an i7, which is more than 4 cores' worth, so yes, 4 cores automatically bottleneck this game.

But that is high core usage for a DX11 game, which is admirable. It seems to be juggling a lot of tech, unlike a Crytek or Ubisoft game where the return on the investment is mostly superficial (swaying grass or an unrealistic draw distance). Any i7 from Ivy Bridge onward should deliver perfect CPU performance.

DX12 may change how the threading works once it's out of beta, but I don't see draw calls being the biggest problem. The world logic is probably the candidate for optimization.


----------



## daviejams

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Dx11 mode uses up to 70% of an i7, which is more than 4 cores, so yes, 4 cores automatically bottlenecks this game.
> 
> But that is high core usage for a dx11 game, which is admirable. It seems to be juggling a lot of tech, unlike a crytech or ubisoft game where the return on the investment is mostly superficial ( swaying grass or unrealistic draw distance). Any i7 over ivybridge should deliver perfect CPU performance.
> 
> Dx12 may change how the threading works once out of beta, but I don't see drawcalls being the biggest problem. The world logic is probably a candidate for optimizations.


Ironically, I recently "upgraded" from an Ivy Bridge i7 to a Skylake i5. I needed to upgrade because the audio port on my motherboard was faulty, but I must admit I was expecting the Skylake chip to be faster. I should have spent the extra £100 and gone with an i7.


----------



## scorch062

Quote:


> Originally Posted by *daviejams*
> 
> Ironically I recently "upgraded" from an ivy i7 to the skylake i5. Needed to upgrade as the audio port in my motherboard was faulty but I must admit I was expecting the skylake chip to be faster. Should have spent the extra £100 and went with an i7


Would it not be more convenient to simply buy a PCI Sound card and use ports available there?


----------



## daviejams

Quote:


> Originally Posted by *scorch062*
> 
> Would it not be more convenient to simply buy a PCI Sound card and use ports available there?


I had the upgrade bug and wanted to move up to Skylake and DDR4 anyway.

As I say, pretty disappointing compared to a four-year-old Ivy Bridge chip.


----------



## ToTheSun!

Quote:


> Originally Posted by *scorch062*
> 
> Quote:
> 
> 
> 
> Originally Posted by *daviejams*
> 
> Ironically I recently "upgraded" from an ivy i7 to the skylake i5. Needed to upgrade as the audio port in my motherboard was faulty but I must admit I was expecting the skylake chip to be faster. Should have spent the extra £100 and went with an i7
> 
> 
> 
> Would it not be more convenient to simply buy a PCI Sound card and use ports available there an external DAC because soundcards suck?
Click to expand...

Fixed that for ya! =D


----------



## Jay-Sharp

Quote:


> Originally Posted by *daviejams*
> 
> I had the upgrade bug and wanted to move up to skykake and DDR4 anyway
> 
> As I say pretty disappointing compared to a four year old ivy chip


Skylake does more per core, so that may offset any frame rate drop you might have gotten from losing Hyper-Threading (compared to Ivy Bridge). But it may only match a 3770K, whereas the 6700K would be a big win.

Hyper-Threading, especially on easily threaded code (audio, physics), easily makes an i7 worth it. Plus, Hyper-Threading from Haswell (4770K) onward has many more resources dedicated to thread sharing: the 4770K beats a 6-core 980, the 6700K should beat a 6-core Sandy Bridge, etc.


----------



## Olivon

The new patch for DE:MD lets AMD take the lead in DX11 (test scene: Prague, 10% gain), offers better performance in DX12 for the RX 480, and goes a long way toward resolving the usual CPU limitation on AMD GPUs.


----------



## BradleyW

Quote:


> Originally Posted by *Olivon*
> 
> New patch for DE:MD permit AMD to take the lead in DX11 (test scene : Prague, 10% gain), offers better performance in DX12 for the RX480 and help a lot to resolve the usual CPU limitation for AMD GPU.


The performance is worse in DX12 according to this chart.


----------



## Jay-Sharp

Quote:


> Originally Posted by *Olivon*
> 
> New patch for DE:MD permit AMD to take the lead in DX11 (test scene : Prague, 10% gain), offers better performance in DX12 for the RX480 and help a lot to resolve the usual CPU limitation for AMD GPU.


CPU threading is improved slightly in DX12; I tested this last night. GPU performance is still way down. Nobody should bother with DX12 in this game yet.


----------



## Olivon

Quote:


> Originally Posted by *BradleyW*
> 
> The performance is worse in DX12 according to this chart.


Compared to the last review, the DX12 results are better, but still inferior to DX11.


----------



## LongtimeLurker

This game is so poorly coded.

How on earth can a 480 be doing worse in DX12 than in DX11?

Boggles the mind.

Who's running that company...monkeys?!?!

hehe


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> CPU threading is improved slightly in dx12. Tested this last night. GPU performance is still way down. Nobody should bother with dx12 on this game yet.


And you'd be wrong. I would love to see the results you guys get using actual measurements and not "eyeballing" it.

Here are my results, with actual data: a benchmark run on the latest patch and 16.9.1 drivers. DX12 is in red and DX11 in blue; you can see DX12 is consistently faster across the whole bench, with higher min, max, and average framerates as well.

http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-04-05.png









http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-14-22.png









http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-36-20.png









DX11.csv 67k .csv file


DX12.csv 92k .csv file
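
For anyone who wants to crunch logs like these themselves, here's a rough Python sketch. It assumes a single-column CSV of per-frame times in milliseconds; real capture tools (PresentMon, FRAPS, FCAT extractors) use different column layouts, so adjust the parsing to match your file:

```python
# Summarize a frame-time log. Assumption: one frame time in ms per row,
# first column. Adjust the column index/header handling for your tool.
import csv
import statistics

def summarize(path):
    with open(path, newline="") as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row]
    fps = [1000.0 / t for t in times_ms]          # per-frame instantaneous FPS
    times_sorted = sorted(times_ms)
    p99 = times_sorted[int(len(times_sorted) * 0.99)]  # 99th percentile frame time
    return {
        "min_fps": min(fps),
        "avg_fps": statistics.mean(fps),
        "max_fps": max(fps),
        "p99_frametime_ms": p99,
    }
```

The 99th-percentile frame time is a better "smoothness" number than min FPS alone, since a single outlier frame dominates the minimum but barely moves the percentile.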


----------



## NightAntilli

Frame times are still better under DX11... Until developers program their games around DX12 as the primary API and DX11 goes second, the results will remain similar to this.


----------



## the9quad

Quote:


> Originally Posted by *NightAntilli*
> 
> Frame times are still better under DX11... Until developers program their games around DX12 as the primary API and DX11 goes second, the results will remain similar to this.


Except, at least _in my case_, the spikes are so fast (like one frame in a thousand) that they are unnoticeable, and I mean unnoticeable to anyone, period, they are that fast (see below). The variance is like .0004 percent. So yeah, that one frame in every 1,000 no one is noticing at all, period, end of story, no matter how inhumanly "sensitive" they say they are to it.

Others, though, have longer spikes; I think Bradley does, and he says those are really noticeable. So I'd say people need to quit making blanket statements about everything, try it for themselves, and then decide which one runs better for them. In my case, DX12 runs much better, and it actually looks better when it comes to aliasing. Crossfire, on the other hand, is a mess when it comes to frametimes.

DX12:

http://www.the9quad.com/images/2016/09/16/perf_12.png









DX11:
http://www.the9quad.com/images/2016/09/16/perf_11.png
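
One way to put a number on the "one frame in a thousand" claim is to count spike frames directly from a frame-time log. A minimal sketch, with the caveat that the 2x-median threshold is an arbitrary choice, not any standard definition of a stutter:

```python
# Fraction of frames whose frame time exceeds a multiple of the median.
# The 2.0 multiplier is an arbitrary spike threshold, not a standard.
import statistics

def spike_ratio(times_ms, multiplier=2.0):
    median = statistics.median(times_ms)
    spikes = sum(1 for t in times_ms if t > multiplier * median)
    return spikes / len(times_ms)
```

Run against the DX11 and DX12 logs separately, this gives a single comparable figure for how often each API hitches, independent of average FPS.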


----------



## Jay-Sharp

Quote:


> Originally Posted by *the9quad*
> 
> and you'd be wrong, would love to see the results you guys get using actual measurements and not "eye balling" it.
> 
> here is my results, with actual data. benchmark run latest patch and 16.9.1 drivers. Dx12 in Red and Dx11 in Blue, you can see DX12 is consistently faster the whole bench. Higher min, max, and average framerate as well.
> 
> http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-04-05.png
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-14-22.png
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.the9quad.com/images/2016/09/16/BenchStudioGpu2015_2016-09-16_16-36-20.png
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DX11.csv 67k .csv file
> 
> 
> DX12.csv 92k .csv file


Thanks for the data.

With a minimum of 35fps at the same settings you posted you are getting better performance. That's great for an overworked GPU.

So I retract my blanket statement.

I'd like to see Fury testing on a decent CPU. I'm more focused on minimums with my 1080, because I get a 70 fps minimum in DX11 vs a 54 fps minimum in DX12.

But great that you're getting more in dx12. Is that the benchmark or in game?


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Thanks for the data.
> 
> With a minimum of 35fps at the same settings you posted you are getting better performance. That's great for an overworked GPU.
> 
> So I retract my blanket statement.
> 
> I'd like to see fury testing on a decent CPU. I'm more focused on minimums with my 1080, because I get a 70fps min in dx11, vs a 54 fps min in dx12.
> 
> But great that you're getting more in dx12. Is that the benchmark or in game?


The benchmark, with all settings maxed for consistency; I don't play at those settings. Also, I'm not ready to upgrade video cards until something better from AMD and Nvidia comes along. I'm not that impressed with their latest offerings, and most games work with CFX (Deus Ex aside), so the 290X's do the job for the time being.

Side note: you should use the rig builder in the right corner so people can see what your PC is. It makes conversations around here much better.


----------



## Zero989

This game has put my CPU at 54% load for a brief second, averaging 40% CPU usage.


----------



## Jay-Sharp

Quote:


> Originally Posted by *the9quad*
> 
> Benchmark and all maxed settings for consistency, I don't play at those settings. Also not ready to upgrade video cards yet until something better from AMD and Nvidia comes along. Not that impressed with their latest offerings, and most games work with CFX (Deus Ex aside), so the 290x's do the job for the time being.
> 
> Side note - you should use the rigbuilder in the right corner, so people can see what your PC is. Makes conversations around here much better.


Speaking for myself, the 1080 is the best video card I've ever used. It's what I wanted the Fury X to be, had DX11 not gotten in the way.

I actually think the 390X is AMD's best video card ever. That thing makes short work of most of Nvidia's lineup.


----------



## Zero989

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Speaking for myself, the 1080 is the best video card I've ever used. It's what I wanted the fury x to be, had dx11 not got in the way.
> 
> I actually think 390x is AMDs best video card ever. That thing makes short work of most of Nvidia's line up.


Indeed, how the tables have turned with AMD's latest driver.


----------



## Jay-Sharp

While that is a pretty picture, it is in line with what the performance should have been for a while now. Also, does it take Maxwell overclocking into account?

The 980 can gain 20% quite easily, as can the 970.

Don't forget that GCN has always had more theoretical performance than its counterparts. In a perfect world the 390X would match a stock 980 Ti.


----------



## the9quad

Quote:


> Originally Posted by *Jay-Sharp*
> 
> Speaking for myself, the 1080 is the best video card I've ever used. It's what I wanted the fury x to be, had dx11 not got in the way.
> 
> I actually think 390x is AMDs best video card ever. That thing makes short work of most of Nvidia's line up.


Not worth the effort/money to go from 290X's to 390X's, to be honest (more RAM and a slight clock bump that doesn't translate into much of a performance increase). Also not worth it to me to go from CFX 290X's to a 1080, since for 90% of games CFX is more than enough. I'll wait until the next set of cards they both come out with.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Not worth the effort/money to go from 290x's to 390x's to be honest (more ram and a slight clock bump that doesnt translate to all that much of a performance increase). Also not worth it to me to go to a 1080 from CFX 290x's, since 90% of games that is more than enough. I'll wait til the next set of cards they both come out with.


Same boat here. I want at least an upgrade over 2 x 290X, even though CFX might not work in everything.


----------



## Klocek001

what a mess........


----------



## sumitlian

In my opinion, an OCN user showing detailed proof for something in here is always more reputable and reliable than random internet slides. You can also discuss the benchmark methods and ask for any related technical details if you want. I'm not saying the internet is full of bull, but many times users see a very different performance picture on their configuration than online reviewers do.

Thanks for the data the9quad.


----------



## Klocek001

Most of those benches were BS, you're correct. G3D, TPU, and most of the others used the integrated benchmark. PurePC and PCGH were the only ones that tested with a savegame, and suddenly it turns out a 1070 in DX11 is indeed faster than a Fury X in DX12, which leads me to believe a custom 980 Ti would kick its butt too.
Users' benches tend to be more particular and detailed, yet they don't compare cards.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> Most of those benches were BS, you're correct. G3D, TPU, most of them used the integrated benchmark. PurePC and PCGH were the only ones that tested with a savegame, and suddenly it turns out a 1070 DX11 is indeed faster than Fury X in DX12, which leads me to believe a custom 980Ti would kick its butt too.
> User's benches tend to be more particular and detailed, yet they don't comapre cards.


They also don't take the time to give you any frametime data at all. They don't overlay the graphs to check whether the cards were even seeing the same scene. So all you end up with is a sketchy min/avg/max FPS bar that may or may not come from cards rendering the same thing. To me that is worthless.

I will take a site like Guru3D, which spent thousands to do proper FCAT testing and which uses the benchmark for consistent results. At the very least, any site should be doing FCAT, or at a bare minimum perfmon. Min/max FPS doesn't say a whole lot; it's only a small part of the picture. Heck, if I went by that, I would be using Crossfire and seeing a solid 60 FPS, but in reality my experience would look like this:










That said, I would definitely take a 1070 over a Fury X any day. I wouldn't doubt the 1070 beating the Fury X in DX12.


----------



## the9quad

You don't see many review sites test in-game for the bench because it makes the bench very subjective. Who really knows what each card is rendering if both aren't rendering the same thing? Hence real sites using the integrated benchmark, because it is the same thing every time: the same engine using the same assets as the game, but predictable and repeatable. Using anything else begs questions about the validity of the results. The only reason to question the integrated benchmark is if you are a tin-hat-wearing nut who thinks AMD and Nixxes have a conspiracy to make Nvidia look bad in the bench.


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> custom 1070s/980Ti's/1080s under DX11 mop the floor with Fury X running DX12, seen two reviews that test in game and both show the same thing.I bet you're gonna discredit pcgh too, I can't wait to see how you would explain fury x matching rx 480


You don't need to sweat that much, as we know AMD hasn't competed in the high-end GPU space for a little more than a year now. The GTX 1070/1080 are indeed the only options available for high-end gaming, and they also have 8 GB of VRAM, so it's best for everyone not to go for a Fury/X now in 2016. What AMD has done well for the majority of gamers is provide mainstream cards like the RX 480/470 at great prices, and those cards keep competing well over time.

For now Nvidia is still winning in the high-end area, no doubt about that.


----------



## Forceman

Quote:


> Originally Posted by *the9quad*
> 
> You don't see many review sites test in game for the bench because it makes the bench very subjective. Who really knows what each card is rendering if both aren't rendering the same thing? Hence real sites using the integrated benchmark, because it is the same thing every time. The same engine using the same assets as in game, but predictable and repeatable. Using anything else begs questions about the validity of the results. Only reason to question the integrated benchmark is if you are a tin hat wearing nut who thinks AMD and nixxes have a conspiracy to make nvidia look bad in the bench.


The problem is there is no way to know how representative the benchmark is of real-world gameplay. We've seen similar things in other games (the original Tomb Raider comes to mind) where the benchmark showed better performance than the game itself. So it's not too hard to imagine that the benchmark could artificially skew results, even if it wasn't done intentionally.

Edit: a different CPU load between the benchmark and real gaming, in particular, could be significant. The benchmark may not be running the AI etc. if it is canned.


----------



## sumitlian

Quote:


> Originally Posted by *the9quad*
> 
> You don't see many review sites test in game for the bench because it makes the bench very subjective. Who really knows what each card is rendering if both aren't rendering the same thing? Hence real sites using the integrated benchmark, because it is the same thing every time. The same engine using the same assets as in game, but predictable and repeatable. Using anything else begs questions about the validity of the results. Only reason to question the integrated benchmark is if you are a tin hat wearing nut who thinks AMD and nixxes have a conspiracy to make nvidia look bad in the bench.


Not to beat a dead horse, but I am wondering why purepc.pl did not enable AA for their tests. Also, why are there no 2K and 4K benchmarks?


----------



## sumitlian

New DX12 patch ?
http://wccftech.com/directx-12-deus-mankind-divided-patch/

Also,
Quote:


> According to the developer, the patch 6 will be applied automatically by Steam the next time you start the game.


----------



## black96ws6

Quote:


> Originally Posted by *sumitlian*
> 
> New DX12 patch ?
> http://wccftech.com/directx-12-deus-mankind-divided-patch/
> 
> Also,


Sometimes it's more fun reading the comments than the actual article lol:


----------



## BradleyW

Quote:


> Originally Posted by *sumitlian*
> 
> New DX12 patch ?
> http://wccftech.com/directx-12-deus-mankind-divided-patch/
> 
> Also,


Came out yesterday. Made no difference for me. DX12 is still a stuttering mess. Back to previous DX11 build.


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> Came out yesterday. Made no difference for me. DX12 is still a stuttering mess. Back to previous DX11 build.


Wish I knew why you and I have pretty much the same system but completely opposite experiences with DX12 in this game.


----------



## Klocek001

It's disappointing how ignorant people can be when they're biased towards one brand... This is the table of contents from the PurePC article:



They tested several in-game locations, all in both DX11 and DX12, on several CPUs. How much time did that take compared to the Guru3D review, which just runs the integrated benchmark on one system setup? The guy from PurePC is one of the most hard-working reviewers out there, and his reviews are always thorough.
Quote:


> Originally Posted by *the9quad*
> 
> Wish I knew why me and you have pretty much the same system, but completely opposite experiences with DX12 in this game.


I don't think we need a tweed hat to solve this mystery....


----------



## Robenger

Quote:


> Originally Posted by *Klocek001*
> 
> This is just disappointing how ignorant people can be when they're biased towards one brand.... This is the table of contents from the purepc article
> 
> 
> 
> They tested in several in-game loactions, all using both dx11 and dx12, on several cpus. How much time did it take in comparison to that Guru3D review that just runs the integrated benchmark on one system setup ? The guy from purepc is one of the most hard-working reviewers out there, and his reviews are always thorough.
> Xuper you're just the worst type of a person to discredit all this time that had to be spent. You're the one who's biased, not the review. Keep those red glasses on an complain that some reviewers don't make enough effort to cover up the fact that DX12 so far is a complete mess from top to bottom.
> I don't think we need a tweed hat to solve this mystery....


Not to nitpick, but it would have been nice to have seen one AMD CPU used.


----------



## SpeedyVT

Quote:


> Originally Posted by *Robenger*
> 
> Not to nitpick, but it would have been nice to have seen one AMD CPU used.


Considering the benefit of DX12 will be seen in machines constrained by the CPU, I question the use of a processor that won't bottleneck the way an AMD chip will.


----------



## Olivon

Quote:


> Originally Posted by *Klocek001*
> 
> This is just disappointing how ignorant people can be when they're biased towards one brand.... This is the table of contents from the purepc article


Thanks for sharing, impressive work from PurePC


----------



## Klocek001

Quote:


> Originally Posted by *Robenger*
> 
> Not to nitpick, but it would have been nice to have seen one AMD CPU used.


Certainly, but that's just one guy doing all the work, and it's a GPU test in the first place. He didn't test AMD vs Intel; he tested AMD vs Nvidia on CPUs from different tiers. Still, his meticulous work is being ridiculed by brand-biased ignorants.

He referenced his previous Mankind Divided review, when he had the final DX11 version but only a pre-release DX12 version. He said he can't even find the words to describe a situation where the final DX12 version runs worse than the pre-release one. AMD gains very little from the use of DX12. Now comes the best part: Nvidia cards had identical performance in DX11 and DX12 modes when he tested the pre-release DX12 version, whereas now the 1070/1080 lose half their FPS in DX12 mode, and the GPU usage is ridiculously low.

I'm sure glad I stayed with an Nvidia card for another year. We'll see how the situation plays out next year, but it seems like I dodged the bullet called DX12 this time.


----------



## ToTheSun!

Quote:


> Originally Posted by *Klocek001*
> 
> I'm sure glad I stayed with a nvidia card for another year, we'll see how the situation plays out next year but seems like I dodged the bullet called dx12 this time.


I bought a 980 Ti to hold me over till the next batch of GPUs, exactly because DX11 is still the most relevant comparison. DX12 implementations have been pretty crappy, and only Doom's Vulkan implementation has been worth the switch from DX11. But that's one game, and, though AMD cards are thoroughly beating their counterparts at it, the Nvidia side still has satisfactory performance and the one high-end card leading the pack.

WHEN DX12/Vulkan is, indeed, relevant, new cards with new architectures will be here. And THEN the consumer will be able to make a more informed decision, based on data more realistic than what some slides told us.


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> This is just disappointing how ignorant people can be when they're biased towards one brand.... This is the table of contents from the purepc article
> 
> 
> 
> They tested in several in-game loactions, all using both dx11 and dx12, on several cpus. How much time did it take in comparison to that Guru3D review that just runs the integrated benchmark on one system setup ? The guy from purepc is one of the most hard-working reviewers out there, and his reviews are always thorough.


Did he mention anything about the stuttering observed on AMD cards in gameplay but not in the benchmark run, or why similar Nvidia cards did not stutter?


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> Did he mention anything about the stuttering observed on AMD cards on gameplay, and not on the benchmark run as well as why similar Nvidia cards did not ?


Not that I can see. Is that an issue?
Quote:


> Originally Posted by *ToTheSun!*
> 
> I bought a 980ti to hold me over till the next batch of GPU's exactly because DX11 is still the most relevant comparison. DX12's implementations have been pretty crappy, and only Doom's Vulkan implementation has been worthy of the switch from DX11. But that's one game, and, though AMD cards are thoroughly beating their counterparts at it, the Nvidia side still has satisfactory performance and the one high-end card leading the pack.
> 
> WHEN DX12/Vulkan is, indeed, relevant, new cards with new architectures will be here. And, THEN, the consumer will be able to make a more informed decision based on data more realistic than what some slides told us.


Doom's Vulkan renderer is where it's at. Amazing to see the improvement on all cards, not only the new ones.
BTW, the Fury X only beats the reference 980 Ti/1070; it loses to custom-cooled ones.







The 1060 gets a proper ass-spanking, though. Still, it runs even 1440p comfortably, so no worries. The hype for AMD and DX12 was tremendous this year; the results are there on paper for us to interpret. I personally feel like I dodged a bullet by not getting two of those discounted Furies/Nanos and going with a single GTX 1080 instead.


----------



## huzzug

Yes. There are users reporting AMD cards stuttering during gameplay using DX12. DX11 seems to work alright for them.


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> Yes. There are users reporting AMD cards stuttering during gameplay using DX12. DX11 seems to work alright for them.


Oh, in DX12 mode; didn't catch that. I suppose that's because DX12 is totally premature right now.

On another note, can DX11 games be ported to Vulkan, or is that possible for OpenGL ones only?


----------



## huzzug

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> Yes. There are users reporting AMD cards stuttering during gameplay using DX12. DX11 seems to work alright for them.
> 
> 
> 
> oh, in dx12 mode. didn't catch that. I suppose that's because dx12 is totally premature now.
> 
> on another note, can dx11 games be ported to vulkan or is it possible for OpenGL ones only?
Click to expand...

I think The Talos Principle was DX11 ported to Vulkan (via a wrapper). It should be possible if the engine allows it and the developer isn't lazy.


----------



## the9quad

Quote:


> Originally Posted by *huzzug*
> 
> Did he mention anything about the stuttering observed on AMD cards on gameplay, and not on the benchmark run as well as why similar Nvidia cards did not ?


There are Nvidia users complaining about stuttering as well, and there are also AMD users who are not experiencing the stuttering. Doesn't seem to be a vendor specific issue.


----------



## Klocek001

Still, it's kind of funny. I got bashed by AMD advocates when I said it would be funny if a 980Ti in DX11 beat a Fury X in DX12 in Mankind Divided. They were like, "no way, you green goblin".







Well, a reference 1070 beats it by a noticeable margin, and a custom 980Ti would beat it by even more. Who's laughing now?








For as long as I can remember, probably back to 2014 when all the DX12 AMD hype boomed, Mankind Divided was supposed to be the showcase title for AMD cards in DX12.
Well, I suppose it needs twenty patches, just like every modern game. They should just polish DX11 and then take their time with DX12.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> This is just disappointing how ignorant people can be when they're biased towards one brand.... This is the table of contents from the purepc article
> 
> 
> 
> They tested in several in-game locations, all using both DX11 and DX12, on several CPUs. How much time did that take compared to the Guru3D review that just runs the integrated benchmark on one system setup? The guy from purepc is one of the most hard-working reviewers out there, and his reviews are always thorough.
> I don't think we need a tweed hat to solve this mystery....


Not sure what you are implying with the tweed hat comment. I have provided hard evidence and frametime logs of my performance. All I have seen from you is pics from subjective benchmarks on the web.

Also, quantity of benches does not equal quality. This isn't 2006; only providing avg/min/max frame rates in a review is kind of lazy, sorry to say it.

Also, this isn't green versus red here, well, for most normal people it isn't, but I guess you can keep laughing since you are green. Yay for you, you win something important.


----------



## kyrie74

Quote:


> Originally Posted by *the9quad*
> 
> Not sure what you are implying with the tweed hat comment. I have provided hard evidence and frametime logs of my performance.
> 
> Also benchmarks with only avg/min/max framerates are not exactly results of "hard work."


I think it was a Sherlock Holmes reference.


----------



## Klocek001

Quote:


> Originally Posted by *the9quad*
> 
> Not sure what you are implying with the tweed hat comment. I have provided hard evidence and frametime logs of my performance.
> 
> Also benchmarks with only avg/min/max framerates are not exactly results of "hard work."


Really? Tested on 4 CPUs, two APIs, and 3 in-game locations. That's not good enough for you? He's too lazy?

Educate yourself.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> Really ? Tested on 4 CPUs, two APIs and 3 in game locations. That not good enough for you? He too lazy ?
> 
> Educate yourself.


I just posted what I thought about that. Quantity doesn't equal quality. Each one of those gameplay benches is not identical to the others. That in and of itself makes a purely MIN/MAX/AVG FPS comparison sketchy at best. Maybe the average represents something, but the min and max in that scenario are completely useless. So he did a lot of "work"... and came up with, at best, a comparison that shows "average" framerates are "generally" in that area between cards.


----------



## sumitlian

Quote:


> Originally Posted by *the9quad*
> 
> Not sure what you are implying with the tweed hat comment. I have provided hard evidence and frametime logs of my performance.
> 
> Also benchmarks with only avg/min/max framerates are not exactly results of "hard work."


It is funny that they didn't reply to your posts, or in other words, that they keep ignoring those frametime/fps slides.

To others:
I am talking about DX11 vs DX12 performance. We can all agree that Nvidia has the fastest card for this game at least, no doubt about that.
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> DX12 runs much better, and actually looks better when it comes to aliasing. Crossfire on the other hand is a mess when it comes to frametimes.
Click to expand...

DX12 runs much better with the 290X, and it looks better when it comes to aliasing. If DX12 on AMD provides better aliasing, then we need to confirm whether Nvidia is providing the same graphics/aliasing quality or not. What if Nvidia reduced graphics quality for more fps?


----------



## Klocek001

Quote:


> Originally Posted by *the9quad*
> 
> I just posted what I thought about that. Quantity doesn't equal quality. Each one of those gameplay benches are not identical. That in and of itself makes a purely MIN/MAX/AVG FPS comparison sketchy at best. Maybe the average would represent something, but the min and max in that scenario are completely useless. So he did alot of "work"....and came up with at best a comparison that shows "average" framerates are "generally" in that area between cards.


Yet you can't seem to notice that the results are actually consistent across his review, and consistent with the ones PCGH did.


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> yet you can't seem to notice that the results are actually consistent, across his review, and also with the ones that pcgh did.


And yet you keep ignoring that he personally is getting much better frametimes, and better fps as well, with the 290X and DX12.


----------



## Klocek001

Quote:


> Originally Posted by *sumitlian*
> 
> And yet you keep ignoring he personally is getting much better frametimes and fps as well with 290X and DX12.


come on, why are you really doing this ?

I guess there's nothing wrong with DX12 then. I mean, he just said there isn't two pages ago; what else does anyone need? Obviously all the in-game reviews are fake.


----------



## huzzug

Maybe it's the patch. After all, this isn't the final DX12 implementation. As it stands now, one must select whichever API works best for one's use scenario.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> yet you can't seem to notice that the results are actually consistent, across his review, and also with the ones that pcgh did.


Ok, simple: you just explain to me how two, three, or four benchmark runs that aren't rendering exactly identical things can prove anything about min or max framerate between cards. At best, all it will tell you is how each card generally performs on average, but those minimums and maximums are totally useless.

At worst, it begs the question of how long particular areas were looked at, and whether those areas happened to be easier or harder for one card to render, skewing the results. Heck, I can load a save file and spend a lot of time looking at the ground, and get better framerates than a 1080 that, using the same save file, spends the whole run looking at a ton of geometry.

I am not saying that is what they did, but I am saying those types of benches are susceptible to those claims. Personally, I think those benches are okay for a "general" idea of performance; I don't think they really say much more than that. I think if you want a true comparison of what the cards do, you need to rely on repeatable benchmarks. You can say those repeatable benchmarks are skewed in favor of one vendor over another, but the same argument, as I showed above, can be made about the benchmarks used by PCGH et al.

Last but not least, you keep acting like people care about Nvidia vs AMD; I am pretty sure in my first post I said I would take a 1070 hands down over a Fury. Like I said, most normal people don't really care about two corporations they don't own stock in. If that is your thing, more power to ya, but I am not interested in that childish fight.

I also said that people should not rely on what some review or some poster says about performance in this game. Different people, even with similar systems, are getting vastly different results. Look at me and Bradley: DX12 is totally screwed up for him and working flawlessly for me. Now, I know you were implying in your previous posts that I am somehow lying about my results, and that is why I attached the frametime.csv from each run. I am pretty sure I am the only person in this thread to take the time to do that, instead of just spouting anecdotal evidence and "eyeballed" results.
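The min/max fragility argument is easy to demonstrate with a quick sketch (hypothetical frame-time traces, not real Deus Ex logs): a single heavy frame in one run craters its "min FPS" while barely moving the average.

```python
# Sketch: why min/max FPS from non-identical runs is fragile, while
# averages (and, better yet, full frametime logs) are more robust.
# Frame times are in milliseconds; the traces are made up.

def fps_stats(frametimes_ms):
    fps = [1000.0 / ft for ft in frametimes_ms]
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    return min(fps), max(fps), avg_fps

# Two "runs" of the same area that render slightly different things:
# run_b hits one heavy 50 ms frame that run_a never did.
run_a = [16.7] * 99 + [20.0]
run_b = [16.7] * 99 + [50.0]

min_a, max_a, avg_a = fps_stats(run_a)
min_b, max_b, avg_b = fps_stats(run_b)

# One spike more than halves the "min FPS" while barely moving the average:
print(round(min_a), round(min_b))  # 50 vs 20
print(round(avg_a), round(avg_b))  # 60 vs 59
```

This is exactly why a frametime.csv tells you more than a min/max/avg table: the spike is visible in the log, but in a summary it masquerades as a huge vendor difference.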


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> I guess there's nothing wrong with dx12 then. I mean he just said there isn't two pages ago, what else does anyone need ? Obviously all the in game reviews are fake.


You missed the AA part. I am quoting again.
Quote:


> DX12 runs much better, and *actually looks better when it comes to aliasing.* Crossfire on the other hand is a mess when it comes to frametimes.


If the AA method in DX12 is better than in DX11 for AMD, and Nvidia's DX12 performance is worse than its DX11 performance, then we might be getting higher-quality AA/graphics with DX12 + AMD than with DX11 + Nvidia. Not saying that it is happening, but it is possible, because we are talking about two different APIs. Who knows what more is being rendered in DX12 mode!? Just saying.


----------



## Klocek001

Quote:


> Originally Posted by *the9quad*
> 
> Heck, I can load a save file up and spend a lot of time looking at the ground, and have better framerates than a 1080 who using the same save file spends the whole run looking at a ton of geometry.
> 
> I am not saying that is what they did


you're just hilarious, dude. there are 40 tests, and in every one that tests AMD cards there's supposedly something happening on screen that doesn't happen in the Nvidia runs. THAT'S the reason.
Quote:


> Originally Posted by *sumitlian*
> 
> we might be getting more quality AA/graphics with DX12 + AMD than DX11 + Nvidia. Not saying that it is happening


lmao

you two are a worthy pair. paranoid much ?


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> you're just hilarious dude. there are 40 tests, in all that test amd cards there's something happening on the screen that doesn't happen on nvidia runs. THAT'S the reason.
> lmao
> 
> you two are a worthy pair. paranoid much ?


You have reading comprehension issues. Actually read what I said like an adult, instead of like some infant who takes things out of context and pretends like he didn't.

I am saying benchmarks like that are susceptible to people saying things like that. Just like you saying the integrated benchmark is some collusion between Nixxes and AMD, and that is why it runs better on AMD. Neither thing is true, but both are susceptible to those claims.

Now, what I did say was that they are not rendering the exact same things, and that the min/max results are useless. They are good for a general view of average FPS, and that is it.


----------



## Klocek001

You two keep writing something totally bogus, then end the whole paragraph with "not saying this is happening...",
and I'm just sitting here trying to comprehend what the hell you actually want to imply with that formula.


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> you're just hilarious dude.
> lmao
> 
> you two are a worthy pair.


He is right; the situation can legitimately arise when a game uses compute + graphics. I saw it in Battlefield 3 back in 2013. My friend's 2GB GTX 670, OCed to ~1150 MHz, was getting around 45 fps in Operation Firestorm and Caspian Border during heavy bombardment, while a mid-range, default-clocked 1GB HD 7790 ran the same scenarios at ~50-55 fps at the same graphics settings. There were times when the GTX 670 had the higher max and average, but many times, depending on the load, the HD 7790 provided much better fps.

My point is that Battlefield 3 used Microsoft's DirectCompute alongside its DX11 graphics implementation, which Nvidia hardware didn't handle well back then. These things are sometimes not shown by reviewers. The same thing might be happening with today's games if they use the compute path, since we all know DX12's concurrent compute + graphics is still not fully supported by any Nvidia GPU.

Quote:


> _Developer DICE used DirectCompute to accelerate the processing of non-shadowed lights within Battlefield 3. This process involves dividing the screen into tiles, then analyzing which lights are illuminating which tiles. This, in turn, helps establish per-pixel lighting by narrowing down the lights applicable to each tile.
> 
> "*DirectCompute allows this effect to happen at very fast performance because it's heavily parallelizable*," says DICE rendering architect Johan Andersson. "For each tile of 8x8 pixels, a full kernel of work elements are executing the required tasks simultaneously. Several tiles worth of work can be running on the multiple compute units present in modern GPUs at the same time. You can see more of this technique used in AMD's "Leo" demo, which accompanied the Radeon HD 7970 launch. According to AMD, the Leo demo uses similar culling and calculations to those done by DICE, only the work is processed even more quickly by a forward-rendering engine.
> ...
> *Battlefield 3 still relies heavily on compute shaders* for most of its effects and has to devise work-arounds to cope with shader limitations. For instance, the bilateral upsampling compute shader is used to accelerate the rendering of selected screen-space techniques.
> 
> "It's important to test in an area where there actually are a lot of light sources, such as in the subway of the Metro MP map," says AMD's Neal Robison. "The compute path is not there to accelerate the average or best-case performance situations, but the worst areas where we have the most light sources. Though as it is active all the time, it actually reduces performance due to extra overhead in scenes where you only have a couple of light sources."_


http://www.tomshardware.com/reviews/directcompute-opencl-gpu-acceleration,3146-4.html
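The tiled approach DICE describes can be sketched roughly as follows. This is a CPU-side toy illustration, not DICE's actual DirectCompute kernel; the function name and light layout are invented for the example:

```python
# Toy sketch of tiled light culling as described in the quote above:
# split the screen into tiles, then record which lights can illuminate
# each tile, so per-pixel shading only considers that short list.
# The real thing runs as a compute kernel on 8x8-pixel tiles on the GPU.

TILE = 8  # pixels per tile edge

def cull_lights(width, height, lights):
    """lights: list of (cx, cy, radius) point lights in screen space.
    Returns {(tx, ty): [indices of lights affecting that tile]}."""
    tiles = {}
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            hits = []
            for i, (cx, cy, r) in enumerate(lights):
                # Closest point on the tile rectangle to the light centre
                nx = min(max(cx, tx), tx + TILE)
                ny = min(max(cy, ty), ty + TILE)
                if (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r:
                    hits.append(i)
            tiles[(tx, ty)] = hits
    return tiles

# A tiny 32x16 "screen" with two small lights: each tile ends up with a
# short per-tile light list instead of having to test every light.
tiles = cull_lights(32, 16, [(4, 4, 6), (28, 12, 3)])
print(tiles[(0, 0)])   # only light 0 reaches the top-left tile
print(tiles[(24, 8)])  # only light 1 reaches the bottom-right tile
```

On the GPU each tile's loop runs in parallel, which is why the workload scales so well with many light sources (and why it adds overhead in scenes with only a couple of lights, as the quote notes).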


----------



## sumitlian

Quote:


> Originally Posted by *Klocek001*
> 
> you two keep writing sth totally bogus, then end the whole paragraph with "not saying this is happening.."
> and I'm just here sitting and trying to comprehend what the hell do you actually wanna imply with that formula.


Calm down bro








You are misunderstanding me. PurePC is surely not using the built-in benchmark, but we don't know how many different situations in each map they covered, or how long they ran the tests. They could simply be doing nothing more than letting the game load and standing still for each GPU test on each map.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> you two keep writing sth totally bogus, then end the whole paragraph with "not saying this is happening.."
> and I'm just here sitting and trying to comprehend what the hell do you actually wanna imply with that formula.


Read what I posted slowly until you understand each word.

Not sure what else I can say.

When someone takes a save file and runs around an area in a game, and does that several times testing different cards, they are not going to render exactly the same thing each time. Do you understand that much so far?

In the best-case scenario they will be close; in the worst case they can be the extreme example I stated (staring at the floor versus staring at geometry). That leaves those benches open to criticism. Likewise, the integrated benchmark, because this is an AMD game, is open to criticism that it favors AMD. Do you understand that?

Now, I said that I don't believe the reviewers deliberately skewed results, and I also don't wear a tinfoil hat and think AMD and Nixxes colluded to make the benchmark favor AMD.

So what you are left with is an integrated benchmark that performs the same every time versus a benchmark that doesn't run the same way every time. I think the integrated one is better, since the tests are exact and repeatable. I think the min/max results in the other method of testing are useless; they only give a general sense of what each card does in average FPS. One thing is undeniable: you can't skew the integrated bench, it's always the same.


----------



## huzzug

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klocek001*
> 
> you two keep writing sth totally bogus, then end the whole paragraph with "not saying this is happening.."
> and I'm just here sitting and trying to comprehend what the hell do you actually wanna imply with that formula.
> 
> 
> 
> Read what I posted slowly until you understand each word.
> 
> Not sure what else I can say.
> 
> When someone takes a save file and runs around an area in a game, and does that several times testing different cards, they are not going to render exactly the same thing each time. Do you understand that much so far?
> 
> In the best case scenario they will be close, in the worst case scenario they can be the extreme example I stated (staring at the floor versus geometry). That leaves the benches open to criticism. Just like the integrated benchmark because it is an AMD game, is open to criticism that it favors AMD. Do you understand that?
> 
> Now I said, that I don't believe that the reviewers deliberately skewed results, and I also don't wear a tinfoil hat and think AMD and Nixxes colluded to make the benchmark favor AMD.
> 
> So what you are left with is an integrated benchmark that performs the same every time compared to a benchmark that doesn't run the same way every time. I think the integrated is better since the tests are exact and repeatable. I think the Min/max results in the other method of testing are useless, and they only give a general sense of what each card does in average FPS. One thing is undeniable, you cant skew the integrated bench its always the same.
Click to expand...

Pssst... just say the 3 golden words and you'll end this discussion: "GTX1070 > FuryX".


----------



## Klocek001

I just like comparing cards







Kind of pointless now; Nvidia has had DX11 in check since Maxwell, and DX12 is pretty much crawling, since DX11 mode runs faster and more stably in most cases.

lol, "useless", get out of here man.


----------



## huzzug

Yes, DX12 is crawling due to it still being a beta implementation. One can take an avg fps win for AMD or Nvidia and claim one is better than the other, but there may still be issues not yet patched, or code that does nothing on screen for either brand and just wastes GPU or CPU cycles. Best we wait this one out and continue to discuss RotTR or Doom.


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> lol, "useless", get out of here man.


I guess I will just repeat this since you ignored it:
Quote:


> Ok, simple: you just explain to me how two, three, or four benchmark runs that aren't rendering exactly identical things between each other can prove anything about _min or max framerate_ between cards. At best all it will tell you is how generally each card performs on _average_, but those _minimums and maximums_ are totally useless.


Who knows maybe you took some weird experimental math in some other universe where that will prove something, so I am interested in hearing it.


----------



## tpi2007

I have a hard time believing that the DX 12 mode will leave Beta status tomorrow. I mean, sure, on paper, but performance-wise?

Anyway, the DX 12 implementation up to now is mostly a farce, with games being made with DX 11 and then sprinkled with DX 12 after the fact. In many ways worse than the DX 9 / 10 -> DX 11 transition, because there is no graphical fidelity gain to be had.

Anyone remember that misleading publicity that Microsoft put out with screenshots of Deus Ex's DX 11 vs DX 12 in the original "The Power of DirectX 12" video in March? The video that they had to pull a few days later in order to edit out the cancelled Fable Legends from the line-up? Interestingly they also removed the ridiculous Deus Ex comparison (see below), and we now know that the DX 12 path is identical in graphical fidelity to the DX 11 one.

But in the history books, when you look at the news, it was the first video that got all the tech sites' attention, including this (link to the second, edited video here):




DX 12: the API that the market will only truly be ready for by late next year (and then some).

Just like DX 11.

And DX 10 (well, this was never really a thing as most just jumped to DX 11).

And DX 9.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tpi2007*
> 
> I have a hard time believing that the DX 12 mode will leave Beta status tomorrow. I mean, sure, on paper, but performance-wise?
> 
> Anyway, the DX 12 implementation up to now is mostly a farce, with games being made with DX 11 and then sprinkled with DX 12 after the fact. In many ways worse than the DX 9 / 10 -> DX 11 transition, because there is no graphical fidelity gain to be had.
> 
> Anyone remember that misleading publicity that Microsoft put out with screenshots of Deus Ex's DX 11 vs DX 12 in the original "The Power of DirectX 12" video in March? The video that they had to pull a few days later in order to edit out the cancelled Fable Legends from the line-up? Interestingly they also removed the ridiculous Deus Ex comparison (see below), and we now know that the DX 12 path is identical in graphical fidelity to the DX 11 one.
> 
> But in the history books, when you look at the news, it was the first video that got all the tech sites' attention, including this (link to the second, edited video here):
> 
> 
> 
> 
> DX 12: the API that the market will only truly be ready by late next year (and then some).
> 
> Just like DX 11.
> 
> And DX 10 (well, this was never really a thing as most just jumped to DX 11).
> 
> And DX 9.


I am waiting for BF1 to see a good DX12 port. It will take years to see a true DX12 game. At least Vulkan is good with Doom. You can't fix bad development with DX12/Vulkan.


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am waiting for BF1 to see a good DX12 port. Will take years to see true DX12 game. At least Vulkan is good with Doom. You cant fix bad development with DX12/Vulkan


There is already one
http://www.ashesofthesingularity.com/


----------



## Klocek001

Quote:


> Originally Posted by *PontiacGTX*
> 
> There is already one
> http://www.ashesofthesingularity.com/




No offence, but what I think he meant is not just a game that's technically a showcase of DX12, but a proper GAME that's created with DX12 from the ground up.


----------



## huzzug

Not going to happen as long as Win7 stays the dominant OS in the market.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *huzzug*
> 
> Not going to happen as long as Win7 stays a dominant OS in the market


That's why I'm pretty much 100% on team Vulkan right now. I would rather see Vulkan be the future dominant API to be honest. This whole idea of locking a gamer down to a particular OS to be able to play modern games in a new API is not why we game on PC in the first place. Vulkan doesn't force users to use Windows 10 and that's the kind of mentality we need in this type of market.


----------



## Mach 5

I hate to say it, but Mantle was meant to be the next big thing, and you can count the number of games that support it on your fingers and toes.

I can't see Vulkan being any different.


----------



## Artikbot

Mantle was turned into Vulkan shortly after it was released to the world.


----------



## PontiacGTX

Quote:


> Originally Posted by *Klocek001*
> 
> Don't take offence, what I think he meant is not just a game that's techically a showcase of dx12, but a proper GAME that's created with dx12 from grounds up.


AotS is a game. Just because people don't play RTS games doesn't mean it's just a DX12 demo, unlike all those videos from Microsoft and Nvidia showing features that haven't been implemented at all, or the Fable game that never released and only served as a biased gimmick. Ashes of the Singularity is the only game that can be qualified as a proper DX12 game.


----------



## black96ws6

I think if you want to see how AMD and Nvidia cards will do in future games with DX12 and Vulkan, AOTS and Doom are probably your best examples.



If Vega is a 20% improvement over Fury X, that would mean 50fps in AOTS DX12, on par with a stock 1080 FE.

If Vega is a 30% improvement over Fury X, that would mean 54fps in AOTS DX12, it would beat a 1080 FE and run even or slightly fall behind stock 1080 AIB cards.

Then it's just a question of, how much can you OC the Vega chip? And how well does the performance with OC scale?

You can pretty much OC all 1080's to 2000+Mhz, regardless of type...
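For reference, the projection above can be checked with a couple of lines. The ~41.7 fps Fury X baseline is not a measured figure here; it is just back-calculated from the post's own 20%/30% numbers:

```python
# Back-of-envelope check of the Vega projection above. The Fury X
# baseline (~41.7 fps in AOTS DX12) is inferred from the post's own
# "20% -> 50 fps" claim, not measured.
fury_x_fps = 41.7

vega_plus_20 = fury_x_fps * 1.20
vega_plus_30 = fury_x_fps * 1.30

print(round(vega_plus_20))  # ~50 fps: on par with a stock 1080 FE
print(round(vega_plus_30))  # ~54 fps: ahead of a 1080 FE
```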


----------



## black96ws6

What's even more interesting is this second graphic:



You can see the Fury X has gained FPS compared to when the 1080 came out with the later game and driver patches.

Also, you can see Nvidia did improve things with Pascal; it definitely handles DX12/async better than Maxwell. The 980Ti loses performance compared with DX11, while the 1070 holds steady.

I think at this point, unless you get a really good deal on a 980Ti, the 1070 is the better option future-wise. I would actually say a Fury X, if you can get one for $350 or so, is the best bang-for-the-buck option, but that 4GB will probably end up causing issues in the future, so I guess it depends on how long you plan on keeping it...

And obviously, if money is secondary and performance is all that matters to you, get a 1080 or Titan XP.


----------



## black96ws6

You know what else is weird?

How can 2 different "legitimate" sites test the exact same benchmark using the exact same settings and come up with different results???

The above benchmarks from HardOCP were done at the end of May.

Then, you have an August review, 3 months later, from Hardware Canucks, and Nvidia cards are all back on top again in DX12? I mean ALL of them, even the 980ti?

So driver updates from Nvidia? Biased site? Combo of both?



http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-11.html


----------



## kyrie74

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> That's why I'm pretty much 100% on team Vulkan right now. I would rather see Vulkan be the future dominant API to be honest. This whole idea of locking a gamer down to a particular OS to be able to play modern games in a new API is not why we game on PC in the first place. Vulkan doesn't force users to use Windows 10 and that's the kind of mentality we need in this type of market.


Did you say the same thing when D3D11 was released along with Windows7? I seem to remember people complaining that they had to upgrade to 7 from XP to play new DX11 games.


----------



## boredgunner

Quote:


> Originally Posted by *kyrie74*
> 
> Did you say the same thing when D3D11 was released along with Windows7? I seem to remember people complaining that they had to upgrade to 7 from XP to play new DX11 games.


There were fewer complaints, however, for several reasons:

- Windows 7 was received better (especially by enthusiasts) than Windows 10. Hardly anyone favored XP.
- DX11 brought tremendous graphical improvements, which enticed most gamers.
- There wasn't an open-standard equivalent, or at least not one people could see. OpenGL did not seem capable of measuring up to DX11's graphical improvements, unlike now, where Vulkan is simply an open-standard alternative to DX12 with better compatibility.


----------



## Klocek001

Quote:


> Originally Posted by *huzzug*
> 
> Did he mention anything about the stuttering observed on AMD cards on gameplay, and not on the benchmark run as well as why similar Nvidia cards did not ?


Quote:


> Originally Posted by *huzzug*
> 
> Yes. There are users reporting AMD cards stuttering during gameplay using DX12. DX11 seems to work alright for them.


Frametime comparison: the pclab review came out today, and it seems that's a fact. DX11 is nice for both; DX12 stutters on AMD.





compared on one graph


----------



## ZealotKi11er

Quote:


> Originally Posted by *black96ws6*
> 
> You know what else is weird?
> 
> How can 2 different "legitimate" sites test the exact same benchmark using the exact same settings and come up with different results???
> 
> The above benchmarks from HardOCP were done at the end of May.
> 
> Then, you have an August review, 3 months later, from Hardware Canucks, and Nvidia cards are all back on top again in DX12? I mean ALL of them, even the 980ti?
> 
> So driver updates from Nvidia? Biased site? Combo of both?
> 
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-11.html


Yeah AMD had a lead in AoTS but now it has lost it even in DX12. What is going on? Did the game just remove ASync or something?


----------



## EastCoast

Quote:


> Originally Posted by *black96ws6*
> 
> I think if you want to see how AMD and Nvidia cards will do in future games with DX12 and Vulkan, AOTS and Doom are probably your best examples.
> 
> 
> 
> If Vega is a 20% improvement over Fury X, that would mean 50fps in AOTS DX12, on par with a stock 1080 FE.
> 
> If Vega is a 30% improvement over Fury X, that would mean 54fps in AOTS DX12, it would beat a 1080 FE and run even or slightly fall behind stock 1080 AIB cards.
> 
> Then it's just a question of, how much can you OC the Vega chip? And how well does the performance with OC scale?
> 
> You can pretty much OC all 1080's to 2000+Mhz, regardless of type...


Which shows how inefficient Pascal is. Perhaps some simply like seeing 2000MHz+ GPUs; I prefer good performance without having to redline the GPU to get it. If a typical 1070/1080 can overclock to about 2000MHz, give or take 200MHz, then it should be beating the Fury X by 20+ fps. We are talking about a roughly 350MHz-700MHz higher clock here. No, the architectures are not the same, and that's not the point. It's clear that the Pascal arch is more of a brute-force effort at better performance, and should therefore be the superior choice in games like this. Yet we aren't seeing performance gains that reflect such a redline approach to OCs. Your own example couldn't muster up a double-digit advantage.

Sure, you might see hits and misses in GameWorks titles, but it takes a complete Nvidia gaming ecosystem to do it, which then makes the brute-force/redline OCs pointless.


----------



## jmcosta

Ashes of the Singularity is probably the only game that has big gains with DX12, but the game sucks; the AI is so badly coded. I bet they could make the AI just as efficient under DX11.
It often gets stuck and shows poor responsiveness against the other team. I've never seen an RTS with an AI this dumb, lol.
I've played many RTS games with almost the same number of units that perform much better than this.

Something fishy there...


----------



## Woundingchaney

Quote:


> Originally Posted by *EastCoast*
> 
> Which shows how inefficient pascal is. Perhaps one simply like seeing 2000MHz +/- GPU's. I prefer good performance without having to redline the gpu in order to get it. If a typical 1070/1080 can overclock to about 2000MHz +/- give or take +/- 200Mhz or so then it should be beating FuryX by give or take 20+ FPS. We are talking about roughly 350Mhz-700MHz higher overclock here. No, the arch are not the same and that's not the point. It's clear that the Pascal arch is more of a brute force effort to better performance. Therefore, should be the superior choice when it comes to games like this. Yet we aren't seeing those performance gains that reflect such a redline approach to OC's. Your own example couldn't mustard up a double digit advantage.
> 
> Sure, sure, you might see a hit/miss on gamework titles but it takes a complete nvidia ecosystem to gaming to do it which then makes the brute force/redline OC's pointless.


What are you considering "redlining" a video card? A moderate clock increase of 200MHz, which is approx. 13% over stock boost performance, wouldn't suggest redlining.


----------



## EastCoast

Quote:


> Originally Posted by *Woundingchaney*
> 
> What are you considering "redlining" a video card? A moderate clock increase of 200MHz, which is approx. 13% over stock boost performance, wouldn't suggest redlining.


Redlining: a GPU that requires an abnormally high delta in clock rate to be competitive with or beat a competing product, while showing very little linear correlation between that abnormal OC delta and actual performance. *Revealing diminishing returns*. However, because it is still competitive or still beating the competing product, this is largely *ignored*.

Example:
GPU A has an overall stable clock rate of 2000MHz.
GPU B has an overall stable clock rate of 1300MHz.
The delta between the two is 700MHz core clock.
GPU A, regardless of the different archs used, beats GPU B due to a brute-force approach. However, the frame-rate lead is at times less than double digits, or well below linear performance expectations for such a massive OC... but GPU A still wins. So *diminishing returns* are ignored because GPU A still won.
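The diminishing-returns point above can be sanity-checked with a quick performance-per-MHz calculation. The FPS figures below are made-up placeholders, not benchmark results:

```python
# Hypothetical numbers for illustration only -- not measured results.
def perf_per_mhz(fps: float, clock_mhz: float) -> float:
    """Frames per second delivered per MHz of core clock."""
    return fps / clock_mhz

# GPU A: high-clock "brute force" card; GPU B: lower-clocked competitor.
fps_a, clock_a = 70.0, 2000.0
fps_b, clock_b = 62.0, 1300.0

eff_a = perf_per_mhz(fps_a, clock_a)  # 0.035 FPS/MHz
eff_b = perf_per_mhz(fps_b, clock_b)  # ~0.0477 FPS/MHz

# GPU A wins on raw FPS, but extracts less performance from each MHz:
print(eff_b > eff_a)  # True
```

With these assumed numbers, the 700MHz clock advantage buys only an 8 FPS lead, which is the "well below linear" outcome being described.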


----------



## Woundingchaney

Quote:


> Originally Posted by *EastCoast*
> 
> Redlining: a GPU that requires an abnormally high delta in clock rate to be competitive with or beat a competing product, while showing very little linear correlation between that abnormal OC delta and actual performance. Revealing diminishing returns. However, because it is still competitive or still beating the competing product, this is largely *ignored*.


So what percent increase in clock rate results in adequate usage of the term redlining? 5-10%? Because many of these cards stock boost to over 1800MHz; for instance, my Titan boosts to over 1800MHz at stock. I'm having a hard time considering a 10% clock rate increase "an abnormally high delta". Moving from 1800 boost to 2000 boost is literally only an 11% increase.

Performance metrics for Nvidia's architecture scale very well with overclocks.
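For reference, the linear-scaling expectation being argued about is simple to compute. The stock FPS figure here is an illustrative assumption:

```python
# What a perfectly linear clock -> FPS relationship would predict
# for an 1800 -> 2000 MHz overclock.
stock_clock, oc_clock = 1800.0, 2000.0
stock_fps = 60.0  # assumed baseline, not a measured result

clock_gain = oc_clock / stock_clock - 1.0      # ~0.111, i.e. an 11.1% OC
expected_fps = stock_fps * (1.0 + clock_gain)  # FPS if scaling were linear

print(f"{clock_gain:.1%}")     # 11.1%
print(round(expected_fps, 1))  # 66.7
```

Anything meaningfully below that ~11% FPS gain is the sub-linear scaling EastCoast is calling diminishing returns; anything near it supports the "scales very well" claim.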


----------



## EastCoast

Quote:


> Originally Posted by *Woundingchaney*
> 
> So what percent increase in clock rate results in adequate usage of the term redlining? 5-10%? Because many of these cards stock boost to over 1800MHz; for instance, my Titan boosts to over 1800MHz. I'm having a hard time considering a 10% clock rate increase "an abnormally high delta". Moving from 1800 boost to 2000 boost is literally only an 11% increase.
> 
> Performance metrics for Nvidia's architecture scale very well with overclocks.


Hmm, I made my edit after you replied, so I'll put it here:

Example:
GPU A has an overall stable clock rate of 2000MHz.
GPU B has an overall stable clock rate of 1300MHz.
The delta between the two is 700MHz core clock.
GPU A, regardless of the different archs used, beats GPU B due to a brute-force approach. However, the frame-rate lead is at times less than double digits, or well below linear performance expectations for such a massive OC... but GPU A still wins. So diminishing returns are ignored because GPU A still won.

The percentages are of little value; it's the actual core clock numbers that are becoming meaningless. At what point do you think the numbers become meaningless? When we get to 3000MHz, 4000MHz, etc.?
We can easily calculate the difference between a 3500MHz stock and a 4000MHz OC and get a percentage. Again, it's meaningless at that point.

Here is something to ponder. We've been down this road before, albeit never with clock rates this high. I seriously doubt that Volta will be anywhere near 2000MHz, and it will show improvements over Pascal at a much lower clock rate. It's history repeating itself. Call it tick/tock applied to clock rates.


----------



## criminal

Quote:


> Originally Posted by *EastCoast*
> 
> Hmm, I made my edit after you replied, so I'll put it here:
> 
> Example:
> GPU A has an overall stable clock rate of 2000MHz.
> GPU B has an overall stable clock rate of 1300MHz.
> The delta between the two is 700MHz core clock.
> GPU A, regardless of the different archs used, beats GPU B due to a brute-force approach. However, the frame-rate lead is at times less than double digits, or well below linear performance expectations for such a massive OC... but GPU A still wins. So diminishing returns are ignored because GPU A still won.
> 
> The percentages are of little value; it's the actual core clock numbers that are becoming meaningless. At what point do you think the numbers become meaningless? When we get to 3000MHz, 4000MHz, etc.?
> We can easily calculate the difference between a 3500MHz stock and a 4000MHz OC and get a percentage. Again, it's meaningless at that point.
> 
> Here is something to ponder. We've been down this road before, albeit never with clock rates this high. I seriously doubt that Volta will be anywhere near 2000MHz, and it will show improvements over Pascal at a much lower clock rate. It's history repeating itself. Call it tick/tock applied to clock rates.


Nvidia did something to Pascal to allow such high clock speeds, but why does it matter if the performance is there? I don't really care that it takes a GTX 1070 at 2000+ MHz to compete with or surpass a card running lower clock speeds (Fury X or 980 Ti). Pascal isn't being "redlined" as you want to call it. When you redline an engine, you potentially damage it if you keep running it like that. No damage is being done to Pascal running these high clock speeds, because power usage, voltage and temps are still relatively low.


----------



## Woundingchaney

Quote:


> Originally Posted by *EastCoast*
> 
> Hmm, I made my edit after you replied, so I'll put it here:
> 
> Example:
> GPU A has an overall stable clock rate of 2000MHz.
> GPU B has an overall stable clock rate of 1300MHz.
> The delta between the two is 700MHz core clock.
> GPU A, regardless of the different archs used, beats GPU B due to a brute-force approach. However, the frame-rate lead is at times less than double digits, or well below linear performance expectations for such a massive OC... but GPU A still wins. So diminishing returns are ignored because GPU A still won.
> 
> The percentages are of little value; it's the actual core clock numbers that are becoming meaningless. At what point do you think the numbers become meaningless? When we get to 3000MHz, 4000MHz, etc.?
> We can easily calculate the difference between a 3500MHz stock and a 4000MHz OC and get a percentage. Again, it's meaningless at that point.


I still don't understand how core clock rates correspond to the term "redlining", particularly when those clocks are stock.
Quote:


> red·line
> ˈredˌlīn/
> NORTH AMERICANinformal
> verb
> gerund or present participle: redlining
> 1.
> drive with (a car engine) at or above its rated maximum rpm.
> "both his engines were redlined now"


If one card is rated at 1800 and the other card is rated at 1300, neither card is "redlining" when it operates under those clocks. You could make an argument for redlining when comparing OC versus stock, but that isn't often done (unless of course one GPU has no OC headroom). It's also important to note that you are looking at one specific benchmark, yet there is a considerable number of benchmarks that show the performance difference you'd expect.

http://www.anandtech.com/bench/product/1720?vs=1714

Additionally, you are talking about an architecture that is only just now starting to be leveraged, to a very limited extent and in limited circumstances. No one is saying that AMD doesn't have a good solution for DX12, but it's extremely limited in today's games. So while you may prefer a lower-clocked architecture, you are referring to an architecture that has been under-utilized for years and still has only marginal relevance in current games.


----------



## Klocek001

Are we talking 2000MHz out of the box? Because OC'd 1080s can run over 2150MHz even on air; the 1080 Classified does. 2000MHz is really nothing special for a Pascal card, so I don't think that's redlining it at all. Custom 1060s run at almost 2GHz out of the box while not consuming much more power than the FE.


Spoiler: Warning: Spoiler!







same for custom 1080s


Spoiler: Warning: Spoiler!







*THIS* is redlining


Spoiler: Warning: Spoiler!






Same Fiji core and memory config, but the Fury X consumes twice as much power for a ~10% performance gain.
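A rough perf-per-watt check of that claim. The FPS and wattage figures are assumed for illustration, and the comparison card is presumably the lower-power Fiji part (e.g., the Nano):

```python
# Same Fiji silicon: one card drawing ~2x the power for ~10% more FPS.
# All numbers below are illustrative assumptions, not measurements.
nano_fps, nano_watts = 50.0, 175.0
furyx_fps, furyx_watts = 55.0, 350.0   # +10% FPS, 2x power

nano_eff = nano_fps / nano_watts       # ~0.286 FPS/W
furyx_eff = furyx_fps / furyx_watts    # ~0.157 FPS/W

print(round(nano_eff / furyx_eff, 2))  # 1.82
```

Under those assumptions the lower-clocked card is nearly twice as efficient per watt, which is why pushing the same silicon that hard is a fair example of "redlining".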


----------



## PontiacGTX

Quote:


> Originally Posted by *black96ws6*
> 
> You know what else is weird?
> 
> How can 2 different "legitimate" sites test the exact same benchmark using the exact same settings and come up with different results???
> 
> The above benchmarks from HardOCP were done at the end of May.
> 
> Then, you have an August review, 3 months later, from Hardware Canucks, and Nvidia cards are all back on top again in DX12? I mean ALL of them, even the 980ti?
> 
> So driver updates from Nvidia? Biased site? Combo of both?
> 
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-11.html


It is the same methodology that said Kepler wasn't performing worse,
but now Tahiti has gained 15% performance, has it?


----------



## Forceman

Quote:


> Originally Posted by *black96ws6*
> 
> You know what else is weird?
> 
> How can 2 different "legitimate" sites test the exact same benchmark using the exact same settings and come up with different results???
> 
> The above benchmarks from HardOCP were done at the end of May.
> 
> Then, you have an August review, 3 months later, from Hardware Canucks, and Nvidia cards are all back on top again in DX12? I mean ALL of them, even the 980ti?
> 
> So driver updates from Nvidia? Biased site? Combo of both?
> 
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73251-zotac-gtx-1070-amp-extreme-review-11.html


I think it has something to do with how Hardware Canucks gets their numbers for AotS. They don't use the benchmark result numbers, but instead get them somewhere else. They detailed it in one of the reviews, but I can't remember where, or the details of how they are measuring it. Probably in one of their early AotS articles.

Edit: pretty sure some recent drivers/patches did improve Nvidia performance as well, so that's likely still a factor.


----------



## the9quad

I am starting to think that, at least as far as AMD goes, DX12 is not rendering the same as DX11. For one, the DX11 textures are much sharper, and DX11 also has way more aliasing.

Take a look at these two captures; the first is from DX11 and the second is from DX12. Notice the jacket is much sharper (higher res?) in DX11. Also notice the chair legs have way more jaggies than they do in the DX12 shot. Keep in mind it is way more obvious in motion. Is there some shenanigans going on? Back-to-back runs with only the API changed, btw.

*DX11*








*DX12*









Now look at this guys jacket in DX11









versus DX12









easier to see in the originals:
http://www.the9quad.com/images/2016/09/19/DX11-A.png
http://www.the9quad.com/images/2016/09/19/DX12-A.png


----------



## GorillaSceptre

DX12 has been a lot of hot air so far. It was supposed to be an improvement for all PC gamers, async shading aside. Some titles perform a tiny bit better, but it's mostly garbage. Once all the major engines take advantage of it and developers actually have time to do it properly, we'd better see some big improvements.

Vulkan and DOOM have been great, though. I hope it takes off in a big way and becomes the new go-to API, but I think that was more down to id than Vulkan; I bet they would have delivered with DX12 too. It's clear that the main studios have to handle it themselves; port houses like Nixxes either don't have the expertise or the time to deliver.


----------



## BradleyW

Textures are toned down with DX12 at the moment due to the higher VRAM usage. Hitman did the same until they added the option to turn the safeguards off.


----------



## DracoNB

Quote:


> Originally Posted by *the9quad*
> 
> easier to see in the originals:
> http://www.the9quad.com/images/2016/09/19/DX11-A.png
> http://www.the9quad.com/images/2016/09/19/DX12-A.png


Is the FOV different? The DX12 shot is wider. It looks to have more AA on as well which could be why the textures aren't as sharp.


----------



## the9quad

Quote:


> Originally Posted by *DracoNB*
> 
> Is the FOV different? The DX12 shot is wider. It looks to have more AA on as well which could be why the textures aren't as sharp.


FOV is the same; it's just hard to take screenshots in exactly the same spot. I got close enough, lol. AA settings are the same btw, but the output is obviously different.


----------



## DracoNB

Quote:


> Originally Posted by *the9quad*
> 
> FOV is the same; it's just hard to take screenshots in exactly the same spot. I got close enough, lol. AA settings are the same btw, but the output is obviously different.


Ah ok, I thought it was save -> change settings -> load.

Yeah, obviously something is different. Maybe DX12 uses a different AA method, or something extra is running, because there's much less aliasing but an overall blur, similar to FXAA.


----------



## sumitlian

DX11 is looking sharper than DX12. It could be due to the anti-aliasing method used with DX12!?


----------



## vodkapl

Quote:


> Originally Posted by *Mach 5*
> 
> I hate to say it, but Mantel was meant to be the next big thing, and you can count the number of games that support it with your fingers and toes.
> 
> I cant see Vulkan being any different.


Mantle seems to have been the catalyst for Metal and DX12, and it was the foundation for Vulkan. Just because few games used Mantle doesn't mean it was not a big thing.

MANTLE - Not open source, less compatibility (no android, linux and osx)
VULKAN - Open source, supports most if not all platforms

Not only that, Vulkan seems to support a larger GPU range than Mantle and DX12 do.
Mantle became available in early 2015, roughly 6 and 3 years after Windows 7/8 respectively. Windows 10 has been out for less than two years now; time = more market share, and the DX12 exclusivity can hurt them. The significance of this is that if gamers refuse to upgrade to Windows 10 (and they can, and should), they won't lose much, if anything, because Vulkan will become their alternative.
Developers won't have a choice but to use Vulkan if they want better sales.

Finally, Vulkan has MUCH more support than Mantle ever did.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *vodkapl*
> 
> Developers won't have a choice but use Vulkan if they want to get better sales.


This is exactly what I was thinking. In the future, if developers only use DX12, they are cutting themselves out of potential markets, which is obviously hazardous because you are potentially losing tons of sales (just look at the Steam hardware surveys for what kind of computers most people are running). Just like right now, where we have toggles in the graphics options for DX11/DX12 or OpenGL/Vulkan, future games are going to have to offer both DX12 and Vulkan options (or a launcher where you pick one before the game loads). This to me seems like the most reasonable outcome.

It also seems very plausible because Microsoft worked closely with AMD on DX12, and of course Vulkan is basically Mantle, so this should not be difficult for a developer to execute.


----------



## boredgunner

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> This is exactly what I was thinking. In the future, if developers only use DX12, they are cutting themselves out of potential markets, which is obviously hazardous because you are potentially losing tons of sales (just look at the Steam hardware surveys for what kind of computers most people are running). Just like right now, where we have toggles in the graphics options for DX11/DX12 or OpenGL/Vulkan, future games are going to have to offer both DX12 and Vulkan options (or a launcher where you pick one before the game loads). This to me seems like the most reasonable outcome.
> 
> It also seems very plausible because Microsoft worked closely with AMD on DX12, and of course Vulkan is basically Mantle, so this should not be difficult for a developer to execute.


One would hope Vulkan becomes the most common choice. It is different now compared to when DX11 first hit, because DX11 did not have an equal alternative.

EA and Square Enix are going with DX12. Everything Microsoft makes is DX12, obviously. Valve will be going with Vulkan, but they hardly make games, to say the least. Bethesda is in love with Microsoft, so one would expect them to push DX12 even if DOOM/id don't care. Not sure about Ubisoft or Sega, although I know the Total War franchise (which is under Sega) has chosen DX12. I wonder what the other big publishers will go with?


----------



## BiG StroOnZ

Quote:


> Originally Posted by *boredgunner*
> 
> One would hope Vulkan becomes the most common choice. It is different now compared to when DX11 first hit, because DX11 did not have an equal alternative.
> 
> EA and Square Enix are going with DX12. Everything Microsoft is DX12 obviously. Valve will be going with Vulkan, but they hardly make games to say the least. Bethesda is in love with Microsoft so one would expect them to encourage DX12 even if DOOM/id don't care. Not sure about Ubisoft or Sega, although I know the Total War franchise (which is underneath Sega) has chosen DX12. I wonder what other big publishers will be going with?


They can all lock themselves to DX12 if they so please; it doesn't mean the market will respond positively while most gamers are still on Win7/Win8 and in no rush to upgrade to Windows 10, given its current reputation. All the companies you named are going to have to make a choice when their sales numbers roll in and they wonder why the numbers are so low compared to previous years, or why ALL of their games have terrible ratings from actual people.

Right now you can't see it, but I can see it clear as day: the media coverage and message-board conversations fueled by outrage at games having only DX12 support with no Vulkan option. Just look at all the different adventures we have been through over the past couple of years in the gaming and hardware markets (to name a couple: most recently, No Man's Sky, and most memorably, the inaccurate GTX 970 specifications). The internet is evolving, the people are evolving; they are realizing they have a voice and that it matters. With this kind of realization, the possibilities for change are endless.


----------



## boredgunner

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> They can all lock themselves to DX12 if they so please; it doesn't mean the market will respond positively while most gamers are still on Win7/Win8 and in no rush to upgrade to Windows 10, given its current reputation. All the companies you named are going to have to make a choice when their sales numbers roll in and they wonder why the numbers are so low compared to previous years, or why ALL of their games have terrible ratings from actual people.
> 
> Right now you can't see it, but I can see it clear as day: the media coverage and message-board conversations fueled by outrage at games having only DX12 support with no Vulkan option. Just look at all the different adventures we have been through over the past couple of years in the gaming and hardware markets (to name a couple: most recently, No Man's Sky, and most memorably, the inaccurate GTX 970 specifications). The internet is evolving, the people are evolving; they are realizing they have a voice and that it matters. With this kind of realization, the possibilities for change are endless.


Well, for the next few years both DX12 and Vulkan games will also have DX11 and/or OpenGL since even Vulkan only goes back so far with regards to GPUs. By the time games stop having DX11 altogether, I think the market will have moved on to Windows 10. But then again I am a pessimist.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *boredgunner*
> 
> Well, for the next few years both DX12 and Vulkan games will also have DX11 and/or OpenGL since even Vulkan only goes back so far with regards to GPUs. By the time games stop having DX11 altogether, I think the market will have moved on to Windows 10. But then again I am a pessimist.


By "moving on", do you mean people forgetting how ridiculous Windows 10 is, or Microsoft realizing that because Windows 10 is a failure they're going to have to make another OS? Because I'm personally going with the second choice: Windows 11. But I guess that's because I'm quite optimistic about the future on all fronts, including the gaming industry.









I don't think people are going to stop being tired of getting spied on, and I don't see the issue dying out anytime soon. I only see it becoming more provocative to people. Everyone seems to be getting sick and tired of being sick and tired.


----------



## ToTheSun!

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> By moving on do you mean people forgetting about how ridiculous Windows 10 is or Microsoft realizing that because Windows 10 is a failure


I'm all for supporting Vulkan, and i hope DX12 gets absolutely no traction (silly me).

But how is Windows 10 a failure, again?


----------



## moustang

The one reason why DX12 will always win out over Vulkan should be obvious.

How's that Audio support in Vulkan working? Is it as good as DX12?

And if PC game developers are basically forced to use DX12 for audio in all of their Windows PC games, what financial incentive do they have for making their renderer work in Vulkan? And don't even think that multiplatform makes a difference, most developers don't bother supporting Linux or Mac right now, they aren't going to lose anything by not supporting these systems tomorrow.

Just look at the number of games with OpenGL support right now. Why would you expect Vulkan to get any more support than OpenGL?


----------



## deepor

Quote:


> Originally Posted by *moustang*
> 
> The one reason why DX12 will always win out over Vulkan should be obvious.
> 
> How's that Audio support in Vulkan working? Is it as good as DX12?
> 
> And if PC game developers are basically forced to use DX12 for audio in all of their Windows PC games, what financial incentive do they have for making their renderer work in Vulkan? And don't even think that multiplatform makes a difference, most developers don't bother supporting Linux or Mac right now, they aren't going to lose anything by not supporting these systems tomorrow.
> 
> Just look at the number of games with OpenGL support right now. Why would you expect Vulkan to get any more support than OpenGL?


If you are actually asking: there is no audio support at all in Vulkan (or other needed things like input handling). It's purely a graphics API. I think people usually use middleware to deal with audio.

In other news, the game this thread is about is being ported to Mac and Linux, so it will then be one concrete example of a game supporting OpenGL (and Mac and Linux).


----------



## vodkapl

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> ...This to me seems like the most reasonable outcome.


I'd say the same too.
Quote:


> Originally Posted by *boredgunner*
> 
> One would hope Vulkan becomes the most common choice. It is different now compared to when DX11 first hit, because DX11 did not have an equal alternative.


I really believe gamers can contribute to Vulkan's success. Advertising costs money. I have seen Microsoft do two promotions so far for Windows 10, which become promotions for DX12 by proxy: one was the free upgrade, and the second is the "money back" deal with computer purchases. Now, if many people in the gaming corners of the web vouch for Vulkan over DX12 because of its benefits, it could potentially make more developers and companies consider Vulkan.

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> They can all lock themselves to DX12 if they so please; it doesn't mean the market will respond positively while most gamers are still on Win7/Win8 and in no rush to upgrade to Windows 10, given its current reputation. All the companies you named are going to have to make a choice when their sales numbers roll in and they wonder why the numbers are so low compared to previous years, or why ALL of their games have terrible ratings from actual people.
> 
> Right now you can't see it, but I can see it clear as day: the media coverage and message-board conversations fueled by outrage at games having only DX12 support with no Vulkan option. Just look at all the different adventures we have been through over the past couple of years in the gaming and hardware markets (to name a couple: most recently, No Man's Sky, and most memorably, the inaccurate GTX 970 specifications). The internet is evolving, the people are evolving; they are realizing they have a voice and that it matters. With this kind of realization, the possibilities for change are endless.


I am very confident this will happen too, if the W7 and W8 market shares stay strong. That's one thing the PC gaming community could, and maybe should, do to help Vulkan succeed: keep the W7 and W8 market shares strong.
Quote:


> Originally Posted by *moustang*
> 
> The one reason why DX12 will always win out over Vulkan should be obvious.
> How's that Audio support in Vulkan working? Is it as good as DX12?
> And if PC game developers are basically forced to use DX12 for audio in all of their Windows PC games, what financial incentive do they have for making their renderer work in Vulkan? And don't even think that multiplatform makes a difference, most developers don't bother supporting Linux or Mac right now, they aren't going to lose anything by not supporting these systems tomorrow.
> 
> Just look at the number of games with OpenGL support right now. Why would you expect Vulkan to get any more support than OpenGL?


You are potentially right that this can happen, but I think if something is missing, it will be developed. Multiplatform matters because with DX12 it's not just Linux and OSX that are left out in the cold, but also Windows 7 and 8, and Vulkan comes with the bonus of Android compatibility, which is why so many companies are supporting it.
Devs don't support Linux or Mac because both OSes have little market share: Linux, I've heard, has between 1-3%, while OSX is around 5-7%. But currently, when you add up all the OSes that DX12 does not support, it comes to just under 50% of the market share. If that's not a big financial incentive to work with Vulkan, then I don't know what is.

There is also the OpenSK project, which could be one of the projects that fills the gap left by DirectSound.


----------



## KyadCK

Quote:


> Originally Posted by *boredgunner*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BiG StroOnZ*
> 
> This is exactly what I was thinking. In the future if developers of a game only use DX12 they are cutting themselves out of potential markets. Which is obviously hazardous for a game because you are potentially losing tons of sales (just look at the Steam hardware surveys for what kind of computers most people are running). It might really look like in the future just like right now how we have toggles in graphics options for DX11 or DX12 or OpenGL or Vulkan that in the future games are going to have to have DX12 or Vulkan options (obviously with a launcher where you pick which one before the game loads). This to me seems like the most reasonable outcome.
> 
> It also seems very plausible because Microsoft worked closely with AMD with DX12 and of course Vulkan is basically Mantle, this should not be difficult for a developer to execute.
> 
> 
> 
> One would hope Vulkan becomes the most common choice. It is different now compared to when DX11 first hit, because DX11 did not have an equal alternative.
> 
> EA and Square Enix are going with DX12. Everything Microsoft is DX12 obviously. Valve will be going with Vulkan, but they hardly make games to say the least. Bethesda is in love with Microsoft so one would expect them to encourage DX12 even if DOOM/id don't care. Not sure about Ubisoft or Sega, although I know the Total War franchise (which is underneath Sega) has chosen DX12. I wonder what other big publishers will be going with?
Click to expand...

DX12 still has no equal alternative, either. Vulkan has zero audio or input control; it is not a complete package. https://en.wikipedia.org/wiki/DirectX#Components

Even if Vulkan's tools and support are as good as Direct2D, Direct3D, and DirectCompute, it has no control over the rest and is reliant on third-party solutions. There are advantages to owning the majority-share OS.
Quote:


> Originally Posted by *vodkapl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BiG StroOnZ*
> 
> ...This to me seems like the most reasonable outcome.
> 
> 
> 
> I'd say the same too.
> Quote:
> 
> 
> 
> Originally Posted by *boredgunner*
> 
> One would hope Vulkan becomes the most common choice. It is different now compared to when DX11 first hit, because DX11 did not have an equal alternative.
> 
> Click to expand...
> 
> I really believe gamers can contribute to Vulkan success. Advertising costs money. I have seen Microsoft do two promotions so far for Windows 10, which become for DX12 by proxy, one was the free upgrade and the second is "Money Back" deal with computer purchases. Now if many (or more at least) people in gaming corners of web vouch for Vulkan over DX12 because of it's benefits it could potentially make more developers and companies consider Vulkan.
> 
> Quote:
> 
> 
> 
> Originally Posted by *BiG StroOnZ*
> 
> They can all lock themselves to DX12 if they so please, doesn't mean the market will respond positively to it. When most gamers are still on Win7/Win8 and are in no rush to upgrade to Windows 10 based on its current reputation. All these companies you named are going to have to make a choice when their sales numbers are rolling in and they are wondering why the numbers are so low compared to previous years or why ALL of their games have terrible ratings by actual people.
> 
> Right now you can't see it, but I can see it clear as day; the media coverage and message board/forum conversations based on outrages taking place of games having only DX12 support with no Vulkan option. Just look at all the different adventures we have been through over the past couple of years in the gaming market and hardware market (to name a couple; most recently, No Man's Sky or most memorable, inaccurate GTX 970 specifications). The internet is evolving, the people are evolving, they are realizing they have a voice and it matters. With this type of realization the possibilities of change are endless.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> *I am very confident too this will happen if W7 and 8 marketshares stay strong.* That's one thing pc gaming community could, and maybe should, do to help Vulkan succeed; keep W7 and 8 marketshares strong.
> Quote:
> 
> 
> 
> Originally Posted by *moustang*
> 
> The one reason why DX12 will always win out over Vulkan should be obvious.
> How's that Audio support in Vulkan working? Is it as good as DX12?
> And if PC game developers are basically forced to use DX12 for audio in all of their Windows PC games, what financial incentive do they have for making their renderer work in Vulkan? And don't even think that multiplatform makes a difference, most developers don't bother supporting Linux or Mac right now, they aren't going to lose anything by not supporting these systems tomorrow.
> 
> Just look at the number of games with OpenGL support right now. Why would you expect Vulkan to get any more support than OpenGL?
> 
> 
> You are potentially right that this can happen, but I think if something is missing it will be developed. Multiplatform matters because with DX12 it's not just Linux and OSX that are out in the cold but also Windows 7 and 8, and Vulkan comes with the bonus of Android compatibility, which is why so many companies are supporting it too.
> Devs don't support Linux or Mac because both OSes have little market share; Linux, I've heard, has around 1-3% while OSX is around 5-7%. But when you add up all the OSes that DX12 does not support, it comes to no less than 50% of the market share. If that's not a big financial incentive to work with Vulkan then I don't know what is.
> 
> There is also the OpenSK project, which could be one of the projects that fills the gap left by DirectSound.

Ya know what's funny about this comment? XP and DX11.

Otherwise, OpenSK is not even remotely close to feasible yet. It's years away, and isn't relevant now or any time soon.


----------



## vodkapl

Quote:


> Originally Posted by *KyadCK*
> 
> Ya know what's funny about this comment? XP and DX11.
> 
> Otherwise, OpenSK is not even remotely close to feasible yet. It's years away, and isn't relevant now or any time soon.


Why are XP and DX11 funny in the context of my comment? And can't developers use tools other than the ones in DX?


----------



## boredgunner

How many games use DirectSound? Pretty much all use a third party API, like FMOD or XAudio2. But I know Vulkan demands more work while DX12 is a complete package, one of the reasons I am pessimistic about Vulkan's future. I am glad some studios I follow will be using it, such as Croteam. Then again I already have a Windows 10 partition so that I can play DX12 games too.


----------



## KyadCK

Quote:


> Originally Posted by *vodkapl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> Ya know what's funny about this comment? XP and DX11.
> 
> Otherwise, OpenSK is not even remotely close to feasible yet. It's years away, and isn't relevant now or any time soon.
> 
> 
> 
> Why are XP and DX11 funny in the context of my comment? And can't developers use tools other than the ones in DX?

XP had massive share.
DX11 did not work on XP.
DX11 won anyway.

The people saying "If lots of user stay on 7, DX12 won't be a thing" are the ones who were not around when everyone was saying "If we stay on XP, DX11 won't be a thing".

They can. We know DX support and tools are great. id Software says Vulkan tools and support are great. Is anyone speaking up for non-DX audio and input solutions? Are their tools good?


----------



## boredgunner

Quote:


> Originally Posted by *KyadCK*
> 
> They can. We know DX support and tools are great. id Software says Vulkan tools and support are great. Is anyone speaking up for non-DX audio and input solutions? Are their tools good?


Well, this list is pretty big and kept up to date. See how many recent big name titles use DirectSound. I see none.

http://satsun.org/audio/

XAudio2, Wwise, and FMOD are all considered complete packages in their own right (so developers have told me; I never worked at this level of game development). Too bad OpenAL is sort of like the Vulkan of audio APIs: open source, perhaps requiring more work to implement, yet potentially giving the best results by far.
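The point being made in the last few posts, that the audio stack is middleware territory and independent of the renderer, can be sketched in a few lines. Everything below is a hypothetical toy (names like `AudioBackend` and `StubOpenAL` are invented for illustration), not any real engine's or middleware's API:

```python
# Toy sketch (all names hypothetical): engines call audio through a
# middleware-style interface, so the choice of renderer (DX12 vs Vulkan)
# is orthogonal to the choice of audio API (XAudio2, FMOD, OpenAL...).

from abc import ABC, abstractmethod

class AudioBackend(ABC):
    """Minimal stand-in for what audio middleware wrappers expose."""
    @abstractmethod
    def play(self, sound: str) -> str: ...

class StubXAudio2(AudioBackend):      # stands in for Windows-only middleware
    def play(self, sound: str) -> str:
        return f"XAudio2: {sound}"

class StubOpenAL(AudioBackend):       # stands in for cross-platform middleware
    def play(self, sound: str) -> str:
        return f"OpenAL: {sound}"

class Engine:
    """Renderer and audio backend are injected independently of each other."""
    def __init__(self, renderer: str, audio: AudioBackend):
        self.renderer, self.audio = renderer, audio

    def fire_gun(self) -> str:
        return self.audio.play("gunshot.wav")

# A Vulkan renderer does not force an audio choice, and vice versa:
game = Engine(renderer="Vulkan", audio=StubOpenAL())
print(game.fire_gun())   # OpenAL: gunshot.wav
```

In other words, swapping DirectSound out of the picture never forces a renderer decision: the engine only ever sees the interface.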


----------



## sumitlian

On Windows, are Windows Driver Model (WDM) related libraries mandatory for third-party APIs like XAudio, or are third-party APIs completely independent of WDM? I am confused, please help.


----------



## ZealotKi11er

Quote:


> Originally Posted by *sumitlian*
> 
> On Windows, are Windows Driver Model (WDM) related libraries mandatory for third-party APIs like XAudio, or are third-party APIs completely independent of WDM? I am confused, please help.


Not mandatory. That's why Vulkan works in Windows 10 and Windows 7 and Linux and Android...


----------



## sumitlian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not mandatory. That's why Vulkan works in Windows 10 and Windows 7 and Linux and Android...


Thanks. Yes, for graphics I knew that already; I was just confirming whether sound APIs have the same freedom or not.
It was a stupid question, I think.


----------



## vodkapl

Quote:


> Originally Posted by *KyadCK*
> 
> XP had massive share.
> DX11 did not work on XP.
> DX11 won anyway.
> 
> The people saying "If lots of user stay on 7, DX12 won't be a thing" are the ones who were not around when everyone was saying "If we stay on XP, DX11 won't be a thing".
> 
> They can. We know DX support and tools are great. id Software says Vulkan tools and support are great. Is anyone speaking up for non-DX audio and input solutions? Are their tools good?


Well, it's one thing to decide not to switch over to a new OS, but another to maintain that decision for years. Even informing and convincing people not to switch to Windows 10 is a tough challenge. It would require not just consumers promoting Vulkan, but also the resolve not to switch.

I wish I could enlighten you about other tools for sound, but I'm not a developer.


----------



## Mahigan

Quote:


> Originally Posted by *Klocek001*
> 
> Come on, why are you really doing this?
> 
> I guess there's nothing wrong with DX12 then. I mean, he just said there isn't two pages ago; what else does anyone need? Obviously all the in-game reviews are fake.


I am not getting stuttering under DX12 either, and neither are many other users. My frame rate actually improves under DX12 as well, as it does for many users.

That's why you have to think outside the box (something users here are mostly unable to do). Something else is amiss: software? Hardware differences? Who knows?

I postulated possible culprits earlier in this thread but was piled on by the usual know-nothing suspects (people who frequently ignore the art of deductive reasoning).

If some people are getting stuttering while others are not, and both are using the same GPU and drivers, then evidently the issue lies elsewhere in their software/hardware configuration. The same applies to people getting better performance under DX11 than DX12 versus those who get better performance under DX12 than DX11. There is a culprit somewhere, evidently; something outside of drivers, GPU, and game patches.

Yet all I see here are people claiming that one review is "truer" than another because it supports their bias.

Many NVIDIA users are getting stuttering... are you? Probably not. Then why is their experience any less real than yours? What could cause these differences? That's where deductive reasoning (troubleshooting) comes in.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *ToTheSun!*
> 
> I'm all for supporting Vulkan, and I hope DX12 gets absolutely no traction (silly me).
> 
> *But how is Windows 10 a failure, again?*


http://www.overclock.net/t/1601965/wccf-new-low-for-microsoft-removes-the-x-button-from-windows-10-upgrade-dialog
http://www.overclock.net/t/1583423/tt-windows-7-users-arent-budging-despite-free-windows-10-upgrade
http://www.overclock.net/t/1606538/techspot-france-gives-microsoft-three-months-to-stop-windows-10-from-collecting-excessive-user-data
http://www.overclock.net/t/1606541/extremetech-windows-10-gets-one-last-desperate-nagware-update
http://www.overclock.net/t/1604272/eurogamer-woman-awarded-10k-after-suing-microsoft-for-sneaky-windows-10-upgrade
http://www.overclock.net/t/1602317/newsweek-microsoft-s-malware-like-windows-10-automatic-upgrades-cost-anti-poaching-nonprofit-thousands-group-says
http://www.overclock.net/t/1599301/betanews-windows-10-ruins-a-pro-gaming-stream-with-a-badly-timed-update
http://www.overclock.net/t/1607212/pcworld-microsoft-faces-two-new-lawsuits-over-aggressive-windows-10-upgrade-tactics
http://www.overclock.net/t/1569318/ars-even-when-told-not-to-windows-10-just-can-t-stop-talking-to-microsoft
http://www.overclock.net/t/1575181/zdnet-microsoft-tries-to-clear-the-air-on-windows-10-privacy-furor
http://www.overclock.net/t/1579074/forbes-microsoft-admits-windows-10-automatic-spying-cannot-be-stopped
http://www.overclock.net/t/1581859/slashdot-windows-10-fall-update-uninstalls-desktop-software-without-informing-users
http://www.overclock.net/t/1589351/extremetech-microsoft-now-preloading-tripadvisor-bloatware-into-windows-10
http://www.overclock.net/t/1577076/zdnet-windows-10-upgrade-nags-become-more-aggressive-offer-no-opt-out
http://www.overclock.net/t/1578670/ars-tv-windows-10-will-be-automatically-downloaded-to-windows-7-and-windows-8-machines-next-year
http://www.overclock.net/t/1567525/rt-incredibly-intrusive-windows-10-spies-on-you-by-default
http://www.overclock.net/t/1566715/forbes-gamers-should-be-worried-about-windows-10-automatic-updates
http://www.overclock.net/t/1592002/the-hacker-news-windows-10-sends-your-data-5500-times-every-day-even-after-tweaking-privacy-settings
http://www.overclock.net/t/1568055/myce-windows-10-updates-can-disable-pirated-games-and-unauthorized-hardware
http://www.overclock.net/t/1567830/zh-the-surveillance-state-goes-mainstream-windows-10-is-watching-logging-everything
http://www.overclock.net/t/1565400/ars-windows-10-updates-to-be-automatic-and-mandatory-for-home-users

Take a look at public opinion and decide for yourself.

Quote:


> Originally Posted by *vodkapl*
> 
> I'd say the same too.
> 
> I am very confident this will happen too, if W7 and 8 market shares stay strong. *That's one thing the PC gaming community could, and maybe should, do to help Vulkan succeed: keep W7 and 8 market shares strong.*


That's what I will be doing, for sure: playing my part. But the problem is that most gamers will get worried that they can't play a new Microsoft exclusive title (like Forza, for instance) and upgrade to Win 10 just so they can play the game, regardless of whether they hate Windows 10 or any of the Win10 antics. Then of course AMD users will be like, "Well, I'm missing out on potential performance so I HAVE TO UPGRADE to Win 10!" (see what AMD and Microsoft did here...). This is a bit worrisome, but hopefully a good portion of the community stays strong. Just because you are missing out on one game does not mean there isn't another out there that will satisfy your craving.
Quote:


> Originally Posted by *KyadCK*
> 
> XP had massive share.
> DX11 did not work on XP.
> DX11 won anyway.
> 
> The people saying "If lots of user stay on 7, DX12 won't be a thing" are the ones who were not around when everyone was saying "If we stay on XP, DX11 won't be a thing".


I don't recall going from Windows XP directly to Windows 7. There was an intermediary OS that came before it, which you seem to have forgotten: Windows Vista, and that was primarily the reason people wanted to stay on XP. It really had nothing to do with DX11. It had to do with Windows Vista being atrocious.

This time, again, it has nothing to do with DX12. I have no problem with DX12; I have a problem with them locking DX12 down to a nonsense, rubbish, spyware-infested operating system. It's so bad that I had Spybot Anti-Beacon installed on my computer (a Windows 7 machine) because they added telemetry nonsense to Windows 7 through updates. Then, after a few updates slipped through the cracks (ones I did not approve of, which were downloaded and installed automatically even though updates were disabled), Microsoft was able to cut off the internet connection on my computer because I had all the telemetry nonsense disabled. As soon as I deactivated Spybot Anti-Beacon my internet suddenly worked again. Digging deeper, I did indeed find that it was a matter of Windows 10 updates being installed onto my Windows 7 computer. I had to search for the individual executables Microsoft had installed in the Windows folder and remove them by hand in order to get my internet working again with Anti-Beacon installed and enabled at the same time.

The fact that Microsoft has the power to disable your internet because you want to disable telemetry is a violation of personal rights. End of story. It's not like any of this data mining is being used for any good. Is there still violent crime? Is there still terrorism? Is there still evil? Yes. So please explain to me why I need to give up my rights when the spying and the data mining are NOT BEING USED to better the world around me.


----------



## moustang

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Take a look at public opinion and decide for yourself.


https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0

22.9% of all PCs run Windows 10. Only Windows 7, which has been out for seven years, has a higher market share. Windows 10 has more users than every Mac OS, Linux, and Windows 8/8.1 COMBINED.

If that's a failure then surely you must believe that Linux and Mac are abysmal failures since both combined are just barely over 1/4th the userbase of Windows 10.

What's even more telling, though, is WHO isn't adopting Windows 10 yet. It's not gamers, it's businesses. Globally, less than 15% of businesses have adopted Windows 10, which means the vast majority of the Windows 10 userbase are home users. Since businesses typically don't allow gaming during work, I hardly think their slow adoption is relevant to this subject.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *moustang*
> 
> https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0
> 
> 22.9% of all PCs run Windows 10. Only Windows 7, which has been out for seven years, has a higher market share. Windows 10 has more users than every Mac OS, Linux, and Windows 8/8.1 COMBINED.
> 
> If that's a failure then surely you must believe that Linux and Mac are abysmal failures since both combined are just barely over 1/4th the userbase of Windows 10.
> 
> What's even more telling, though, is WHO isn't adopting Windows 10 yet. It's not gamers, it's businesses. Globally, less than 15% of businesses have adopted Windows 10, which means the vast majority of the Windows 10 userbase are home users. Since businesses typically don't allow gaming during work, I hardly think their slow adoption is relevant to this subject.


Did you even bother to look at the public opinion? These statistics might as well be null and void from the get-go, given Microsoft's aggressive tactics to push Windows 10.

A) They offered a free upgrade.
B) Most people found their computers upgraded to Windows 10 without their permission.
C) Many people DID UPGRADE to Windows 10, but shortly after reverted back to 7 or 8.

But none of that even matters; NetMarketShare is a Microsoft partner, end of discussion.

So is Windows 10 increasing market share because people want it, or because Microsoft is forcing it down people's throats? Let's be honest here: you and I both know the answer to that question, because if you read any of the links I posted about the PUBLIC OPINION OR PUBLIC PERCEPTION of the product, you would know those statistics might as well be meaningless given the Windows 10 agenda.

Twenty-one links posted versus your one link, but somehow that is supposed to mean that Windows 10 was welcomed into the market with open arms and everyone has been just thrilled about it.

Yeah, people love it so much that Microsoft is getting sued over it, or people want to sue because they're unhappy with it:

https://www.cnet.com/news/microsoft-pays-woman-10000-over-forced-windows-10-install/
https://www.hackread.com/microsoft-sued-over-windows-10-grade/
http://www.valuewalk.com/2015/11/windows-10-class-action-microsoft/

But in your eyes, because of your precious statistics basically run by Microsoft, everybody loves Windows 10 and it is the next big thing.


----------



## deepor

You need to check out the Steam survey website instead of one doing a general PC survey. That's more interesting because it's purely about gamers.

In the Steam survey, Windows 10 is actually a lot higher. You can also see that fewer people have a DX12/Vulkan-capable card on Windows 7 (plus 8 and Linux) than on Windows 10. This should mean that if someone is deciding between Vulkan and DX12, Windows 10 adoption might not be a problem at all, and not a good argument.

I feel a better argument is simply that Vulkan works and is already pretty close to DX12 in its ideas, so why not use it instead? When you use Vulkan, you might do work that could be useful somewhere else in the future, like on Android or whatnot, while DX12 is pure Windows.
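The reasoning above is really just multiplication: what an API can actually reach is OS share times the share of those users with a capable GPU. A back-of-envelope sketch with purely hypothetical numbers (none taken from any survey) shows the arithmetic:

```python
# Hypothetical back-of-envelope arithmetic: an API's real reach is the share
# of users on a supported OS *times* the share of those users with a capable
# GPU. All percentages below are invented purely for illustration.

def addressable(os_share: float, capable_gpu_share: float) -> float:
    """Fraction of the whole user base an API can actually target."""
    return os_share * capable_gpu_share

# Invented inputs: 50% of gamers on Windows 10, 60% of them with a
# DX12/Vulkan-capable card; 40% on Win7/8/Linux, only 30% of them capable.
dx12_reach = addressable(0.50, 0.60)                 # Windows 10 only
vulkan_reach = dx12_reach + addressable(0.40, 0.30)  # plus the older OSes

print(f"DX12-only reach: {dx12_reach:.0%}")   # DX12-only reach: 30%
print(f"Vulkan reach: {vulkan_reach:.0%}")    # Vulkan reach: 42%
```

With these made-up shares, Vulkan's extra reach over DX12 shrinks as capable GPUs concentrate on Windows 10, which is exactly the point about the survey numbers.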


----------



## DracoNB

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> But none of that even matters; NetMarketShare is a Microsoft partner, end of discussion.


Is Steam partnered with Microsoft too?

http://store.steampowered.com/hwsurvey

Windows 10 64: 47.44% (+2.77%)
Windows 7 64: 28.74% (-1.29%)
Windows 8.1 64: 8.57% (-0.75%)


----------



## BiG StroOnZ

Quote:


> Originally Posted by *DracoNB*
> 
> Is Steam partnered with Microsoft too?
> 
> http://store.steampowered.com/hwsurvey
> 
> Windows 10 64: 47.44% (+2.77%)
> Windows 7 64: 28.74% (-1.29%)
> Windows 8.1 64: 8.57% (-0.75%)


How many of those are people who installed or upgraded to Windows 10, went onto Steam, happened to have the survey pop up, and then reverted back to 8.1 or 7 shortly after? I've seen the Steam hardware survey maybe once in my entire time on Steam, and I've upgraded my hardware multiple times since I've been on Steam.

And here it was a year ago:



Did you bother reading anything else I wrote, or were you just worried about that one particular sentence? I made many suggestions as to why Windows 10 would be gaining market share, and it wasn't because people like Windows 10 or wanted to use it over another OS. It was because they were forced to. Backed into a corner. They had no choice. Quite different.

Especially gamers: I've pointed out why a gamer would bite their tongue and just do the upgrade. Want to play Forza _finally_ on the PC? Gotta have Windows 10. Are you an AMD user who wants more performance in certain titles? Gotta have Windows 10. And all these new PC builders switching over from consoles, who are not familiar with anything that has transpired over the last year or so regarding Windows 10, are most likely starting their builds with Windows 10. It's not their fault they don't know any better, and it's not like they had a choice if they wanted to play all the modern titles they expected to play on their brand new build. Someone coming from a console is definitely going to want to play Microsoft exclusives, so automatically they are going to have to use Windows 10 if they want to keep playing their console games on PC.


----------



## daviejams

Windows 10 is good, the best operating system I've ever used. I would never go back to W7.


----------



## ToTheSun!

Quote:


> Originally Posted by *daviejams*
> 
> Windows 10 is good, the best operating system I've ever used. I would never go back to W7.


Dude, you're wrong. Don't you see the overwhelming evidence? Windows 10 is a failure.


----------



## vodkapl

Quote:


> Originally Posted by *moustang*
> 
> *What's even more telling, though, is WHO isn't adopting Windows 10 yet. It's not gamers, it's businesses.* Globally, less than 15% of businesses have adopted Windows 10, which means the vast majority of the Windows 10 userbase are home users. Since businesses typically don't allow gaming during work, I hardly think their slow adoption is relevant to this subject.


According to Steam, which DracoNB has mentioned, W10 has 47% of all gamers. Now, if you take into account what BiG StroOnZ said, and also the fact that Microsoft is really pushing W10 (nagware, deals with developers and companies, and so on) and there is no free option to downgrade or reinstall the previous OS, then you can't call 47% of gamers on Steam a great success. So no, according to Steam, gamers as a whole have not adopted W10.
Quote:


> Originally Posted by *BiG StroOnZ*
> 
> That's what I will be doing, for sure: playing my part. But the problem is that most gamers will get worried that they can't play a new Microsoft exclusive title (like Forza, for instance) and upgrade to Win 10 just so they can play the game, regardless of whether they hate Windows 10 or any of the Win10 antics. Then of course AMD users will be like, "Well, I'm missing out on potential performance so I HAVE TO UPGRADE to Win 10!" (see what AMD and Microsoft did here...). This is a bit worrisome, but hopefully a good portion of the community stays strong. Just because you are missing out on one game does not mean there isn't another out there that will satisfy your craving.


When things are unclear, people will cave and upgrade or give in. If consumers are to free themselves from DX12's lock, we need to stand together. The effort must be coordinated, and much more...
There is strength in numbers, but if you are separate from the flock you will be killed. How do you know if there is a flock? And if you do know, how do you know its size? And whether its size is enough to bring about the change that you hope for?
Consumers must spread the word and explain why Vulkan, and software like it, should matter to us, and its benefits going forward.
Real effort is required:
1. Spread awareness through social media and elsewhere: make Vulkan fan art and animation videos, upload informative videos or make posts about DX12 and Vulkan, support Vulkan development, etc.
2. A common place for everyone participating to gather and discuss.

Label me insane, but I would participate if a group of consumers "fighting" to stop DX12's LOCK emerged. Sure, Vulkan could have disadvantages later down the road too, but it's a far better alternative than an API that doesn't help at all. There is "Vulkanmasterrace" on Reddit, which I follow, but it doesn't seem to be doing anything other than keeping up with Vulkan news.


----------



## HaiderGill

Just wondering, now that people have had time to play the game properly: is it up to the level of the original in story and gameplay? I have only played the Warren Spector original and Human Revolution. I won't mention the graphics of the original, as the graphics in the Unreal engine were crap. I thought Human Revolution was a very good game but not as good as the original; the story wasn't up to Warren's original and it wasn't as groundbreaking. Chaps, I'm asking for your opinion: where do you think this game lies? Better than HR? As good as the original?


----------



## huzzug

The story seems to be lacking, but the gameplay is great, per what some users and general reviewers have come to agree on.


----------



## boredgunner

Quote:


> Originally Posted by *HaiderGill*
> 
> Just wondering, now that people have had time to play the game properly: is it up to the level of the original in story and gameplay? I have only played the Warren Spector original and Human Revolution. I won't mention the graphics of the original, as the graphics in the Unreal engine were crap. I thought Human Revolution was a very good game but not as good as the original; the story wasn't up to Warren's original and it wasn't as groundbreaking. Chaps, I'm asking for your opinion: where do you think this game lies? Better than HR? As good as the original?


I've completed it. I also finished Human Revolution + Missing Link twice, and the original with Revision mod.

Gameplay is a universal improvement over Human Revolution, save for the mouse acceleration. It has some of the best dialogue quality of all the Deus Ex games, but the plot doesn't contain nearly as much material as the other two (especially the original), and it is lacking in exposition and overall coherency to some degree. You can see the consequences of Square Enix turning this game into a trilogy.


----------



## DracoNB

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> How many of those are people who installed or upgraded to Windows 10, went onto Steam, happened to have the survey pop up, and then reverted back to 8.1 or 7 shortly after? I've seen the Steam hardware survey maybe once in my entire time on Steam, and I've upgraded my hardware multiple times since I've been on Steam.
> 
> And here it was a year ago:
> 
> 
> 
> Did you bother reading anything else I wrote, or were you just worried about that one particular sentence? I made many suggestions as to why Windows 10 would be gaining market share, and it wasn't because people like Windows 10 or wanted to use it over another OS. It was because they were forced to. Backed into a corner. They had no choice. Quite different.
> 
> Especially gamers: I've pointed out why a gamer would bite their tongue and just do the upgrade. Want to play Forza _finally_ on the PC? Gotta have Windows 10. Are you an AMD user who wants more performance in certain titles? Gotta have Windows 10. And all these new PC builders switching over from consoles, who are not familiar with anything that has transpired over the last year or so regarding Windows 10, are most likely starting their builds with Windows 10. It's not their fault they don't know any better, and it's not like they had a choice if they wanted to play all the modern titles they expected to play on their brand new build. Someone coming from a console is definitely going to want to play Microsoft exclusives, so automatically they are going to have to use Windows 10 if they want to keep playing their console games on PC.


So it went up 13% in a month right after release, and you are trying to show that as a failure? They have ~50% of gamers on 10.

And as for your silly "upgraded, ran survey, went back to 7" theory: that probably happened to almost no one. Like you said, it's hard to get the survey to come up, so having it come up during the brief window in which the person was upgraded... highly unlikely.

Better performance on 10 sounds like a win for gamers to me. Not sure how you are trying to make these all negatives. 10 works great and I'd never go back to 7. 8.1 was much better than 7 as well, even on a desktop, let alone on hybrid laptops.

Quote:


> Originally Posted by *daviejams*
> 
> Windows 10 is good, the best operating system I've ever used. I would never go back to W7.


Ditto. I've been using it since release, and Win 8.1 before that. I even upgraded my work machine from 7 to 10 because I hate 7 compared to 10. It's much faster and works better for everything.


----------



## Assirra

Quote:


> Originally Posted by *ToTheSun!*
> 
> Dude, you're wrong. Don't you see the overwhelming evidence? Windows 10 is a failure.


What overwhelming evidence exactly?


----------



## deepor

Quote:


> Originally Posted by *Assirra*
> 
> What overwhelming evidence exactly?


That post was sarcasm.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *ToTheSun!*
> 
> Dude, you're wrong. Don't you see the overwhelming evidence? Windows 10 is a failure.


I'm assuming you didn't bother to check any of the twenty-one links I provided, so instead you use passive-aggressive sarcasm to cover up your inability to read the _actual_ overwhelming evidence.
Quote:


> Originally Posted by *vodkapl*
> 
> When things are unclear, people will cave and upgrade or give in. If consumers are to free themselves from DX12's lock, we need to stand together. The effort must be coordinated, and much more...
> There is strength in numbers, but if you are separate from the flock you will be killed. How do you know if there is a flock? And if you do know, how do you know its size? And whether its size is enough to bring about the change that you hope for?
> Consumers must spread the word and explain why Vulkan, and software like it, should matter to us, and its benefits going forward.
> Real effort is required:
> 1. Spread awareness through social media and elsewhere: make Vulkan fan art and animation videos, upload informative videos or make posts about DX12 and Vulkan, support Vulkan development, etc.
> 2. A common place for everyone participating to gather and discuss.
> 
> Label me insane, but I would participate if a group of consumers "fighting" to stop DX12's LOCK emerged. Sure, Vulkan could have disadvantages later down the road too, but it's a far better alternative than an API that doesn't help at all. There is "Vulkanmasterrace" on Reddit, which I follow, but it doesn't seem to be doing anything other than keeping up with Vulkan news.


I agree with a lot of what you said. I find it interesting that AMD enthusiasts like to frown upon bad business practices from NVIDIA or Intel and will do anything in their power to spread that type of information. Yet here we are in a situation where the exact same thing is taking place with Microsoft, but visit the PCMasterRace subreddit and it's pretty quiet over there regarding Microsoft's bad business practices with Windows 10. It's like everyone forgot what happened over the past year and said, "Well, if you can't beat 'em, join 'em," upgraded to Windows 10, and never looked back. Quite unfortunate, really. Basically a double standard: "GameWorks is evil, blah blah blah, but I get good performance in DX12 with my Radeon GPU, so it's perfectly fine that Microsoft is tracking every single thing I do." Yet if it was NVIDIA or Intel who made Windows 10, this would be a never-ending story until some change came out of it.

We had strength in numbers, but it isn't helping that titles are coming out in DX12 (even if many of the implementations have proven disastrous) and people want to test it out for themselves. However, Vulkan implementations have been much more positive than the numerous DX12 implementations we have seen so far, even if the number of Vulkan-supporting games is a lot smaller. It still shows that Vulkan looks a lot more promising than what we have seen with DX12 so far.

I'll tell you one thing: if these spotty implementations of DX12 continue throughout the next year and Vulkan implementations prove to be better, people are going to respond to it. They will find the flock and will know exactly where it is. It will be exactly the type of push we need to get people on the Vulkan bandwagon. You will definitely start seeing a shift if we keep having games with terrible DX12 support. People will be like, "What's the point of me using DX12 if it's worse than using DX11?" That can only happen so many times before people start to catch on. And with the majority of the dGPU market comprised of NVIDIA users, that seems like another likely outcome.
Quote:


> Originally Posted by *DracoNB*
> 
> So it went up 13% in a month right after release, and you are trying to show that as a failure? They have ~50% of gamers on 10.
> 
> And as for your silly "upgraded, ran survey, went back to 7" theory: that probably happened to almost no one. Like you said, it's hard to get the survey to come up, so having it come up during the brief window in which the person was upgraded... highly unlikely.
> 
> Better performance on 10 sounds like a win for gamers to me. Not sure how you are trying to make these all negatives. 10 works great and I'd never go back to 7. 8.1 was much better than 7 as well, even on a desktop, let alone on hybrid laptops.


How is 50% a success? When you score 50% on your papers at university, do you effervescently run through the campus announcing your success?

Actually, no, it is very likely that it has happened. The survey shows up very rarely, so if it shows up while the person is on Windows 10 and they later revert back to Windows 7 or 8, it's not like Steam is going to reflect that change at all.

It doesn't matter how much better it runs if it is basically malware and a violation of people's personal privacy. But let me guess, none of that matters to you, right? Give up your privacy for better performance. Seems logical. Have a corporation tracking your every move and using that information against you, then wonder why, after driving past a billboard you barely saw, you are suddenly craving a fast-food cheeseburger...








Quote:


> Originally Posted by *Assirra*
> 
> What overwhelming evidence exactly?


http://www.overclock.net/t/1601965/wccf-new-low-for-microsoft-removes-the-x-button-from-windows-10-upgrade-dialog
http://www.overclock.net/t/1583423/tt-windows-7-users-arent-budging-despite-free-windows-10-upgrade
http://www.overclock.net/t/1606538/techspot-france-gives-microsoft-three-months-to-stop-windows-10-from-collecting-excessive-user-data
http://www.overclock.net/t/1606541/extremetech-windows-10-gets-one-last-desperate-nagware-update
http://www.overclock.net/t/1604272/eurogamer-woman-awarded-10k-after-suing-microsoft-for-sneaky-windows-10-upgrade
http://www.overclock.net/t/1602317/newsweek-microsoft-s-malware-like-windows-10-automatic-upgrades-cost-anti-poaching-nonprofit-thousands-group-says
http://www.overclock.net/t/1599301/betanews-windows-10-ruins-a-pro-gaming-stream-with-a-badly-timed-update
http://www.overclock.net/t/1607212/pcworld-microsoft-faces-two-new-lawsuits-over-aggressive-windows-10-upgrade-tactics
http://www.overclock.net/t/1569318/ars-even-when-told-not-to-windows-10-just-can-t-stop-talking-to-microsoft
http://www.overclock.net/t/1575181/zdnet-microsoft-tries-to-clear-the-air-on-windows-10-privacy-furor
http://www.overclock.net/t/1579074/forbes-microsoft-admits-windows-10-automatic-spying-cannot-be-stopped
http://www.overclock.net/t/1581859/slashdot-windows-10-fall-update-uninstalls-desktop-software-without-informing-users
http://www.overclock.net/t/1589351/extremetech-microsoft-now-preloading-tripadvisor-bloatware-into-windows-10
http://www.overclock.net/t/1577076/zdnet-windows-10-upgrade-nags-become-more-aggressive-offer-no-opt-out
http://www.overclock.net/t/1578670/ars-tv-windows-10-will-be-automatically-downloaded-to-windows-7-and-windows-8-machines-next-year
http://www.overclock.net/t/1567525/rt-incredibly-intrusive-windows-10-spies-on-you-by-default
http://www.overclock.net/t/1566715/forbes-gamers-should-be-worried-about-windows-10-automatic-updates
http://www.overclock.net/t/1592002/the-hacker-news-windows-10-sends-your-data-5500-times-every-day-even-after-tweaking-privacy-settings
http://www.overclock.net/t/1568055/myce-windows-10-updates-can-disable-pirated-games-and-unauthorized-hardware
http://www.overclock.net/t/1567830/zh-the-surveillance-state-goes-mainstream-windows-10-is-watching-logging-everything
http://www.overclock.net/t/1565400/ars-windows-10-updates-to-be-automatic-and-mandatory-for-home-users


----------



## daviejams

Windows 10 is not a failure in any way, and moving forward most games (the AAA ones, anyway) will use DX12. If you want to play the latest games, upgrade to Win10.

It's that simple


----------



## ToTheSun!

Quote:


> Originally Posted by *daviejams*
> 
> Windows 10 is not a failure in any way, and moving forward most games (the AAA ones, anyway) will use DX12. If you want to play the latest games, upgrade to Win10.
> 
> It's that simple


Microsoft needs to release Windows 11 so that Windows 10 can be fine.


----------



## moustang

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Did you even bother to look at the public opinion?


Your cherry picked news stories ≠ public opinion.

How's this for public opinion:

http://www.pcworld.com/article/3078133/software-games/steams-may-survey-shows-its-users-prefer-windows-10.html

41.48% OF GAMERS are using Windows 10 on Steam. More than any other OS. This was a 2% increase in Windows 10 users IN A SINGLE MONTH.

Now that's public opinion at work.


----------



## HaiderGill

Quote:


> Originally Posted by *boredgunner*
> 
> I've completed it. I also finished Human Revolution + Missing Link twice, and the original with Revision mod.
> 
> Gameplay is a universal improvement over Human Revolution save for the mouse acceleration. It has some of the best dialogue quality of all the Deus Ex games, but the plot doesn't contain nearly as much material compared to the other two (especially the original), and it is lacking in exposition and overall coherency to some degree. You can see the consequences of Square Enix turning this game into a trilogy.


Hmmm, yes, I'm not too bothered about the length so much as the quality of the plot. That's what got me hooked on the first one...


----------



## vodkapl

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> I agree with a lot of what you said. I find it interesting that AMD Enthusiasts like to frown upon bad business practices from NVIDIA or Intel and will do anything they can in their power to spread this type of information. Yet, here we are in a situation where the same exact thing is taking place with Microsoft, but you visit PCMasterRace subreddit and it's pretty quiet over there with anything regarding Microsoft's bad business practices with Windows 10. It's like everyone forgot what happened over the past year and was just like, "Well if you can't beat 'em, join 'em." Upgraded to Windows 10 and never looked back. Quite unfortunate really. Basically a double standard. "Gameworks is evil, blah blah blah blah. But I get good performance in DX12 with my Radeon GPU so it's perfectly fine that Microsoft is tracking every single thing I do." Yet if it was NVIDIA who made Windows 10 or Intel who made Windows 10 this would be a never ending story until some change came out of it.
> 
> We had strength in numbers, but it isn't helping that there are titles coming out in DX12 (even if many of the implementations of it have been proven to be disastrous) and people want to test it out for themselves. However, in comparison to DX12 implementations, Vulkan's implementations have been much more positive compared to the numerous DX12 implementations we have seen so far. Even if the number of Vulkan supporting games is a lot smaller. It still shows that Vulkan looks a lot more promising than what we have seen with DX12 so far.
> 
> I'll tell you one thing: if these spotty DX12 implementations continue through the next year and Vulkan implementations prove better, people are going to respond. It will be exactly the push we need to get people on the Vulkan bandwagon. You will definitely start seeing a shift if we keep getting games with terrible DX12 support. People will ask, "what's the point of using DX12 if it's worse than DX11?" That can only happen so many times before people catch on. And with the majority of the dGPU market seemingly comprised of NVIDIA users, that seems like another likely outcome.


The problem is that consumers can't wait for DX12 to mess up. Vulkan is clearly the better API for PC. Right now it's lacking functionality, functionality that's being worked on, but that's far less of a problem than what DX12 lacks, which is compatibility across all platforms.
Every week that goes by is another week in which Vulkan loses its purpose (cross-platform). And if Windows 10 gains enough market share that most developers won't bother with Vulkan, it's game over. Right now 30%+ of users are not on W10, enough for many if not all devs to use Vulkan for games they plan on releasing on multiple platforms.
But that won't stay the same in a couple of years. It would be the first thing I would address if I were leading a group of consumers trying to help Vulkan succeed.

I think the best thing consumers can do to help Vulkan, or any API that works across all platforms, is to inform lots and lots of people. What makes Vulkan great? What makes DX12 good but not great? From there we can sit and watch what people do. No more can be done than that.


----------



## the9quad

Quote:


> Originally Posted by *vodkapl*
> 
> The problem is that consumers can't wait for DX12 to mess up. Vulkan is clearly the better API for PC. Right now it's lacking functionality, functionality that's being worked on, but that's far less of a problem than what DX12 lacks, which is compatibility across all platforms.
> Every week that goes by is another week in which Vulkan loses its purpose (cross-platform). And if Windows 10 gains enough market share that most developers won't bother with Vulkan, it's game over. Right now 30%+ of users are not on W10, enough for many if not all devs to use Vulkan for games they plan on releasing on multiple platforms.
> But that won't stay the same in a couple of years. It would be the first thing I would address if I were leading a group of consumers trying to help Vulkan succeed.
> 
> I think the best thing consumers can do to help Vulkan, or any API that works across all platforms, is to inform lots and lots of people. What makes Vulkan great? What makes DX12 good but not great? From there we can sit and watch what people do. No more can be done than that.


I think consumers will buy a game regardless of API, and we have little to no _direct_ involvement in what API a development studio decides to use. They will of course consider market share, but it will largely come down to resources vs expected gain. It's not like gamers are just _not_ going to buy a game because it uses DX12, so how exactly does consumer opinion factor into what API a dev studio chooses? Are they going to look at opinion polls?


----------



## Crazy9000

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Actually, no, it is very likely that it has happened. The survey shows up very rarely, so if it appears while someone is on Windows 10, and they later revert to Windows 7 or 8, Steam is not going to reflect that change at all.


If it shows up rarely, it's going to be less likely to record a temporary Windows 10 switch, not more.
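Crazy9000's intuition can be checked with a toy Monte Carlo. Everything here is an illustrative assumption (the per-month survey probability, the one-month Windows 10 trial, monthly sampling), not Steam's actual survey mechanics:

```python
import random

def simulate(trials=100_000, survey_rate=0.05, win10_months=1, total_months=12):
    """Toy model: a user runs Windows 10 for the first `win10_months`
    out of `total_months`, then reverts to Windows 7. Each month the
    survey fires independently with probability `survey_rate`, and a
    later response overwrites an earlier one. Returns the fraction of
    users whose last recorded response still says Windows 10."""
    stale = 0
    for _ in range(trials):
        last_os = None
        for month in range(total_months):
            if random.random() < survey_rate:
                last_os = "win10" if month < win10_months else "win7"
        if last_os == "win10":
            stale += 1
    return stale / trials
```

In this toy model a stale Windows 10 record requires the survey to fire during the brief Windows 10 window *and* never again afterwards, which happens for only a few percent of reverting users; making the survey rarer shrinks that figure further rather than inflating the Windows 10 numbers.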


----------



## BradleyW

Latest patch today has reduced DX12 stutter and DX12 load times, but it's still not on par with DX11 yet.


----------



## Snee

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> How is 50% a success? When you score 50% on a paper at university, do you run through campus announcing your success?
> 
> Actually, no, it is very likely that it has happened. The survey shows up very rarely, so if it appears while someone is on Windows 10, and they later revert to Windows 7 or 8, Steam is not going to reflect that change at all.


You can't fight progress and change in technology. Clinging to Windows 7 isn't going to stop DX12 or Windows 10 proliferation and adoption. Adoption will continue to rise, and more and more developers will use DX12. You seem to be moving the goalposts: 50 percent adoption within a year isn't a success? If so many people were upgrading only to revert, as you say, we would be seeing a decline in adoption and market share, when the data shows the opposite. So far I'm aware of only two Vulkan implementations: DOOM, which has been excellent, and The Talos Principle, which in my experience has not been great, with performance worse than DX11 and many rendering anomalies. DX12 has had a somewhat rocky start, but we have a much larger sample of games using it, and a lot of the problems have been ironed out through patches and drivers.


----------



## Paladin Goo

If what I read is true, they're working on multi-GPU support for DX12 as well. Excited for that. For whatever reason, it did wonders in Rise of the Tomb Raider for me: I would get drops into the 40s/50s maxed out on DX11 SLI, but when they enabled multi-GPU DX12, it smoothed out completely.


----------



## vodkapl

Quote:


> Originally Posted by *the9quad*
> 
> I think consumers will buy a game regardless of API


Yes, and I wouldn't tell anyone wishing for Vulkan to succeed not to buy DX12 games. Not buying DX12 games would be an extreme measure.
Quote:


> Originally Posted by *the9quad*
> 
> and we have little to no _direct_ involvement in what API a development studio decides to use.


That's also correct, but we can voice our views in hopes of influencing them to choose Vulkan, and maybe they will listen. Encouraging and asking devs to consider Vulkan is more effective than staying quiet about it.
Quote:


> Originally Posted by *the9quad*
> 
> They will of course consider market share, but it will largely come down to resources vs expected gain.


Right now the market share is good for Vulkan and most likely beneficial. If consumers can keep the market share from shifting significantly in Windows 10's favour, then we could once again influence devs to go for Vulkan where a multiplatform API is ideal.
Quote:


> Originally Posted by *the9quad*
> 
> so how exactly does consumer opinion factor into what API a dev studio chooses? Are they going to look at opinion polls?


It doesn't, if the dev studios don't care about their fans' input on the matter. But if they do care, it's possible.
Quote:


> Originally Posted by *Snee*
> 
> You can't fight progress and change in technology. Clinging to Windows 7 isn't going to stop DX12 or Windows 10 proliferation and adoption.


Mantle was a change in technology. Do you think Microsoft thought "We can't fight progress and change in technology" when it arrived? So why should consumers? Companies spend money advertising their products; consumers can promote Vulkan, and maybe that could contribute to its success. I'd say if enough consumers did so, it would contribute positively; the real question would then be how many...
DX12 isn't the only low-level API; there are also Metal and Vulkan.
Shall we forget about the progress and change that they represent? It's as if you think DX12 = ultimate progress simply because Microsoft has a monopoly on PC gaming.


----------



## the9quad

Quote:


> Originally Posted by *vodkapl*
> 
> Right now the market share is good for Vulkan and most likely beneficial. If consumers can maintain market share from significantly changing in Windows 10 favor, then we could once again influence devs to go for Vulkan where it's ideal with a multiplatform api.


It's inevitable that Windows 7 and 8 users are slowly going to shrink into nothing; I'm not sure what you mean by "If consumers can maintain market share from significantly changing in Windows 10 favor". That has already happened, and it isn't going to reverse, no matter how many die-hards on tech forums won't switch.

I understand where you are coming from, don't get me wrong. You make some good points, and if we lived in a perfect world things would go that way.


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> Latest patch today has reduced DX12 stutter and DX12 load times, but it's still not on par with DX11 yet.


Load times got crazy better. Still no stutter on my end though, so I am sticking with DX12.


----------



## vodkapl

Quote:


> Originally Posted by *the9quad*
> 
> It's inevitable that Windows 7 and 8 users are slowly going to shrink into nothing; I'm not sure what you mean by "If consumers can maintain market share from significantly changing in Windows 10 favor". That has already happened, and it isn't going to reverse, no matter how many die-hards on tech forums won't switch.
> 
> I understand where you are coming from, don't get me wrong. You make some good points, and if we lived in a perfect world things would go that way.


Well, there are still 30%+ of gamers who aren't on Windows 10, and I think that's enough to persuade devs to go Vulkan if their game is intended for multiple platforms. But that could shrink easily, as you said.
What I hoped my preaching for Vulkan might do is get people thinking about starting a consumer group to help Vulkan succeed, however one can as a consumer.
Valve and AMD are positive towards Vulkan. Hopefully developers in the industry realize Vulkan's significance and push it.

It will be interesting to see how DX12 vs Vulkan plays out in the coming years, though I am very pessimistic now.









EDIT: What I meant by "maintain the marketshare" is that I thought we as consumers could, with information about and promotion of Vulkan, persuade other consumers not to switch: people who don't want to change to W10 because of privacy issues or other reasons but feel drawn to do so because of gaming.
I changed to W10 because I've been a Windows user my whole life.
If I had known about Vulkan sooner, I would have waited.


----------



## Snee

Quote:


> Originally Posted by *vodkapl*
> 
> Mantle was a change in technology. Do you think Microsoft thought "We can't fight progress and change in technology" when it arrived? So why should consumers? Companies spend money advertising their products; consumers can promote Vulkan, and maybe that could contribute to its success. I'd say if enough consumers did so, it would contribute positively; the real question would then be how many...
> DX12 isn't the only low-level API; there are also Metal and Vulkan.
> Shall we forget about the progress and change that they represent? It's as if you think DX12 = ultimate progress simply because Microsoft has a monopoly on PC gaming.


I don't disagree with what you're saying. I want Vulkan to succeed, but thinking that it will usurp DirectX 12 on PC is a lofty goal, and not one I see being achieved just because a small percentage of users are sticking with a soon-to-be-deprecated operating system like Windows 7. Microsoft's response to Mantle (I think it's unclear whether they had DX12 in mind pre-Mantle) was to adapt their technology and _change_. This is what I meant by "you can't stop progress": technology is ever changing, and sticking your head in the sand won't help. I think it ultimately rests on developers' shoulders whether Vulkan is a viable option. Looking at what happened with OpenGL vs DirectX, one has to wonder if history will repeat itself. At least we know developers like id Software, and those who are Linux-friendly, will likely support Vulkan.


----------



## vodkapl

Quote:


> Originally Posted by *Snee*
> 
> I don't disagree with what you're saying. I want Vulkan to succeed, but thinking that it will usurp DirectX 12 on PC is a lofty goal, and not one I see being achieved just because a small percentage of users are sticking with a soon-to-be-deprecated operating system like Windows 7. Microsoft's response to Mantle (I think it's unclear whether they had DX12 in mind pre-Mantle) was to adapt their technology and _change_. This is what I meant by "you can't stop progress": technology is ever changing, and sticking your head in the sand won't help. I think it ultimately rests on developers' shoulders whether Vulkan is a viable option. Looking at what happened with OpenGL vs DirectX, one has to wonder if history will repeat itself. At least we know developers like id Software, and those who are Linux-friendly, will likely support Vulkan.


I'd say that consumers organizing themselves to discuss Vulkan, and how we can help it flourish, is the opposite of sticking our heads in the sand. The reason I disagree with your statement is simply that Vulkan exists.
Microsoft isn't standing around waiting to see whether Vulkan will dethrone DX12; they are actively promoting W10 and DX12. Who knows what deals they have with developers concerning DX12, in light of the recent news about Lenovo's deal with Microsoft.


----------



## Snee

Quote:


> Originally Posted by *vodkapl*
> 
> I'd say that consumers organizing themselves to discuss Vulkan, and how we can help it flourish, is the opposite of sticking our heads in the sand. The reason I disagree with your statement is simply that Vulkan exists.
> Microsoft isn't standing around waiting to see whether Vulkan will dethrone DX12; they are actively promoting W10 and DX12. Who knows what deals they have with developers concerning DX12, in light of the recent news about Lenovo's deal with Microsoft.


There is probably some of that going on. Undoubtedly that is the case with games that are Xbox exclusive on consoles and making their way onto PC with the Windows 10/DX12-only requirement. Valve supporting Vulkan might make a big splash, as might Vulkan powering VR applications if those happen to take off.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *moustang*
> 
> Your cherry picked news stories ≠ public opinion.
> 
> How's this for public opinion:
> 
> http://www.pcworld.com/article/3078133/software-games/steams-may-survey-shows-its-users-prefer-windows-10.html
> 
> 41.48% OF GAMERS are using Windows 10 on Steam. More than any other OS. This was a 2% increase in Windows 10 users IN A SINGLE MONTH.
> 
> Now that's public opinion at work.


You obviously don't understand the meaning of "cherry-picked." I simply searched "Windows 10" in the search bar, and that's what came up for news articles. It was quite simple, really, and took no effort whatsoever; hardly cherry-picking. And given that quantity of news articles, again, you do not understand the definition of "cherry-picked." The only thing that took effort was reading through all the articles.

Regardless, as I said to others, read through the news articles I posted or don't. Look at how ACTUAL people feel about it, instead of basing your opinion solely on statistics that may or may not be accurate (this comes to light when you actually read people's responses in those threads).
Quote:


> Originally Posted by *vodkapl*
> 
> The problem is that consumers can't wait for DX12 to mess up. Vulkan is clearly the better API for PC. Right now it's lacking functionality, functionality that's being worked on, but that's far less of a problem than what DX12 lacks, which is compatibility across all platforms.
> Every week that goes by is another week in which Vulkan loses its purpose (cross-platform). And if Windows 10 gains enough market share that most developers won't bother with Vulkan, it's game over. Right now 30%+ of users are not on W10, enough for many if not all devs to use Vulkan for games they plan on releasing on multiple platforms.
> But that won't stay the same in a couple of years. It would be the first thing I would address if I were leading a group of consumers trying to help Vulkan succeed.
> 
> I think the best thing consumers can do to help Vulkan, or any API that works across all platforms, is to inform lots and lots of people. What makes Vulkan great? What makes DX12 good but not great? From there we can sit and watch what people do. No more can be done than that.


I only recently started thinking about this whole ordeal and have begun offering my opinion on the future of gaming. It was only recently that I realized that Microsoft succeeding yet again in holding a monopoly on APIs, with Vulkan failing, could be detrimental to the industry. Hopefully more people come to this realization, because honestly, we have been on DX11 for what, six years? Maybe two more years of DX11 games, and then it's all DX12 for another six years until Vulkan slowly fades away like OpenGL did. We really cannot let that happen again. People need to realize that they MUST ALWAYS have a choice. With anything in life you must have a choice; if you don't, that is a problem.

If you do not have a choice in your games of which API to use and which company to support, that goes against everything people expect out of PC gaming. We have so many choices in the PC industry; why not with our APIs?

I'm working on getting a camera to start making YouTube videos, and I plan on touching on many topics when I do, but this is one I will definitely be talking about because I feel it is pretty important.
Quote:


> Originally Posted by *Crazy9000*
> 
> If it shows up rarely, it's going to be less likely to record a temporary Windows 10 switch, not more.


How do you figure that? If you upgrade to Windows 10 and the Steam survey comes up, you fill it out; then, a few months down the road, you find yourself not liking Windows 10 and revert back to 7 or 8. How is Steam going to show this at all? It won't. Another case: say you keep Windows 10 installed but decide to dual-boot with another OS, even though you filled out the survey while you were on Windows 10. You might have planned to stick with Windows 10 originally but changed your mind later. Is Steam going to reflect in its statistics that the individual is now dual-booting, with Windows 10 used only for games while Windows 7 or 8 serves as the daily driver? Again, Steam would not reflect this.
Quote:


> Originally Posted by *Snee*
> 
> You can't fight progress and change in technology. Clinging to Windows 7 isn't going to stop DX12 or Windows 10 proliferation and adoption. Adoption will continue to rise, and more and more developers will use DX12. You seem to be moving the goalposts: 50 percent adoption within a year isn't a success? If so many people were upgrading only to revert, as you say, we would be seeing a decline in adoption and market share, when the data shows the opposite. So far I'm aware of only two Vulkan implementations: DOOM, which has been excellent, and The Talos Principle, which in my experience has not been great, with performance worse than DX11 and many rendering anomalies. DX12 has had a somewhat rocky start, but we have a much larger sample of games using it, and a lot of the problems have been ironed out through patches and drivers.


People can be using Windows 10 only for games and dual-booting another OS as their daily driver. That seems more plausible after you read through all those threads (which I doubt you did) and see the public opinion on the operating system. Everyone who has disagreed with me so far has clearly not taken the time to read through those threads as I have, or been part of them. Until you do that, you have no basis for claiming I'm moving the goalposts, because you are not even taking the time to see my perspective on the subject (a common problem today: the inability to put yourself in someone else's shoes). And I think you are confusing logical deduction with moving the goalposts; they are two completely different things.









Another thing, since you want to rely on numbers: explain to me how one website shows 47% market share for Windows 10 while another shows 47% for Windows 7. Realistically, I will have good faith that you can think for yourself and realize that doesn't make much sense.

Also, how is locking an API down to a single operating system progress? Sure, DX12 itself, and the concept behind it, is progress, but locking it to a single OS is not. That goes against everything PC builders expect out of PCs; that mentality is usually reserved for consoles, and locking anything down is something the majority of us do not agree with. So explain to me how the majority of people are interested in competition and choice, yet at the same time they all adopted Windows 10, with nearly 50% of the market doing so? Again, as I said earlier, read through the links and see the responses. After reading all of that, I'm sure you will realize, as I did, that something is not adding up. And even if the numbers are somehow accurate, and my belief that they aren't is erroneous, explain to me where all the people who were against Windows 10 went.


----------



## huzzug

I do not think the adoption of Win10 is going to stop even if we get a consumer forum for Vulkan going. Sure, if consumers stayed with Win7/8.1 they'd make a difference, but how much of a difference when hip and trendy is the new "it" thing? Everyone and their mothers want something new and bling, and consumers being sheep doesn't help with what one wants to achieve either. What needs to happen is to get actual developers on the forum. After all, they're the ones making games, and they will always choose whatever makes the most sense in terms of coding, resource management, savings in man-hours, and the corresponding performance from the API in question. Even if Win7 continues to be a dominant platform, devs would still choose DX11 with a DX12 wrapper for those on Win10 rather than Vulkan with a DX12 wrapper, or no DX12 at all.
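The backend decision huzzug describes can be sketched as plain selection logic. This is purely illustrative, under assumed inputs (`os_name`, `os_version`, `vulkan_driver_present` are hypothetical parameters), and not any real engine's code:

```python
def pick_backend(os_name, os_version, vulkan_driver_present):
    """Illustrative sketch of the choice huzzug describes: ship DX11
    everywhere on Windows, layer a DX12 path on top for Windows 10,
    and treat Vulkan as an option only where a driver is present."""
    if os_name == "Windows":
        if os_version >= 10:
            return ["dx11", "dx12"]   # DX11 base + DX12 wrapper on Win10
        return ["dx11"]               # Win7/8.1: DX11 only, no DX12
    # Non-Windows platforms: Vulkan if the driver exists, else OpenGL
    return ["vulkan"] if vulkan_driver_present else ["opengl"]
```

The point of the sketch is that under this logic Vulkan never ships on Windows at all, which is exactly the outcome the Vulkan advocates in this thread are worried about.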


----------



## deepor

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> [...] Look at how ACTUAL people feel about it, instead of basing your opinion solely on statistics that may or not be accurate (this comes to light when you actually see people's responses in those threads).
> 
> [...]


You guys are simply both right! There are people who don't like Windows 10 and would, for example, prefer Windows 7. That doesn't mean they won't use Windows 10. People still upgrade because they feel there's no choice, and so both things can be true: the majority can dislike using Windows 10 over 7, and Windows 10 can still be the most used OS for gaming. Then you just have to wait a while; people will forget, and using Windows 10 will simply feel like the normal thing to do.


----------



## tpi2007

Even though AMD created Mantle, which then led to Vulkan, they got in bed with Microsoft for DX12 and Windows 10 because they can't afford not to. Otherwise this Deus Ex game would have Vulkan support and showcase AMD's performance achievements on a much bigger OS share. So that lost money must be coming from Microsoft's pockets.

Also, if they could afford to promote Vulkan more, the new Battlefield 1 would have Vulkan support instead of DX12, but so far it seemingly doesn't and won't. You would think that if Battlefield 4 and Battlefield Hardline support Mantle, the next major game in the franchise would support its natural successor, Vulkan, but apparently it won't; instead EA chose to spend its resources on a Microsoft partnership for DX12. I searched for any news about Battlefield 1 having Vulkan support and found nothing. DX12 support, on the other hand, seems to be there.

Also, on the bigger question: http://gamingbolt.com/id-software-dev-puzzled-by-devs-choosing-dx12-over-vulkan-claims-xbox-one-dx12-is-different-than-pc
Quote:


> and this is something that Axel Gneiting, engine programmer at id Software, doesn't necessarily approve of.
> 
> Speaking on Twitter, Gneiting said that developers using DirectX 12 over Vulkan 'literally makes no sense.' Elaborating on his stance, and in response to some questions, Gneiting pointed out that with Windows 7 forming a major chunk of the PC gaming market, and with DirectX 12 being incompatible with Windows 7, using DirectX in an attempt to have 'one codebase' makes no sense, since developers would need to create two separate ones anyway. He pointed out that the argument that programming for Xbox One and Windows 10 becomes easier by using DirectX 12 is moot too, because DirectX 12 on Windows and on Xbox is very different, necessitating two separate code paths anyway.
> 
> He also made some observations about how a lot of the perceived benefits of DirectX 12 are not exclusive to it, noting that both Vulkan and DirectX give similar performance benefits anyway.


Quote:


> Originally Posted by *BiG StroOnZ*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *moustang*
> 
> Your cherry picked news stories ≠ public opinion.
> 
> How's this for public opinion:
> 
> http://www.pcworld.com/article/3078133/software-games/steams-may-survey-shows-its-users-prefer-windows-10.html
> 
> 41.48% OF GAMERS are using Windows 10 on Steam. More than any other OS. This was a 2% increase in Windows 10 users IN A SINGLE MONTH.
> 
> Now that's public opinion at work.
> 
> 
> 
> You obviously don't understand the meaning of "cherry-picked." I simply searched "Windows 10" in the search bar, and that's what came up for news articles. It was quite simple, really, and took no effort whatsoever; hardly cherry-picking. And given that quantity of news articles, again, you do not understand the definition of "cherry-picked." The only thing that took effort was reading through all the articles.
> 
> Regardless, as I said to others, read through the news articles that I posted or don't. Look at how ACTUAL people feel about it, instead of basing your opinion solely on statistics that may or may not be accurate (this comes to light when you actually see people's responses in those threads).
> Quote:
> 
> 
> 
> Originally Posted by *vodkapl*
> 
> The problem is that consumers can't wait for DX12 to mess up. Vulkan is clearly the better API for PC. Right now it's lacking functionality, functionality that's being worked on, but that's far less than what DX12 lacks, which is compatibility across all platforms.
> Every week that goes by is another week where Vulkan loses its purpose (cross platform). And if Windows 10 gets enough marketshare to the point where most developers won't bother with Vulkan, it's game over. Right now there are 30%+ who do not use W10. Enough for many if not all devs to use Vulkan for games that they plan on releasing on many platforms.
> But that won't stay the same in a couple of years. It would be the first thing I would address if I were leading a group of consumers to help Vulkan succeed.
> 
> I think best thing consumers can do to help Vulkan or any api that works across all platforms is to inform lots and lots of people. What makes Vulkan great? What makes DX12 good but not great? And from there we can sit and watch and see what people do. No more can be done than that.
> 
> Click to expand...
> 
> I only recently started thinking about this whole ordeal and offering my opinion on the future of gaming. It was only recently that I realized that having Microsoft succeed yet again in holding a monopoly on APIs, with Vulkan failing, could be detrimental to the industry. Hopefully more people come to this realization, because honestly we have been on DX11 for what, six years? Maybe two more years of DX11 games and then it's all DX12 for another six years, until Vulkan slowly fades away like OpenGL did. We really cannot have that happen again. People need to realize that they MUST ALWAYS have a choice. With anything in life you must have a choice. If you cannot have a choice, then that is a problem.
> 
> If you do not have a choice in your games to select which API you want to use and which company you want to support then this basically goes against everything that people expect out of PC gaming. We have so many choices in the PC Industry, why not with our APIs?
> 
> I'm working on getting a Camera to start making Youtube videos and I plan on touching upon many topics when I do, but this is one that I will definitely be talking about because I feel it is pretty important.
> Quote:
> 
> 
> 
> Originally Posted by *Crazy9000*
> 
> If it shows up rarely, it's going to be less likely to record a temporary windows 10 switch, not more.
> 
> Click to expand...
> 
> How do you figure that? If you upgrade to Windows 10 and the Steam survey comes up, you fill it out, and then a few months down the road you find yourself not liking Windows 10 and revert back to 7 or 8; how is Steam going to show this at all? It won't. Another thing: say you keep Windows 10 installed but decide to dual-boot with another OS, even though you filled out the Steam survey while you were on Windows 10. You might have planned to stick with Windows 10 originally but changed your mind later on. Is Steam going to reflect this in their statistics? That the individual is actually dual-booting now, with Windows 10 used only for games while Windows 7 or 8 is the daily driver? Again, Steam would not reflect this.
> Quote:
> 
> 
> 
> Originally Posted by *Snee*
> 
> You can't fight progress and change in technology. Clinging to Windows 7 isn't going to stop DX12 or Windows 10 proliferation and adoption. Adoption will continue to rise and more and more developers will continue to use DX12. You seem to be moving the goalpost, 50 percent adoption within a year isn't a success? If so many people are upgrading only to revert as you say, we would be seeing a decline in adoption and market share when the data shows the opposite. So far we've seen only two implementations of Vulkan that I'm aware of, DOOM which has been excellent and The Talos Principle, which from my experience has not been great, with performance being worse than DX11 and many rendering anomalies. DX12 has had a somewhat rocky start but we have a much larger sample of games using it and a lot of the problems have been ironed out through patches and drivers.
> 
> Click to expand...
> 
> People can be using Windows 10 only for games and dual-booting another OS as their daily driver. That seems more logical after you read through all those threads (which I doubt you did) and see the public opinion on the operating system. I mean, all the people who have so far disagreed with me have clearly not taken the time to read through all those threads as I have, or have not been a part of the threads themselves. So until you do that, you have no right claiming that I'm moving the goalpost, because you are not even taking the time to see the perspective that I have on the subject (a common problem today: the inability to put yourself in someone else's shoes). And I think you are confusing logical deduction with moving the goalpost. Two completely different things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another thing, since you want to rely on numbers: explain to me how one website shows 47% marketshare for Windows 10 but another website shows 47% marketshare for Windows 7. Now, realistically, I will have good faith that you can think for yourself and realize that doesn't make much sense.
> 
> Also, how is locking down an API to a single operating system progress? Sure, DX12 itself and the concept behind it are progress, but locking it down to a single operating system is not. This goes against everything that PC builders expect out of PCs. That type of mentality is usually reserved for consoles. Something that the majority of us do not agree with (locking anything down). So explain to me how the majority of people are interested in competition and choice, yet at the same time they all adopted Windows 10, and even better, nearly 50% of the market did? Again, as I said earlier, read through the links and see the responses. After reading all of that, I'm sure you will realize, as I did, that something is not adding up or making sense. And even if the numbers are somehow accurate, and my belief that they aren't is erroneous, explain to me where all the people who were against Windows 10 went?
Click to expand...

Fresh news: http://arstechnica.com/information-technology/2016/09/consumer-group-microsoft-should-compensate-unhappy-windows-10-upgraders
Quote:


> British consumer rights group Which? isn't very happy with the way that Microsoft has handled Windows 10. The group says that Microsoft should compensate Windows 10 users for when the upgrade caused downtime due to software or hardware incompatibility, and it needs to do more to ensure that Windows users are aware of the customer support options that are available to them.
> 
> This comes after a June survey of Windows users showed that 12 percent of upgraders reverted back to Windows 7 or 8.1, with a majority of those downgraders saying that the upgrade adversely affected their PC.
> 
> The group cited a laundry list of complaints about the upgrade, with most of the complaints boiling down to compatibility issues. Hardware compatibility was a particular problem, with devices such as printers and Wi-Fi adapters ceasing to function after installing Windows 10. This points at one of the more unsatisfactory aspects of the upgrade. Although in principle the upgrade should have verified that there were no compatibility issues (both hardware and software), there are numerous reports that this didn't work in practice, with the upgrade being pushed to machines that were then left partially inoperable as a result.


Quote:


> Which? also reported that the Windows 10 upgrade would uninstall some software, including unspecified Norton software and Microsoft's own Office 2010. The group said that this was a problem if users had lost their original product keys or install media, as it could leave them with no alternative but to repurchase the software.
> 
> Perhaps worst of all, some users say that they had to pay for third-party repairs to get their systems operational again after the upgrade.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *deepor*
> 
> You guys are simply both right! There are people that don't like Windows 10 and would, for example, like Windows 7 better. This does not mean that they won't use Windows 10. People still upgrade as they feel there's no choice, and so both things can be true. The majority can dislike using Windows 10 over 7, and Windows 10 can still be the most used OS for gaming. Then afterwards you just have to wait for a while, and people will forget, and using Windows 10 will just feel like the normal thing to do.


Well that's one of the reasons why I suggested there might be a lot of dual-booters too.
Quote:


> Originally Posted by *tpi2007*
> 
> Fresh news: http://arstechnica.com/information-technology/2016/09/consumer-group-microsoft-should-compensate-unhappy-windows-10-upgraders
> Quote:
> 
> 
> 
> British consumer rights group Which? isn't very happy with the way that Microsoft has handled Windows 10. The group says that Microsoft should compensate Windows 10 users for when the upgrade caused downtime due to software or hardware incompatibility, and it needs to do more to ensure that Windows users are aware of the customer support options that are available to them.
> 
> This comes after a June survey of Windows users showed that 12 percent of upgraders reverted back to Windows 7 or 8.1, with a majority of those downgraders saying that the upgrade adversely affected their PC.
> 
> The group cited a laundry list of complaints about the upgrade, with most of the complaints boiling down to compatibility issues. Hardware compatibility was a particular problem, with devices such as printers and Wi-Fi adapters ceasing to function after installing Windows 10. This points at one of the more unsatisfactory aspects of the upgrade. Although in principle the upgrade should have verified that there were no compatibility issues (both hardware and software), there are numerous reports that this didn't work in practice, with the upgrade being pushed to machines that were then left partially inoperable as a result.
> Quote:
> Which? also reported that the Windows 10 upgrade would uninstall some software, including unspecified Norton software and Microsoft's own Office 2010. The group said that this was a problem if users had lost their original product keys or install media, as it could leave them with no alternative but to repurchase the software.
> 
> Perhaps worst of all, some users say that they had to pay for third-party repairs to get their systems operational again after the upgrade.
> 
> 
> 
> Also, on the bigger question: http://gamingbolt.com/id-software-dev-puzzled-by-devs-choosing-dx12-over-vulkan-claims-xbox-one-dx12-is-different-than-pc
Click to expand...

Well, it's a survey based on 5,500 people. It was installed on 270 *million* devices back in April. But having a consumer rights group calling for compensation is definitely not good publicity for the product at all and further proves that people are unhappy. Especially when the group "cited a _laundry list_ of complaints about the upgrade."

http://www.zdnet.com/article/windows-10-is-on-track-for-a-billion-users-but-do-independent-stats-confirm-microsofts-numbers/

But I still have to read through the second article; maybe it should be posted in the news section.



Basically sums it up.


----------



## tpi2007

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Well, it's a survey based on 5500 people. It was installed on 270 *million* devices back in April.
> 
> http://www.zdnet.com/article/windows-10-is-on-track-for-a-billion-users-but-do-independent-stats-confirm-microsofts-numbers/
> 
> But I have to read through the second article, maybe it should be posted in news section.
> 
> 
> 
> Basically sums it up.


Well, surveys are just like that: it's not possible to have a 100% sample, so you go for something that can be statistically relevant. From all the links you provided and the articles written over the past months, it seems about right and predictable that this would happen.
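To put some numbers behind "statistically relevant": a quick back-of-the-envelope calculation (assuming simple random sampling at 95% confidence, which the Which? survey may or may not satisfy) shows that a 5,500-person sample, tiny next to ~270 million installs, still has a worst-case margin of error of only about ±1.3 percentage points.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% confidence margin of error for a proportion p from n samples.

    p = 0.5 gives the worst case; z = 1.96 is the 95% normal quantile.
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(5500)
print(f"+/-{moe * 100:.1f} percentage points")  # +/-1.3 percentage points
```

Notably, the margin depends on the sample size, not on the population size, which is why a few thousand respondents can say something meaningful about hundreds of millions of devices.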

Note that the story on the link you provided in the post above is no longer true, Microsoft has already publicly admitted that it won't reach 1 billion devices running Windows 10 in the time frame they originally set out. Also interesting is that following the 29 July "free" upgrade deadline they still haven't publicly updated the upgrade numbers.

On another note but related to statistics, people should read Steam's hardware statistics fully and in context. What do I mean? A large part of gamers on Steam are there to play Dota 2, Team Fortress 2 and CS:GO. The first two are free to play and the latter is cheap and can be had for even cheaper on a sale.

These three occupy the top three slots in the Steam player stats, with Dota 2 and CS:GO posting peak player counts that are on average at least 10 times higher than anything else on the list.

Taken just now:



These are games that are DX9 based (TF2's minimum requirements actually say it can even run on DX 8.1!) and have very low system requirements, which matches up with the still-high share of dual-core CPUs, roughly even with quad-cores: 2 CPUs at 46.87% and 4 CPUs at 46.48% as of August. The GPUs in use are still mostly budget and entry-level ones; with the exception of the GTX 970 at number one, the rest of the GPUs in the first twenty are all budget cards, which matches up with 1 GB being the dominant amount of VRAM.

So let's not pretend that Steam is the holy grail of statistics, because first and foremost it tells you what Steam users have, and a big part of that has to do with the platform's mainstays of Dota 2, CS:GO and TF2. It is what it is and should be read in context.

Many of the people there upgraded to Windows 10 because they could, not because they had to, and surely not to take advantage of DX 12. Arrandale, Sandy Bridge and Ivy Bridge based graphics:


- Intel HD Graphics 4000: 2.97%
- Intel HD Graphics 3000: 2.00%
- Intel Valleyview Baytrail: 1.64%
- Intel Ivy Bridge: 1.37%
- Intel HD Graphics 2000: 0.92%
- Intel Sandy Bridge: 0.75%
- Intel Ironlake Arrandale: 0.64%

Together these account for 10.29% of Steam users, which is more than twice the number of GTX 970 users (5.08%), and none of these Intel iGPUs support DX 12.
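The arithmetic in that claim checks out (the figures below are the quoted Steam survey numbers, not independently verified):

```python
# Quoted Steam survey shares for DX12-incapable Intel iGPUs (percent).
igpu_shares = {
    "HD Graphics 4000": 2.97, "HD Graphics 3000": 2.00,
    "Valleyview Baytrail": 1.64, "Ivy Bridge": 1.37,
    "HD Graphics 2000": 0.92, "Sandy Bridge": 0.75,
    "Ironlake Arrandale": 0.64,
}
total = round(sum(igpu_shares.values()), 2)
gtx970 = 5.08  # quoted GTX 970 share, percent
print(total, total / gtx970)  # 10.29, just over 2x the GTX 970 share
```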


----------



## DracoNB

Quote:


> Originally Posted by *tpi2007*
> 
> Even though AMD created Mantle which then led to Vulkan, they got in bed with Microsoft for DX 12 and Windows 10 because they can't afford not to. Otherwise this Deus Ex game would have Vulkan support and bolster AMD's performance achievements on a much bigger OS share. So, that lost money must be coming from Microsoft's pockets.
> 
> Also, if they could afford to promote Vulkan more, the new Battlefield 1 would have Vulkan support instead of DX 12, but up to now it seemingly doesn't and won't. It would seemingly make sense that if Battlefield 4 and Battlefiled Hardline support Mantle, the next major game in the franchise would support its natural progression, Vulkan, but apparently it won't, instead EA chose to spend its resources on a Microsoft partnership for DX 12. I searched for any news about Battlefield 1 having Vulkan support and found nothing. DX 12 support on the other hand, seems like it's there.


Or... maybe, just maybe, DX12 is the better-suited API for their needs? Not to mention the more direct translation to the XB1 version of the game.


----------



## vodkapl

Quote:


> Originally Posted by *DracoNB*
> 
> Or... maybe, just maybe, DX12 is the better-suited API for their needs? Not to mention the more direct translation to the XB1 version of the game.


Possibly this, plus Vulkan wasn't good enough at the start of the game's development, and Microsoft showed them a shiny penny.


----------



## tpi2007

Quote:


> Originally Posted by *DracoNB*
> 
> Or... maybe, just maybe, DX12 is the better-suited API for their needs? Not to mention the more direct translation to the XB1 version of the game.


With their "needs" being the financial gains to be had from Microsoft incentives, not API needs. There is nothing that DX 12 can do that Vulkan can't, and besides, using Vulkan guarantees you a much wider install base.

Also:

http://gamingbolt.com/id-software-dev-puzzled-by-devs-choosing-dx12-over-vulkan-claims-xbox-one-dx12-is-different-than-pc



It's not that direct. The Xbox One SoC has specifics that a PC doesn't, like that special ESRAM.


----------



## PhRe4k

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> http://www.overclock.net/t/1601965/wccf-new-low-for-microsoft-removes-the-x-button-from-windows-10-upgrade-dialog
> http://www.overclock.net/t/1583423/tt-windows-7-users-arent-budging-despite-free-windows-10-upgrade
> http://www.overclock.net/t/1606538/techspot-france-gives-microsoft-three-months-to-stop-windows-10-from-collecting-excessive-user-data
> http://www.overclock.net/t/1606541/extremetech-windows-10-gets-one-last-desperate-nagware-update
> http://www.overclock.net/t/1604272/eurogamer-woman-awarded-10k-after-suing-microsoft-for-sneaky-windows-10-upgrade
> http://www.overclock.net/t/1602317/newsweek-microsoft-s-malware-like-windows-10-automatic-upgrades-cost-anti-poaching-nonprofit-thousands-group-says
> http://www.overclock.net/t/1599301/betanews-windows-10-ruins-a-pro-gaming-stream-with-a-badly-timed-update
> http://www.overclock.net/t/1607212/pcworld-microsoft-faces-two-new-lawsuits-over-aggressive-windows-10-upgrade-tactics
> http://www.overclock.net/t/1569318/ars-even-when-told-not-to-windows-10-just-can-t-stop-talking-to-microsoft
> http://www.overclock.net/t/1575181/zdnet-microsoft-tries-to-clear-the-air-on-windows-10-privacy-furor
> http://www.overclock.net/t/1579074/forbes-microsoft-admits-windows-10-automatic-spying-cannot-be-stopped
> http://www.overclock.net/t/1581859/slashdot-windows-10-fall-update-uninstalls-desktop-software-without-informing-users
> http://www.overclock.net/t/1589351/extremetech-microsoft-now-preloading-tripadvisor-bloatware-into-windows-10
> http://www.overclock.net/t/1577076/zdnet-windows-10-upgrade-nags-become-more-aggressive-offer-no-opt-out
> http://www.overclock.net/t/1578670/ars-tv-windows-10-will-be-automatically-downloaded-to-windows-7-and-windows-8-machines-next-year
> http://www.overclock.net/t/1567525/rt-incredibly-intrusive-windows-10-spies-on-you-by-default
> http://www.overclock.net/t/1566715/forbes-gamers-should-be-worried-about-windows-10-automatic-updates
> http://www.overclock.net/t/1592002/the-hacker-news-windows-10-sends-your-data-5500-times-every-day-even-after-tweaking-privacy-settings
> http://www.overclock.net/t/1568055/myce-windows-10-updates-can-disable-pirated-games-and-unauthorized-hardware
> http://www.overclock.net/t/1567830/zh-the-surveillance-state-goes-mainstream-windows-10-is-watching-logging-everything
> http://www.overclock.net/t/1565400/ars-windows-10-updates-to-be-automatic-and-mandatory-for-home-users
> 
> Take a look at public opinion and decide for yourself.
> 
> 
> 
> 
> 
> 
> 
> 
> That's what I will be doing for sure: playing my part. But the problem is that most gamers will get worried that they can't play a new Microsoft exclusive title (like Forza, for instance) and upgrade to Win 10 just so they can play the game, regardless of whether they hate Windows 10 or any of the Win 10 antics. Then of course AMD users will be like, "Well, I'm missing out on potential performance so I HAVE TO UPGRADE to Win 10!" (see what AMD and Microsoft did here...). This is a bit worrisome, but hopefully a good portion of the community stays strong. Just because you are missing out on one game does not mean there isn't another out there that will satisfy your craving.
> I don't recall going from Windows XP directly to Windows 7. I remember there being an intermediary OS that came before it, one that was proven to be the reason people wanted to stay on XP. You seem to have forgotten about it, but it was called Windows Vista, and it was primarily why people stayed on XP. It really had nothing to do with DX11; it had to do with Windows Vista being atrocious.
> 
> This time, again, it has nothing to do with DX12. I have no problem with DX12; I have a problem with them locking DX12 down to a nonsense, rubbish, spyware-infested operating system. It's so bad that I had Spybot Anti-Beacon installed on my computer (a Windows 7 machine) because they added telemetry nonsense to Windows 7 through updates. Then a few updates slipped through the cracks (ones that I did not approve of and that were downloaded and installed automatically, even though updates were disabled), and Microsoft had the ability to cut off the internet connection on my computer because I had all the telemetry nonsense disabled. As soon as I deactivated Spybot Anti-Beacon my internet suddenly worked again. Digging deeper, I did indeed find that it was a matter of Windows 10 updates being installed onto my Windows 7 computer. I had to search for the individual executables that Microsoft had installed in the Windows folder and remove them by hand in order to get my internet working while keeping Anti-Beacon installed and enabled.
> 
> The fact that Microsoft has the power to disable your internet because you want to disable telemetry is a violation of personal rights. End of story. It's not like any of this data mining is being used for any good. Is there still violent crime? Is there still terrorism? Is there still evil? Yes. So please explain to me why I need to give up my rights when the spying and the data mining are NOT BEING USED to better the world around me.


This is why I game on Linux; sacrificing some of my games is preferable, IMO, to sacrificing my privacy.


----------



## DracoNB

Quote:


> Originally Posted by *tpi2007*
> 
> With their "needs" being financial gains to be had from Microsoft incentives, not API needs; there is nothing that DX 12 can do that Vulkan can't, and besides using Vulkan guarantees you a much wider install base.
> 
> Also:
> 
> http://gamingbolt.com/id-software-dev-puzzled-by-devs-choosing-dx12-over-vulkan-claims-xbox-one-dx12-is-different-than-pc
> 
> 
> 
> It's not that direct. The Xbox One SoC has specificities that a PC doesn't, like that special ESRAM.


It's much closer to DX12 than Vulkan would be.

DX12 offers multi-GPU (MGPU) support; Vulkan doesn't.

DX12 also covers a bunch more APIs and components than just graphics; Vulkan is graphics only.

There is a lot missing from Vulkan that is in DX12.

Can you please show a source of these financial gains paid to them from Microsoft?

And sorry, I can't take id's claims without assuming massive bias. They've always used OpenGL over DirectX even though DirectX is clearly superior. How much closer would Vulkan and DX11 be if they'd used DirectX instead of OpenGL?

DICE helped write Mantle, which then turned into Vulkan. If there were gains to be had over DX12, don't you think they would have used it?


----------



## vodkapl

Quote:


> Originally Posted by *DracoNB*
> 
> DICE helped write Mantle which then turned into Vulkan. If there was gains to be had over DX12 don't you think they would have used it?


DX12 was influenced by Mantle...
Furthermore, why would a company that has locked many gamers to its OS through its API decide to make the risky move of using a different API that's not its own? One major reason they don't use Vulkan, and didn't use Mantle, is that their OS share depends heavily on gamers.
If gaming flourishes on Linux to the same or a similar extent as on Windows, people will switch, and as more and more do, Linux marketshare will grow, which in turn will push money-driven devs to develop their programs for Linux as well.
Most likely virus creators will come too.
Then you'll see W10 marketshare drop, and Microsoft's revenue with it.

Speaking of gains, isn't cross-platform a major gain, or should we not consider that a gain because it goes against Microsoft's business?


----------



## the9quad

Even if Vulkan were a huge success, the number of gamers who would switch to Linux would still be minuscule. Windows is always going to be the dominant OS. In addition, people are eventually going to upgrade past the OS they are currently on; it is inevitable (those stuck on 7 used to be on XP, I bet). Those are the cold, hard, non-tinfoil-hat-wearing facts.


----------



## vodkapl

Quote:


> Originally Posted by *the9quad*
> 
> *Even if Vulkan were a huge success, the number of gamers who would switch to Linux would still be minuscule. Windows is always going to be the dominant OS.* In addition, people are eventually going to upgrade past the OS they are currently on; it is inevitable (those stuck on 7 used to be on XP, I bet). Those are the cold, hard, non-tinfoil-hat-wearing facts.


I strongly disagree, though it will take a substantial amount of time. That is, IF Vulkan becomes a huge success for consumers.
Many people will slowly but surely realize Linux gives them more benefits than Windows and make the switch. There are already many who are dual-booting because they don't like certain things about Windows, and those who aren't dual-booting but would switch to Linux 100% if X game or X number of games were available. They have already made up their minds, so that leaves those who are unaware or haven't thought about switching or about OS choice.


----------



## the9quad

Quote:


> Originally Posted by *vodkapl*
> 
> I strongly disagree though it will take substantial amount of time. That is IF Vulkan becomes a huge success for consumers.
> Many people will slowly but surely realize Linux gives them more benefits than Windows and make the switch. There are already many who are dual booting because they don't like certain things by Windows and those who aren't dual booting but would switch to Linux 100% if X game or X amount of games were available. They have already made up their mind so it's leaves those who are unaware or haven't thought about switching or OS choice.


You're welcome to that opinion, but based on past history Linux will never amount to a significant market share no matter what. I highly doubt a gaming API is going to change that. And those dual-booters you talk about are an extremely small fraction of the market. I think you overestimate the number of gamers who even think of Linux as an option at all. It's easy to overestimate when you spend time on forums like this. The problem is that people on these forums are a small fraction, and even within this small fraction Linux users are another small fraction of that. Among real-world gamers, Linux is so small it might as well not exist.

For example, I think the last Steam survey had Linux users at not even 1%: 0.83% to be exact. To put that in perspective, there are 3 times as many people still on 56k as there are Linux users. There are over twice as many still on 33.6 kbps... let that sink in for a moment. Heck, there are more people with only 1 GB of system RAM than people on Linux.


----------



## Potatolisk

If the games I play ran as well on Linux I would switch, or at least dual-boot.
Sure, it would take a long time for everyone to switch, but we would gain a good few percent of the market each year.


----------



## vodkapl

Quote:


> Originally Posted by *the9quad*
> 
> Your welcome to that opinion, but based on past history linux will never amount to a significant market share no matter what. I highly doubt a gaming API is going to change that. And those dual booters you talk about are an extremely small fraction of the market. I think you overestimate the amount of gamers who even think of linux as an option at all. Easy to over-estimate when you spend time on forums like this. Problem is people on these forums are a small fraction and even within this small fraction linux users are another small fraction of that. Among the real world gaming people, linux is so small it might as well not exist.
> 
> For example I think the last survey on steam had linux users at not even 1%......., 0.83% to be exact. There are 3 times as many people who are still on 56k than there are linux users to put that in perspective. There are over twice as many still on 33.6kbps.. let that sink in for a moment. Heck there are more people who have only 1GB system ram then people who are on Linux.


Perhaps you're right.


----------



## PhRe4k

Quote:


> Originally Posted by *vodkapl*
> 
> Perhaps you're right.


He may be right, but either way it makes no difference to me, at least. I don't like the direction companies like Google and Microsoft are headed, and in a way I prefer not having to deal with AAA games/devs/publishers. It's not like I've ever played every single game on Linux or Windows anyway (at the time of this post, there are *5,522* Steam Linux titles).


----------



## the9quad

Quote:


> Originally Posted by *PhRe4k*
> 
> He may be right, but either way it makes no difference to me at least. I dont like the direction companies like Google and Microsoft are headed, and in a way I prefer not having to deal with triple AAA games/devs/publishers. Not like i've ever play every single games on Linux or Windows anyway (at the time of this post, there are *5522* Steam linux titles)


There is no shame in that, man. I am not anti-Linux; I am just saying to be realistic about it.


----------



## PhRe4k

Quote:


> Originally Posted by *the9quad*
> 
> There is no shame in that man, I am not anti-linux. I am just saying to be realistic about it.


I hear you; many of us Linux guys get overly ambitious but end up getting disappointed. Linux still isn't 100 percent comparable to Windows as far as the game library goes, but it's much closer than it was even a few years ago, thanks to Steam and GOG.


----------



## Jay-Sharp

Using the latest DX12 patch on Nvidia, GPU performance is now mostly a match for DX11 (on a 1080). Sadly, CPU usage is still not ideal in dense, crowded areas, with DX11 scaling better across threads.

GPU-bound scenarios are obviously getting a boost on midrange cards, so to that extent DX12 is seemingly better.

As an Nvidia user, I don't know the extent to which their DX11 driver is doing the heavy lifting. It could be that Nvidia's DX11 implementation is just much better for what this game needs (lots of AI, open world, interactions).


----------



## DracoNB

Quote:


> Originally Posted by *vodkapl*
> 
> DX12 was influenced by Mantle...
> Furthermore why would a company, who has locked many gamers to their OS through their api, decide to make a risky move and use a different api that's not their own? One major reason they don't use Vulkan, or didn't use Mantle, is because again their OS share depends heavily on gamers.
> If gaming flourishes to same or close extent on Linux as Windows people will switch, and when more and more do it will increase Linux marketshare which will in turn make the devs who are money driven develop their programs for Linux as well.
> Most likely virus creators will come too.
> Then you'll see W10 marketshare drop and so will Microsoft's revenue as result of that.
> 
> Speaking of gains, isn't cross platform a major gain or should we consider that not a gain because it goes against Microsoft's business?


What are you talking about? I was asking why DICE, who developed Mantle, went with DX12 over Vulkan if Vulkan is superior. I also listed many features that DX12 has that Vulkan doesn't, but you ignored those.


----------



## PontiacGTX

Quote:


> Originally Posted by *DracoNB*
> 
> What are you talking about?? I was asking why DICE, who developed Mantle, went with DX12 over Vulkan if Vulkan is superior. I also listed many features that DX12 has that Vulkan doesn't but you ignored those.


Maybe it would be easier for them to work with fewer APIs, or with one that lets them target other platforms without big changes to the game? Or maybe because consoles are their priority.


----------



## sumitlian

Quote:


> Originally Posted by *DracoNB*
> 
> What are you talking about?? I was asking why DICE, who developed Mantle, went with DX12 over Vulkan if Vulkan is superior. I also listed many features that DX12 has that Vulkan doesn't but you ignored those.


Because Vulkan is closer to the metal of GCN than to the metal of Nvidia's architectures, why would DICE choose an API that won't benefit the majority of gamers, i.e. Nvidia owners? DICE prioritizes multiplayer games, and to gain market share they need the game to run as fast as possible on a wide range of hardware, Intel HD included.

DICE using DX12 over Vulkan doesn't make Vulkan any less good than DX12. Yes, in terms of complete application development DX12 seems more suitable, since it includes sound and input APIs/libraries as well. But a user a page ago already provided a link showing that very few games have been built with Microsoft DirectSound, which also shows that not many developers use Microsoft's sound API.

A user here also told us (judging from his past posts, he seems knowledgeable about graphics technology and programming):
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> With Vulkan you can go closer to the hardware than with DX12, and with SPIR-V and vendor-specific extensions there are more advantages to Vulkan. There is nothing to debate; there's so much information out there. Now we just need Vulkan games
Click to expand...

Vulkan provides vendor-specific support, meaning it can be tuned to run at optimal speed on a specific architecture. But again, it doesn't matter much in the real world: if the majority of users don't have the relevant hardware, it would be unwise to use such an API for an online game.


----------



## Newbie2009

Quote:


> Originally Posted by *sumitlian*
> 
> Because Vulkan is closer to the metal of GCN than to the metal of Nvidia's architectures, why would DICE choose an API that won't benefit the majority of gamers, i.e. Nvidia owners? DICE prioritizes multiplayer games, and to gain market share they need the game to run as fast as possible on a wide range of hardware, Intel HD included.
> 
> DICE using DX12 over Vulkan doesn't make Vulkan any less good than DX12. Yes, in terms of complete application development DX12 seems more suitable, since it includes sound and input APIs/libraries as well. But a user a page ago already provided a link showing that very few games have been built with Microsoft DirectSound, which also shows that not many developers use Microsoft's sound API.
> 
> A user here also told us (judging from his past posts, he seems knowledgeable about graphics technology and programming):
> *Vulkan provides vendor-specific support, meaning it can be tuned to run at optimal speed on a specific architecture. But again, it doesn't matter much in the real world: if the majority of users don't have the relevant hardware, it would be unwise to use such an API for an online game*.


Eh?


----------



## sumitlian

Quote:


> Originally Posted by *Newbie2009*
> 
> Eh?


eh ? what ?


----------



## DracoNB

Quote:


> Originally Posted by *sumitlian*
> 
> Because Vulkan is closer to the metal of GCN than to the metal of Nvidia's architectures, *why would DICE choose an API that won't benefit the majority of gamers, i.e. Nvidia owners?* DICE prioritizes multiplayer games, and to gain market share they need the game to run as fast as possible on a wide range of hardware, Intel HD included.
> 
> *DICE using DX12 over Vulkan doesn't make Vulkan any less good than DX12*. Yes, in terms of complete application development DX12 seems more suitable, since it includes sound and input APIs/libraries as well. But a user a page ago already provided a link showing that very few games have been built with Microsoft DirectSound, which also shows that not many developers use Microsoft's sound API.
> 
> A user here also told us (judging from his past posts, he seems knowledgeable about graphics technology and programming):
> *Vulkan provides vendor-specific support, meaning it can be tuned to run at optimal speed on a specific architecture. But again, it doesn't matter much in the real world: if the majority of users don't have the relevant hardware, it would be unwise to use such an API for an online game*.


You contradict yourself multiple times in your post. You say Vulkan and DX12 are equal, but also that DX12 is better for online multiplayer gaming.

You also say that Vulkan is closer to the metal for AMD only. What about NVAPI? What are your thoughts on it?


----------



## sumitlian

Quote:


> Originally Posted by *DracoNB*
> 
> You contradict yourself multiple times in your post. You say that Vulkan and DX12 are equal, but DX12 is better for online multiplayer gaming.
> 
> You are saying that Vulkan is more close to the metal for AMD only. What about NVAPI? What are your thoughts on it?


I am so thrilled by your interpretation and comprehension skill.
Yes man I contradicted myself multiple times. My bad, just ignore that post.


----------



## Paztak

So, is there any benchmark available where DX12 performance is tested after this new patch?


----------



## EightDee8D

Quote:


> Originally Posted by *sumitlian*
> 
> I am so thrilled by your interpretation and comprehension skill.
> Yes man I contradicted myself multiple times. My bad, just ignore that post.


You are not Enthusiast enough man.


----------



## Mahigan

Quote:


> Originally Posted by *daviejams*
> 
> Windows 10 is good , best operating system I've ever used. Would never go back to w7


I really like Windows 10. I don't have any issues with it. I think a lot of the hate can fit into one of two categories here...

1. It is riddled with adware (meaning it communicates with Microsoft).
2. Many nVIDIA users hate DX12.


----------



## ToTheSun!

Quote:


> Originally Posted by *Mahigan*
> 
> 2. Many nVIDIA users hate DX12.


Right? This one boggles the mind. Why would people dislike an API represented almost singly by games that run very poorly? Makes no sense to me.


----------



## daviejams

Quote:


> Originally Posted by *ToTheSun!*
> 
> Right? This one boggles the mind. Why would people dislike an API represented almost singly by games that run very poorly? Makes no sense to me.


Problems at launch seem to be a common theme with this API, but I'd imagine that's because it's new and difficult (compared to DX11). Time will resolve this as developers get better at using it and work out the kinks. Also, none of the DX12 games really "run very poorly".

This time next year nobody will be using DX11 (for AAA games), so you'd best get used to it.


----------



## ToTheSun!

Quote:


> Originally Posted by *daviejams*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> Right? This one boggles the mind. Why would people dislike an API represented almost singly by games that run very poorly? Makes no sense to me.
> 
> 
> 
> Problems at launch seem to be a common theme with this API, but I'd imagine that's because it's new and difficult (compared to DX11). Time will resolve this as developers get better at using it and work out the kinks. Also, none of the DX12 games really "run very poorly".
> 
> This time next year nobody will be using DX11 (for AAA games), so you'd best get used to it.
Click to expand...

I agree with you. I was just using sarcasm to address the bait.


----------



## deepor

Quote:


> Originally Posted by *Mahigan*
> 
> I really like Windows 10. I don't have any issues with it. I think a lot of the hate can fit into one of two categories here...
> 
> 1. It is riddled with adware (meaning it communicates with Microsoft).
> 2. Many nVIDIA users hate DX12.


For me, I have a Windows install that sometimes goes unused for several months, and from what I've seen, Windows 10 is torture for that. After it hasn't been used for a long time, I can't just hop on it for a bit anymore; that only works well if it's used daily. Otherwise Windows Update flips out for hours, across several restarts, before it calms down again. I don't remember older Windows versions being as annoying as Windows 10.

There's an "LTSB" version of Windows that would fix the problem, because it only ever installs security updates. The annoying part is that there's no way to get it without hacking activation and breaking the license agreement, which I don't want to do.


----------



## boredgunner

Quote:


> Originally Posted by *ToTheSun!*
> 
> Right? This one boggles the mind. Why would people dislike an API represented almost singly by games that run very poorly? Makes no sense to me.


They know nothing about DX12 itself; they go only by what they see. They see it run worse in games on their cards, and they blame the API.


----------



## Klocek001

It's the current state of the DX12 versions of these games that is dilapidated (I had to consult a dictionary to find a suitable word).
Frankly, saying that AMD has an advantage over Nvidia is true, but what does it matter when Nvidia in DX11 mode can match or outperform AMD in DX12?
I'm disappointed with Nvidia cards in DX12, especially the new Pascal ones, with negative performance compared to DX11. I still don't know who to blame for this performance loss: the developer or the card manufacturer themselves.


----------



## deepor

Quote:


> Originally Posted by *Klocek001*
> 
> It's the current state of the DX12 versions of these games that is dilapidated (I had to consult a dictionary to find a suitable word).
> Frankly, saying that AMD has an advantage over Nvidia is true, but what does it matter when Nvidia in DX11 mode can match or outperform AMD in DX12?
> I'm disappointed with Nvidia cards in DX12, especially the new Pascal ones, with negative performance compared to DX11. I still don't know who to blame for this performance loss: the developer or the card manufacturer themselves.


I have two ideas.

It could be that Nvidia's DX11 (and OpenGL) drivers are simply very good. With DX12 (and Vulkan), the split between how much of the work toward the final image is done by the game's developers versus the driver's developers changes: the driver's influence on performance is lower and the game's is higher.

Another thought is what kind of graphics cards the game developers have in their own machines, with DX11 being the main target they put the most work into. While developing, they are perhaps working on Nvidia cards, so they naturally fix every annoying performance problem they can find and know how to fix, and avoid doing anything that can't run well, all before the final optimization pass where they also look at other cards. That could make the Nvidia DX11 driver look very good and compete well with DX12.


----------



## tpi2007

Quote:


> Originally Posted by *DracoNB*
> 
> Its much closer to DX12 than Vulkan would be.
> 
> DX12 offers MGPU support, Vulkan does.
> 
> DX12 also has a bunch more APIs and parts than just graphics, vulkan is only graphics.
> 
> There is a lot missing from vulkan that is in DX12.
> 
> Can you please show a source of these financial gains paid to them from Microsoft?
> 
> And sorry I can't take ID's claims without massive bias. They've always used OpenGL over DirectX even though directx is clearly superior. How much closer would Vulkan and DX11 be if they'd used it instead of OpenGL?
> 
> DICE helped write Mantle which then turned into Vulkan. If there was gains to be had over DX12 don't you think they would have used it?


True, but the Xbox One's market share is much lower than the PS4's and smaller than the PC installed base. Microsoft simply stopped publishing sales figures for the system on October 26 of last year; between speculation and an EA executive talking too much, the Xbox One is thought to have sold around 18-19 million units, while Sony had sold 36 million as of January this year. Considering that both the PC and the PS4 can run OpenGL (and variants) and Vulkan, my reasoning makes perfect economic sense.

So you have to wonder why exactly they would pair with AMD in the beginning and then switch to an ecosystem that doesn't bring them any advantage in installed player base. There have to be monetary incentives from Microsoft, just as there were when Titanfall was released exclusively on the Xbox One:

Respawn: Exclusivity Decision Was out of Our Hands, We'd Have Loved to Be on PlayStation from the Beginning
Quote:


> It was hardly surprising, then, that Titanfall had a lot of hype leading up to its launch. One of the main talking points for a long time was the Xbox console exclusivity, which cut off PlayStation gamers from being able to play it.
> 
> In retrospect, Respawn's Art Director Joel Emslie and Chief Operating Officer Dusty Welch revealed in an interview with PlayStation LifeStyle that the decision was strictly between Electronic Arts and Microsoft, while they would have liked to be on PlayStation from the beginning.
> 
> (...)
> Quote:
> 
> 
> 
> That decision is one that was out of our hands - *the EA decision with Microsoft*. We would always like these to be multiplatform. The more consumers that can play 'em, the better. It's not about money, it's about the audience base, which is important. We would have loved to have been with PlayStation from the beginning. But I think Joel made a good point earlier, which is that Titanfall 2 is a really refined, robust, well-balanced game, so we're excited that the PlayStation audience, seeing it for the first time, is getting an incredible, highly-rated game.
Click to expand...

As to MGPU support, I believe you are missing a word in your post; the sentence is supposed to end with "Vulkan does _not_", right? I'll assume so. You're right, although from what I've read, not only has the Khronos Group announced that it's coming in the next update to the API, it's also feasible for developers to implement it on their own right now. Of course, having some of the heavy lifting done beforehand is preferable. But it's worth noting, on this very topic, that Deus Ex: Mankind Divided not only failed to leave DX12 beta status on September 19th as promised, it also still doesn't support DX12 mGPU configurations as of Patch 7, released on the 23rd. So in practice the difference is theoretical at this point; it seems like a complicated job in either case:



As to DX12 covering more than graphics: true, but you also have OpenCL, and things like Microsoft's DirectSound are now deprecated and little used, as other posters have already said. There are good alternatives in use.


----------



## DracoNB

Quote:


> Originally Posted by *tpi2007*
> 
> True, but the Xbox One market share is much lower compared to the PS4 and smaller than the PC installed base. Microsoft pure and simply stopped publishing sales figures for the system last year on October 26, with speculation and an EA executive talking too much saying that the Xbox One has sold around 18 - 19 million while Sony sold 36 million as of January this year. Considering that both the PC and the PS4 can run OpenGL and variants, and Vulkan, my reasoning makes perfect economic sense.


PS4 doesn't support Vulkan
Quote:


> As to MGPU support, I believe that you are missing a word in your post, the sentence is supposed to end with "Vulkan does _not_", right ? I'll assume so. You're right, although from what I've read not only has the Khronos group announced that it's the next update to the API coming, it's also feasible for developers to implement in on their own right now. Of course, having some of the heavy lifting done beforehand is preferable, but it's interesting to note that on this very same topic, not only hasn't Deus Ex: Mankind Divided not left DX 12 Beta status on the 19th of September like they said it would, it also does not yet support DX 12 MGPU configurations as of the last Patch 7 released on the 23rd, so in practice the difference is theoretical at this point, it seems like a complicated job in either case:
> 
> As to DX 12 having more than graphics, true, but it's also true that you have OpenCL and it's also true that things like Microsoft's DirectSound are now deprecated and not used much now as other posters have already said. There are good choices used.


Yes, Deus Ex: Mankind Divided doesn't support mGPU, but other DX12 titles do, including Rise of the Tomb Raider, so expect to see it in Deus Ex when they release the official DX12 patch, or shortly after. After all, they did the ROTTR PC port.


----------



## ZealotKi11er

Apart from UWP ports, what game has shipped with DX12? I think it's zero, which means DX12 isn't really here yet.


----------



## tpi2007

Quote:


> Originally Posted by *DracoNB*
> 
> PS4 doesn't support Vulkan
> 
> Yes Deus Ex Mankind Divided doesn't support MGPU, but other DX12 titles do including Rise of the Tomb Raider, so expect to see it in Deus Ex when they release the official DX12 patch or shortly after. Afterall they did the ROTTR PC port


Yes, I know, but there is nothing stopping them from supporting it, since they already use a custom OpenGL implementation; that's what I was hinting at. The Xbox One, on the other hand, doesn't support Vulkan and almost surely _won't_. That is the difference.

Deus Ex was supposed to be released with DX12 support from the beginning. Then DX12 was delayed, and when it arrived it wasn't in its final form; that was supposed to come on September 19. It's now 11 days past that, and the game's DX12 support is still in beta with no mGPU support. At some point it becomes mostly irrelevant to the market whether it gets it or not, because most fans have already played the game; at that point it's just homework for the next instalment/DLC. Meanwhile, mGPU support for Vulkan is also underway, so all in all, from a relevancy standpoint, Vulkan isn't very late to the game in that regard.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Apart from UWP ports what game has shipped with DX12? I think its 0 which means DX12 is not here.


AotS is the Crysis of DX12, but that's it. And it's not even a very good game.

Even the first UWP titles were not much more than games wrapped in it: Gears of War was based on the 2006 UE3 DX9 game, and Quantum Break was made with DX11 first, as yesterday's Steam release proves.


----------



## PontiacGTX

Quote:


> Originally Posted by *tpi2007*
> 
> AotS is the Crysis of DX12, but that's it. And it's not even a very good game.
> 
> Even the first UWP titles were not much more than games wrapped in it: Gears of War was based on the 2006 UE3 DX9 game, and Quantum Break was made with DX11 first, as yesterday's Steam release proves.


You can't compare Crysis to AotS. Crysis was an unoptimized sandbox FPS that introduced a good weapon-upgrade system, had awesome multiplayer, and, unlike Crysis 2, really exploited new graphics features. AotS is a well-optimized RTS that showcases the latest features of the new API, though not the latest graphics features, and keeps most of the Supreme Commander concept. Both are good games; whether people like the genre or the style is largely subjective.


----------



## tpi2007

Quote:


> Originally Posted by *PontiacGTX*
> 
> You can't compare Crysis to AotS. Crysis was an unoptimized sandbox FPS that introduced a good weapon-upgrade system, had awesome multiplayer, and, unlike Crysis 2, really exploited new graphics features. AotS is a well-optimized RTS that showcases the latest features of the new API, though not the latest graphics features, and keeps most of the Supreme Commander concept. Both are good games; whether people like the genre or the style is largely subjective.


I'm not comparing them. I'm just saying that both positioned themselves as frontrunners to promote a new DirectX version.

Of course they are different games, but I didn't compare them to one another; I just said that AotS isn't that good a game in itself.

According to averaged review scores from both critics and users (which also holds on GOG, 3.5 stars, and Steam, 68% positive reviews), AotS is generally seen as not that great a game.

Meanwhile, and not comparing them to each other, Crysis is generally seen as a good game, despite its fame as a benchmark.



http://www.metacritic.com/game/pc/ashes-of-the-singularity



http://www.metacritic.com/game/pc/crysis


----------



## boredgunner

Quote:


> Originally Posted by *tpi2007*
> 
> I'm not comparing them. I'm just saying that both positioned themselves as frontrunners to promote a new DirectX version.
> 
> Of course they are different games, but I didn't compare them to one another, I just said that AotS wasn't that good of a game in itself.
> 
> According to the averaged review scores by both critics and users (and that also applies to GOG - 3.5 stars and Steam - 68% positive reviews) AotS is generally seen as not that great of a game.
> 
> Meanwhile, and not comparing them to each other, Crysis is generally seen as a good game, despite the fame of being a benchmark.
> 
> 
> 
> http://www.metacritic.com/game/pc/ashes-of-the-singularity
> 
> 
> 
> http://www.metacritic.com/game/pc/crysis


Games like these are needed too, I hope others realize this. I'm glad Crysis also happened to be one of the better shooters around. AotS isn't known for anything other than its DX12, while Crysis was simultaneously known for its fun sandbox physics-based gameplay even if this was shamefully forgotten over time.


----------



## lombardsoup

Quote:


> Originally Posted by *boredgunner*
> 
> Games like these are needed too, I hope others realize this. I'm glad Crysis also happened to be one of the better shooters around. AotS isn't known for anything other than its DX12, while Crysis was simultaneously known for its fun sandbox physics-based gameplay even if this was shamefully forgotten over time.


The original Crysis was good. The sequels went more of the COD on-rails route, and in doing so lost a good portion of what made the first game fun to play.


----------



## NightAntilli

Shooters are simply way more popular than RTS games.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NightAntilli*
> 
> Shooters are simply way more popular than RTS games.


Because the current flavor of RTS is the MOBA, which is way more popular than shooters on PC.


----------



## DracoNB

They updated Deus Ex: MD today with full DX12 support, as well as a beta version of MGPU DX12 support!


----------



## lynxxyarly

Quote:


> Originally Posted by *DracoNB*
> 
> They updated Deus Ex MD today with full DX12 support as well as beta version of MGPU DX12 support!


Thanks for that heads-up. I will have to check that out tonight and see how it runs!


----------



## y2kcamaross

With the beta mGPU branch enabled, my game crashes immediately on startup


----------



## the9quad

CFX works flawlessly for me in DX12. Two very old 290x's getting a solid 60fps at 1440p ultra settings.


----------



## the9quad

Quote:


> Originally Posted by *y2kcamaross*
> 
> With the beta mGPU branch enabled, my game crashes immediately on startup


If you are running RTSS, turn it off or make sure you grab the latest version; that might help.


----------



## Ha-Nocri

Deus Ex got multi GPU support... only for AMD tho. 480 (CF) vs 1060 (x2)


----------



## scorch062

The 1440p scaling results for the RX 480 are pretty amazing. Weird how the 1060 does not work with a DX12 multi-GPU setup; it worked properly in Ashes of the Singularity.


----------



## ku4eto

Quote:


> Originally Posted by *scorch062*
> 
> The 1440p scaling results for RX 480 is pretty amazing. Weird how 1060 does not work with DX12 multi-GPU setup, it worked properly with Ashes of the Singularity.


Yeah, 100% scaling is insane. At 1080p there seems to be a CPU bottleneck.
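For anyone who wants to sanity-check a "100% scaling" claim, here is a minimal Python sketch. The fps figures in it are hypothetical placeholders chosen purely for illustration (they are not the GameGPU results), but they show how a frame-rate cap at 1080p would surface as reduced scaling:

```python
def scaling_efficiency(single_fps, dual_fps):
    """Multi-GPU scaling: 1.0 means the second card fully doubles the frame rate."""
    return dual_fps / single_fps - 1.0

# Hypothetical figures, for illustration only:
# at 1440p the pair is GPU-bound; at 1080p something else caps the frame rate.
single_1440, dual_1440 = 40.0, 80.0   # second GPU doubles fps
single_1080, dual_1080 = 70.0, 105.0  # second GPU adds only half

print(scaling_efficiency(single_1440, dual_1440))  # 1.0 -> "100% scaling"
print(scaling_efficiency(single_1080, dual_1080))  # 0.5 -> look for a CPU/engine cap
```

When the dual-GPU number at the lower resolution stops tracking twice the single-GPU number while the higher resolution still scales perfectly, the GPUs are no longer the limiting factor at that resolution.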


----------



## ToTheSun!

Quote:


> Originally Posted by *ku4eto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *scorch062*
> 
> The 1440p scaling results for RX 480 is pretty amazing. Weird how 1060 does not work with DX12 multi-GPU setup, it worked properly with Ashes of the Singularity.
> 
> 
> 
> Yea, 100% scaling is insane. On 1080p seems to be a CPU bottleneck.
Click to expand...

Considering DX12 is supposed to be kinder on the CPU than DX11, and also that the framerates tested at 1080p weren't even that high, I seriously doubt it's a CPU "bottleneck". And that's disregarding how non-black-and-white hardware bottlenecking really is.

It might just be a case of GPU resources being underutilized at 1080p and fully utilized at 1440p.

Perhaps, as RTG said, the RX 480 really is a 1440p card, at least according to their own idea of what that statement means.


----------



## Mahigan

And now we see DX12 fixed under Deus Ex and delivering impressive multi-GPU scaling, FPS numbers and frame times.

BF1 is a lot of fun as well. I've been running it under DX12 and can't wait for full Multi-GPU to be added. When that happens... two R9 290x's will become competitive against NVidia's latest and greatest.

Vega can't come soon enough...


----------



## huzzug

Quote:


> Originally Posted by *Mahigan*
> 
> And now we see DX12 fixed under Deus Ex and delivering impressive multi-GPU scaling, FPS numbers and frame times.
> 
> BF1 is a lot of fun as well. I've been running it under DX12 and can't wait for full Multi-GPU to be added. When that happens... two R9 290x's will become competitive against NVidia's latest and greatest.
> 
> Vega can't come soon enough...


If the gains we're seeing are real, AMD might hold off on Vega if Nvidia's latest gen can't compete with AMD's two-generation-old cards.


----------



## h4rdcor3

Quote:


> Originally Posted by *huzzug*
> 
> If what gains we see are real, AMD might hold off doing Vega if Nvidia's latest gen can't compete with AMD's 2 gen old card.


I doubt it. AMD still doesn't have anything in the high end. The Fury X is their "high end" right now and gets rocked by a 1080. AMD needs something at least in the 1070 performance range. My 290X was released three years ago and still matches every single-card solution AMD has put out besides the Fury line.


----------



## daviejams

Quote:


> Originally Posted by *h4rdcor3*
> 
> I doubt it. Amd still doesn't have anything in the high end. The Fury X is their "high end" right now and gets rocked by a 1080. AMD needs something in at least the 1070 performance range. My 290x was released 3 years ago and it still matches every single card solution AMD has put out besides the Fury line.


You should be happy about that; a 290X three years ago was a very smart choice of GPU.

I'm sure they will release something at the beginning of next year that competes with the 1070/1080.


----------



## Ghoxt

Steam News: Multi-GPU support for DirectX 12 for NVIDIA GeForce is officially released! (Nov 17th)

Quote:


> Today we are officially releasing Multi-GPU support for DirectX 12 for NVIDIA GeForce users. This will mean that the dx12_mgpu_preview Beta branch will be removed.
> 
> We strongly advise players with NVIDIA GeForce Multi-GPU capable systems to make sure that you have the latest driver installed and confirm that your SLI is active in your NVIDIA Control Panel. NVIDIA's latest driver (375.86) will focus on fixing stability issues for Multi-GPU support for DirectX 12.


----------



## Rabit

1080p, Very High preset, results from the built-in benchmark
i5 3470 @ 4 GHz (Turbo 4.22 GHz)
R9 290X 1050/1260, -41 mV vcore, +5% power limit; power draw in GPU-Z: max 184 W, avg 170 W.

DX11
47.3 avg
13.8 min
63.4 max

DX12
54.6 avg
46.1 min
61.6 max

For those of us who don't have the latest/fastest CPU on the market, DX11 is dead.
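Running the DX11 vs DX12 numbers above through a quick Python sanity check makes the point concrete: the average barely moves, but the minimum is transformed:

```python
# Built-in benchmark results from the post above (i5 3470 + R9 290X, 1080p Very High).
dx11 = {"avg": 47.3, "min": 13.8, "max": 63.4}
dx12 = {"avg": 54.6, "min": 46.1, "max": 61.6}

def pct_change(old, new):
    """Relative change in percent from the DX11 figure to the DX12 figure."""
    return (new - old) / old * 100.0

for key in ("avg", "min", "max"):
    print(f"{key}: {pct_change(dx11[key], dx12[key]):+.1f}%")
# avg: +15.4% | min: +234.1% | max: -2.8%
```

The max fps drops slightly, but the minimum more than triples, which is exactly what you'd expect from an API that relieves a CPU-side limit rather than making the GPU faster.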


----------



## Ghoxt

Quote:


> Originally Posted by *Rabit*
> 
> 1080p, Very High preset, results from the built-in benchmark
> i5 3470 @ 4 GHz (Turbo 4.22 GHz)
> R9 290X 1050/1260, -41 mV vcore, +5% power limit; power draw in GPU-Z: max 184 W, avg 170 W.
> 
> DX11
> 47.3 avg
> 13.8 min
> 63.4 max
> 
> DX12
> 54.6 avg
> 46.1 min
> 61.6 max
> 
> For those of us who don't have the latest/fastest CPU on the market, DX11 is dead.


Whoa that DX12 "Min" is NICE! Basically no drop off


----------



## TopicClocker

Quote:


> Originally Posted by *Rabit*
> 
> 1080p, Very High preset, results from the built-in benchmark
> i5 3470 @ 4 GHz (Turbo 4.22 GHz)
> R9 290X 1050/1260, -41 mV vcore, +5% power limit; power draw in GPU-Z: max 184 W, avg 170 W.
> 
> DX11
> 47.3 avg
> 13.8 min
> 63.4 max
> 
> DX12
> 54.6 avg
> 46.1 min
> 61.6 max
> 
> For those of us who don't have the latest/fastest CPU on the market, DX11 is dead.


Why is the minimum so low on DX11? Was that like a 1 or 2 second thing?


----------



## Rabit

Quote:


> Originally Posted by *TopicClocker*
> 
> Why is the minimum so low on DX11? Was that like a 1 or 2 second thing?


Yes, short slowdowns every 5-10 seconds.
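That pattern, brief hitches dragging the reported minimum down, is easy to reproduce on paper. A small Python sketch with hypothetical frame times (made-up numbers, not measured data) shows how a single long frame sets the minimum while leaving the average almost untouched:

```python
# Hypothetical one-second sample: 59 steady frames at 16.9 ms plus a single
# 72 ms hitch, mimicking the short DX11 slowdowns described above.
frame_times_ms = [16.9] * 59 + [72.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame defines "min fps"

print(round(avg_fps, 1))  # 56.1 -> the average barely notices the hitch
print(round(min_fps, 1))  # 13.9 -> one spike alone produces a 290X-style low "min"
```

So a "min" in the teens doesn't mean the game ran in the teens; it means at least one frame took that long, which matches the "short slowdowns" described.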


----------



## TopicClocker

Quote:


> Originally Posted by *Rabit*
> 
> Yes short slow downs each 5-10 sec


Oh I see.

DX12 looks very promising. I'm glad to see low-level APIs taking off on PC!


----------



## Nameless1988

Quote:


> Originally Posted by *Rabit*
> 
> 1080p, Very High preset, results from the built-in benchmark
> i5 3470 @ 4 GHz (Turbo 4.22 GHz)
> R9 290X 1050/1260, -41 mV vcore, +5% power limit; power draw in GPU-Z: max 184 W, avg 170 W.
> 
> DX11
> 47.3 avg
> 13.8 min
> 63.4 max
> 
> DX12
> 54.6 avg
> 46.1 min
> 61.6 max
> 
> For those of us who don't have the latest/fastest CPU on the market, DX11 is dead.


It's not maxed out. Try max settings without MSAA


----------



## boredgunner

Quote:


> Originally Posted by *Nameless1988*
> 
> It's not maxed out. Try max settings without MSAA


You can try for kicks, but even if the performance hit weren't there, MSAA would be nearly worthless in this game, as in most other modern games. MSAA doesn't get rid of shader aliasing, which is the main source of aliasing in today's games, so developers needn't even bother with it anymore. I'm glad they have TAA, even if it's not close to the best TAA implementation (that would be Unreal Engine 4's). TAA and supersampling (for future use) are all we need in games, plus maybe SMAA/MLAA for people on integrated laptop graphics.
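For anyone wondering why MSAA can't touch shader aliasing while supersampling can, a simplified Python model of per-frame shading cost (ignoring edge-coverage resolve and memory bandwidth) makes the distinction explicit:

```python
# Pixel-shader invocations per frame at 1920x1080. MSAA multiplies the
# coverage/depth samples but still runs the pixel shader once per pixel,
# which is why it cannot reduce shader (specular/texture) aliasing;
# supersampling shades every sample, so it can.
PIXELS = 1920 * 1080

def shaded_fragments(mode, samples=4):
    if mode in ("none", "msaa"):
        return PIXELS            # one shader invocation per pixel
    if mode == "ssaa":
        return PIXELS * samples  # one shader invocation per sample
    raise ValueError(mode)

print(shaded_fragments("msaa"))  # 2073600 -> same shading cost as no AA
print(shaded_fragments("ssaa"))  # 8294400 -> 4x the shading work at 4x
```

That is also why MSAA's framerate hit in deferred-heavy modern engines comes mostly from bandwidth and resolve cost, without fixing the aliasing people actually see.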


----------



## ku4eto

With my desktop rig I'm getting an average of 58 and a minimum of 43 in the benchmark. High preset, with some custom settings turned up to Very High.


----------



## Nameless1988

Quote:


> Originally Posted by *boredgunner*
> 
> You can try for kicks, but even if the performance hit wasn't there MSAA would be worthless in the game as well as most other modern games. MSAA doesn't get rid of shader aliasing which is the main source of aliasing in today's games. Developers need to not even bother with MSAA anymore. I'm glad they have TAA, even if it's not even close to the best TAA implementation (that would be Unreal Engine 4). TAA and supersampling (for future use) is all we need in games, and maybe SMAA/MLAA for people using integrated graphics on laptops or whatever.


In fact, I wrote "without MSAA" because it kills the framerate.


----------



## boredgunner

Quote:


> Originally Posted by *Nameless1988*
> 
> In fact I wrote "without msaa" because it kills framerate.


Wow, my mistake. Definitely. I still wonder why it was included in the game in the first place.


----------



## grifers

Crossfire sucks here in DirectX 12 and 11 with a 295X2 + 290X (TriFire) in this game, everything set to ultra without MSAA. What is up with this game? Other games and benchmarks run fine, all GPUs at 100%, so why not this one? Is there any way to force Crossfire, any tweak? I don't understand.

P.S. Latest drivers here.


----------



## Rabit

My MSI R9 290X is probably RIP (waiting on new thermal pads, and even in 2D I see artifacts and flashing), so these results are from my PowerColor R7 260X.

R7 260X 2 GB results (avg / min / max FPS):

| API | Preset | Core/Mem (MHz) | Avg | Min | Max | Notes |
|------|--------|----------------|------|------|------|-------|
| DX12 | High | 1200/1470 | 22.9 | 13.3 | 35.7 | |
| DX12 | High | 1200/1485 | 25.1 | 17.9 | 32.3 | |
| DX11 | High | 1200/1475 | 23.9 | 14.5 | 34.1 | memory controller load up to 100%, often 95%+ |
| DX11 | High | 1200/1485 | 24.2 | 16.7 | 32.3 | memory controller load up to 99%, often 93%+ |
| DX12 | Medium | 1200/1485 | 30.1 | 22.3 | 39.4 | |
| DX12 | Medium + textures on High | 1200/1485 | 29.9 | 22.1 | 38.3 | |


----------



## shadow85

Does DX12 still not have dual-GPU support in Deus Ex: MD?


----------



## Zaen

WOOOOOW so much variation in performance...







Wasn't this game built around AMD? I see the AMD logo when launching it. Maybe the "bugginess" on AMD cards is caused by DX12; I believe AMD is friendlier with Vulkan than with DX. Maybe I'm wrong... ^_^

Personally I've been playing this at Ultra settings without checking the frame rate. I don't have a program installed for it except RivaTuner, which I haven't used in a long time, because I didn't run into any artifacts, frame skips, freezes, or stutter of any kind.

The game's benchmark at Ultra settings gives me an average of 42+ FPS, a minimum of 33+, and a maximum of 58+, and that's without an OC on the GPU, just on the CPU.

You can check my build below if you're curious about what I'm using.


----------



## diggiddi

How does crossfire work for this title?


----------



## Ha-Nocri

Quote:


> Originally Posted by *diggiddi*
> 
> How does crossfire work for this title?


It's probably the best multi-GPU implementation so far, actually.


----------



## ku4eto

Quote:


> Originally Posted by *Ha-Nocri*
> 
> It's probably the best multi-GPU implementation so far, actually.


I gotta try DX12 async Crossfire with the 480 and 470 (4 GB versions). I think it will work out pretty well.


----------



## Bal3Wolf

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> How does crossfire work for this title?
> 
> 
> 
> It's probably the best multi-GPU implementation so far, actually.
Click to expand...

So Crossfire works on newer AMD cards but not older ones? I never could get Crossfire to work with my 7970s in this game.


----------



## diggiddi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> It's probably the best multi-GPU implementation so far, actually.


So Xfire works for 290x too?


----------



## diggiddi

Is this a decent score for an FX 8350 @ 4.2 GHz, 1600 MHz RAM, and Crossfire 290X?

BTW, the game is on Steam sale


[benchmark screenshots]


----------



## ku4eto

Quote:


> Originally Posted by *diggiddi*
> 
> Is this a decent score for an FX 8350 @ 4.2 GHz, 1600 MHz RAM, and Crossfire 290X?
> 
> BTW, the game is on Steam sale
> 
> 
> [benchmark screenshots]


Using the Ultra preset I got a maximum of 72 and a slightly higher average, but the minimums were down to 3 (probably due to the HDD being so slow). That's with a 480 and a 470 in Crossfire (4 GB).


----------



## OCAddict

From my own gaming experience, once past the graphics oohs and aahs, the game plays much smoother in DX11. I guess that's the difference with a game designed for DX11 being patched to run DX12; we saw exactly the same thing with Crysis 2.


----------



## diggiddi

So DX12 looks better?


----------



## OCAddict

Quote:


> Originally Posted by *diggiddi*
> 
> So DX12 looks better?


The cloth texturing was impressive; that's about all I really noticed. But once the glitches start showing up, hurting smooth gameplay with lockups and freezes, it's not worth running in DX12. Of course, they may patch all those issues and it may run great later on.

It will be nice when we get a game designed around DX12 from the start, but that's probably some time off, and by then a new DX will be arriving as well.


----------



## epic1337

It's because this isn't native DX12; beneath it, the game is still running on the DX11 D3D runtime with a DX12 wrapper on top.
Using the DX12 wrapper only shortens the rendering path and allows "some" DX12 features to be used.

In any case, since the underlying API being used is still DX11, the overall benefit is minimal, and in some cases it's even worse.
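If the layering really is like that, the limitation is easy to picture with a toy model (all class and method names below are invented for illustration; this is not real D3D code): a DX12-style front end can record draws and batch the submission, but every recorded draw still pays the old runtime's per-draw cost when the commands actually execute.

```python
# Toy model of a "DX12-style" front end wrapping a "DX11-style" backend.
# All names here are made up for illustration.

class DX11Backend:
    """Stands in for the old-style runtime: every draw still does
    full per-call validation and state work."""
    def __init__(self):
        self.validations = 0   # expensive per-draw work
        self.draws = 0

    def draw(self, mesh):
        self.validations += 1  # the cost the wrapper can't remove
        self.draws += 1


class DX12StyleWrapper:
    """Records draws into a command list and submits them in one go.
    The submission path is shorter, but each recorded draw still hits
    the backend's per-draw work on execute."""
    def __init__(self, backend):
        self.backend = backend
        self.command_list = []
        self.submissions = 0   # cheap: one submit per frame

    def record_draw(self, mesh):
        self.command_list.append(mesh)

    def execute(self):
        self.submissions += 1
        for mesh in self.command_list:
            self.backend.draw(mesh)  # underlying cost unchanged
        self.command_list.clear()


backend = DX11Backend()
wrapper = DX12StyleWrapper(backend)
for mesh in range(1000):          # one frame with 1000 draw calls
    wrapper.record_draw(mesh)
wrapper.execute()

print(wrapper.submissions)   # prints 1    -> the part the wrapper shortens
print(backend.validations)   # prints 1000 -> the part it can't touch
```

That asymmetry is why a wrapped title sees only modest gains: the cheap submission path gets cheaper, but the expensive per-draw path underneath is untouched, and the extra translation layer can even add overhead.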


----------

