# [Bethesda.net] DOOM now updated with Vulkan!



## PlugSeven

Source
Quote:


> DOOM - Vulkan Support Now Live
> 
> At id Software, we've always pushed technology. With DOOM we let the game drive the technology decisions from early on. This has continued even in post-release, with new updates and more. Today we're excited to share another big technology push: Vulkan support is now live on PC.
> 
> When we were looking to adopt Vulkan for DOOM, the main question we asked ourselves was: "What's the gamer benefit?" Ultimately the biggest benefit will be high framerates. There are a number of game-focused reasons super-high framerates matter, but primarily it's movement and player feel. The game just feels amazing running that fast, so we made it a priority to try to really exploit the available hardware on PC…


----------



## MuscleBound

Does this Vulkan work with Nvidia cards??


----------



## Imglidinhere

Hopefully this means I can do better than 40 FPS now. ^^;


----------



## Robenger

Quote:


> Originally Posted by *MuscleBound*
> 
> Does this Vulkan work with Nvidia cards??


Yes. But not fully with Async Compute.


----------



## Masterchief3k

I wonder if the demo will get Vulkan support. It'd be nice to at least try it. I get 60 FPS through the whole demo on ultra with my 960. No Nightmare settings available, though, sadly.


----------



## ChevChelios

Quote:


> KNOWN ISSUES
> 
> DOOM crashes as soon as the game starts when running Vulkan on the NVIDIA GTX 690.
> 
> The NVIDIA GTX 690 is not currently supported to run Vulkan on DOOM. Users with this GPU will need to run the game on OpenGL.
> 
> DOOM crashes, is stuck on a black screen at launch or exhibits severe graphical corruption on supported NVIDIA GPUs that have 2GB VRAM when running Vulkan on Windows 7.
> 
> Vulkan is not currently supported on NVIDIA GPUs with 2 GB of RAM on Windows 7. Users with these GPUs need to run DOOM on the OpenGL graphics API.
> 
> Performance may suddenly decrease when entering Object Mode in the SnapMap editor while running the Vulkan API.
> 
> If this issue occurs, exiting Object mode back to Blueprint mode, before going back into Object Mode may correct the issue. If the problem persists, we recommend you run the SnapMap editor on OpenGL (you'll still be able to play your maps on Vulkan).
> 
> While running the Vulkan API, DOOM crashes when taking a screenshot using the Steam overlay on AMD GPUs.
> 
> This is a known issue with Vulkan API and AMD GPUs.
> 
> The game may crash on some AMD GPUs when locking the computer.
> 
> This is a known issue while running DOOM on the Vulkan API.
> 
> While running the Vulkan API on an NVIDIA GTX 970, strange black checkering or primarily black screens appear after changing resolution or display mode settings.
> 
> Graphical corruption may occur after resizing the game window while in windowed mode, and then switching the display mode from windowed to full screen. If this occurs, restarting the game application should correct the issue.
> 
> Launching SnapMap may take longer on the Vulkan API.
> 
> It may take longer for some users to launch SnapMap on Vulkan than it does on OpenGL.
> 
> Some AI pathing issues may exhibit on SnapMaps while running the Vulkan API.
> 
> We are currently investigating a fix for this to include in an upcoming patch.
> 
> Windows 10 users running OpenGL or Vulkan on the AMD R9 Fury X and R9 390X at 1080p or higher may experience framerates below 60 FPS with Vertical Sync enabled.
> 
> While we are working on a fix for this with AMD, affected users should run the game with Vertical Sync disabled.
> 
> When using the Steam controller, why can't I rotate the automap with the right trackpad?
> 
> This is a known issue with the Steam Controller. An alternate control method can be used to rotate the map in the automap interface.


----------



## Phaethon666

Not a bad boost to performance. Has anyone here tested it out?


----------



## jprovido

Quote:


> Originally Posted by *Phaethon666*
> 
> Not a bad boost to performance. Has anyone here tested it out?




5820K @ 4.7 GHz, GTX 1080 @ 2050 MHz core / 11 GHz memory, everything maxed out at 1440p with max AA. I get a solid 144 FPS now with Vulkan. GPU load went down to the high 80s to low 90s; it used to be 99% all the time.


----------



## Phaethon666

Any comparison against DX11 with your rig? That's some sweet performance, btw.


----------



## 8472

Logical.


----------



## sugarhell

http://www.guru3d.com/news-story/new-patch-brings-vulkan-support-to-doom,2.html


----------



## hyp36rmax

OMG YES!!!! does it take advantage of SLI and Crossfire?


----------



## JackCY

DOOM doesn't have DX11; it's id Software, so no DX.

Well, finally, Guru3D. I've been searching for stuff all day and reviewers couldn't get anything up so far.


----------



## Newbie2009

Comparison: I picked this part because ultra shadows kill performance, so this is a minimum-FPS example. 290X, single card, stock. Oh, and these are at 1600p, not 1080p. Crossfire doesn't work (but I don't need it now).


----------



## daviejams

Yup, runs much better on my 290X, like 20 FPS better.

This is the kind of improvement we've all been waiting for DX12 to deliver.


----------



## Newbie2009

Quote:


> Originally Posted by *daviejams*
> 
> Yup runs much better on my 290x - like 20fps better
> 
> This is the kind of improvement we've all been waiting for DX12 to deliver


Seriously impressive. I mean seriously. I can't believe it.


----------



## xioros

All hail AMD.

Now bloody release Vega! I need a new GPU.


----------



## Newbie2009

Quote:


> Originally Posted by *xioros*
> 
> All hail AMD.
> 
> Now bloody release Vega! I need a new GPU.


To hell with that. Make every game Vulkan; I won't need to upgrade at all!


----------



## scorch062

My 380X got a massive performance boost: from an average of 40-50 FPS, with dips into the 30s and even rare mid-20s, on a mix of high-medium settings (CPU bottleneck), I now get an average of 60-70 with the lowest dips in the mid-50s on ultra. Very impressive!


----------



## JackCY

Quote:


> Originally Posted by *Newbie2009*
> 
> To hell with that. Make every game vulcan, I won't need to upgrade at all!


Amen.

That CPU bottleneck removal and the ability to properly feed AMD GPUs should finally start showing up, as long as devs don't mess it up and the comparisons are done with equal quality settings.


----------



## spyshagg

Nice, very nice.

The future is bright when your card properly handles it.


----------



## jprovido

Placebo effect? I feel like DOOM is much faster now with my 1080, but from what I've read, performance should be the same.


----------



## NFL

Quote:


> Originally Posted by *jprovido*
> 
> placebo effect? I feel like doom is much faster now with my 1080 but after reading performance should be the same


Perhaps an improvement in frame times as well? We won't know for sure until DF does their breakdown.


----------



## jprovido

Quote:


> Originally Posted by *NFL*
> 
> Perhaps an improvement in frame times as well? Won't know for sure though until DF does their breakdown


I was already hitting a stable 144 FPS on OpenGL with minor dips (in explosions etc. it goes down to 130-ish); now it's more stable at 144 FPS. Man, I don't know, lol. I'll wait for more reviews.


----------



## jsc1973

Quote:


> Originally Posted by *Newbie2009*
> 
> To hell with that. Make every game vulcan, I won't need to upgrade at all!


Make them all Vulkan just to eliminate a proprietary API. It's time for DX to just die already.


----------



## Pro3ootector

http://www.legitreviews.com/doom-gets-vulkan-implementation_183882



Bought RX 480 recently. I am not disappointed with this card. Not even a little bit.


----------



## TFL Replica

Cool, but disappointing to see no positive gains on Nvidia GPUs. Maybe they need to gimp their OpenGL performance so Vulkan looks comparatively better (I'm just kidding).


----------



## ZealotKi11er

Quote:


> Originally Posted by *TFL Replica*
> 
> Cool, but disappointing to see no positive gains on Nvidia GPUs. Maybe they need to gimp their OpenGL performance so Vulkan looks comparatively better (I'm just kidding).


Almost all the advantages Nvidia has had through DX11 and OpenGL over the years have been closed off by DX12 and Vulkan.


----------



## Cybertox

These are some very significant performance improvements: 10-20 frames. Impressive. Not looking good for Nvidia, though; if it's going to be the same case with the Titan and the 1080 Ti, I might have to stick with AMD.


----------



## orlfman

https://community.bethesda.net/thread/54585
Quote:


> Asynchronous compute is a feature that provides additional performance gains on top of the baseline id Tech 6 Vulkan feature set.
> 
> Currently asynchronous compute is only supported on AMD GPUs and requires DOOM Vulkan supported drivers to run. We are working with NVIDIA to enable asynchronous compute in Vulkan on NVIDIA GPUs. We hope to have an update soon.


Async is disabled on Nvidia GPUs.


----------



## sugarhell

Quote:


> Originally Posted by *orlfman*
> 
> https://community.bethesda.net/thread/54585
> Async is disabled on Nvidia GPUs.


Even id is waiting for that nvidia Async driver


----------



## Mhill2029

As much as I welcome this, DOOM already runs at stupidly high frame rates and is amazingly fluid without Vulkan.


----------



## Dudewitbow

Quote:


> Originally Posted by *Mhill2029*
> 
> As much as I welcome this, DOOM already runs at stupidly high frame rates and is amazingly fluid without Vulkan.


It lets those who have high-refresh-rate monitors enjoy the game more. It also helps low-end GCN setups get better frame rates for 60 Hz monitors.

What I'm more curious about is whether any site has tested GCN 1.0 GPUs' performance gains, as it's the generation with the fewest ACE units.


----------



## Ha-Nocri

1080 OpenGL vs Vulkan @1440p:

Looks like OpenGL is faster. But the guy says Vulkan wins @1080p... which is kinda pointless.


----------



## jlucio

Quote:


> Originally Posted by *Mhill2029*
> 
> As much as I welcome this, DOOM already runs at stupidly high frame rates and is amazingly fluid without Vulkan.


The point here is that DOOM shows we don't need DirectX 12 to use our video cards to their full potential. Vulkan does that just fine.

Valve, Bethesda... which big studio will be the next to use Vulkan? DirectX's days might be numbered.


----------



## magnek

Quote:


> Originally Posted by *ChevChelios*


Is that for Bethesda or nVidia (or both)?


----------



## Fancykiller65

DigitalFoundry had a DOOM comparison video between the 390 (X?) and the 970, where they were basically equal. I would like to see how they do on the Vulkan drivers. Does the 390 pull ahead of the 970?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Fancykiller65*
> 
> DigitalFoundry had a Doom comparison video between the 390(x?) and the 970, where they were basically equal. I would like to see how they do on the vulkan drivers. Does the 390 pull ahead of the 970?


Well, it should, like in all DX12 games. I'm expecting the 390 to be 20% faster; that's the boost it should get from Vulkan, judging from the 480 benchmarks. I'm waiting for the DF video too.


----------



## ZealotKi11er

Most of what Vulkan and DX12 do for AMD GPUs is bring their 4K-level performance down to 1440p and 1080p, resolutions that were CPU-limited before. Nvidia doesn't have these problems with DX11 or OpenGL, so the gains aren't there. Async only adds performance on top of that. The only time Nvidia will see DX12 improvements is when a game is made for DX12 only, but we'll never know, because then we can't compare it to DX11. Right now, none of the DX12/Vulkan games are true DX12/Vulkan.


----------



## OneB1t

Even in 4K you get a boost under Vulkan, so it's not just a CPU limitation.


----------



## Ha-Nocri

Quote:


> Originally Posted by *OneB1t*
> 
> Even in 4K you get a boost under Vulkan, so it's not just a CPU limitation.


Yeah. AMD cards can do two things at the same time, asynchronously.


----------



## pengs

Quote:


> Originally Posted by *Fancykiller65*
> 
> DigitalFoundry had a Doom comparison video between the 390(x?) and the 970, where they were basically equal. I would like to see how they do on the vulkan drivers. Does the 390 pull ahead of the 970?


Most definitely; it should easily. I'm expecting the 390X to be slightly faster than the 980, as it should be. The 390 should be quite a bit faster than the 970 now, given that it was only about 10% behind it without Vulkan.


----------



## doza

How do you guys test FPS with Vulkan?

With OpenGL 4.5 on ultra (1080p), max AA, etc., I'm getting 90 to 130-ish FPS, but when I switch to the Vulkan API, RivaTuner from Afterburner doesn't show, and Fraps also isn't showing info in-game.


----------



## Newbie2009

Quote:


> Originally Posted by *doza*
> 
> How do you guys test FPS with Vulkan?
> 
> With OpenGL 4.5 on ultra (1080p), max AA, etc., I'm getting 90 to 130-ish FPS, but when I switch to the Vulkan API, RivaTuner from Afterburner doesn't show, and Fraps also isn't showing info in-game.


It's in DOOM's advanced options or video options, down at the bottom of the list.


----------



## dagget3450

Did anyone check the PCIe power draw on AMD GPUs with these gains? It stands to reason that they're drawing more power and thus blowing up motherboards all over the place.

I can't wait to test this on my old 1366 platform with the Fury X. I wonder why they took so long to release this.


----------



## rcfc89

So they can do all this and still no SLI support. Oh, Bethesda, you just continue to disappoint.


----------



## Horsemama1956

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Most of what Vulkan and DX12 do for AMD GPUs is bring their 4K-level performance down to 1440p and 1080p, resolutions that were CPU-limited before. Nvidia doesn't have these problems with DX11 or OpenGL, so the gains aren't there. Async only adds performance on top of that. The only time Nvidia will see DX12 improvements is when a game is made for DX12 only, but we'll never know, because then we can't compare it to DX11. Right now, none of the DX12/Vulkan games are true DX12/Vulkan.


1440p isn't a CPU-limited resolution.


----------



## umeng2002

Nvidia is pretty good at optimizing their drivers, so I'm not shocked they see no gains with Vulkan.

But AMD is benefiting from shedding their crappy DX11/OpenGL driver overhead AND from async compute.

I hope this is just a foreshadowing of things to come with DX12, Vulkan, and async compute.


----------



## doza

Quote:


> Originally Posted by *Newbie2009*
> 
> It's in DOOM's advanced options or video options, down at the bottom of the list.


Ahahaha, thanks man, I'm so off these days. I've had this game for a week now, and only now have I learned how to switch APIs in-game and show FPS info.

This heat is killing my brain.

FPS is like 90% the same with Vulkan...


----------



## dagget3450

Quote:


> Originally Posted by *Horsemama1956*
> 
> 1440P isn't a cpu limited resolution.


On older CPUs? I think it can be, especially in DX11 on AMD.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Horsemama1956*
> 
> 1440P isn't a cpu limited resolution.


It depends on the game. It is for AMD cards, especially the 290X and up. GPUs have gotten faster while CPUs have not.


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Right now all DX12/Vulkan games are not true DX12/Vulkan.


Can you explain why DOOM is not a true Vulkan game? I don't believe it's in the same boat as The Talos Principle. Do you have insider info on how Vulkan is implemented in DOOM that makes it not a true Vulkan game? I'd be interested to hear it.


----------



## hyp36rmax

Quote:


> Originally Posted by *the9quad*
> 
> Can you explain why DOOM is not a true Vulkan game? I don't believe it is in the same boat as The Talos Principle. Do you have insider info on how Vulkan is implemented in DOOM to where it is not a true Vulkan game? Be interested to hear this info.


I'm curious about this also... Is it because it wasn't originally released with Vulkan when it went gold? I was always under the impression this was always a Vulkan title with two versions available, OpenGL and Vulkan, with the latter being held back for whatever reason (marketing, tech, tuning...).


----------



## p4inkill3r

Quote:


> Originally Posted by *dagget3450*
> 
> Did anyone check the PCIE power draw on AMD gpu's with these gains? It stands logically that they are using more power and thus blowing up main boards all over the place.
> 
> I cant wait to test this on my old 1366 platform with furyx, So i wonder why they took so long to release this.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Can you explain why DOOM is not a true Vulkan game? I don't believe it is in the same boat as The Talos Principle. Do you have insider info on how Vulkan is implemented in DOOM to where it is not a true Vulkan game? Be interested to hear this info.


I would think a true DX12 or Vulkan game would take advantage of these APIs in such a way that DX11 would not be able to handle the engine. Right now games are still built with DX11 limitations in mind and ported to DX12/Vulkan. Whatever increase, or lack of increase, in performance we see is all a bonus right now.


----------



## Marios145

Is 4K CPU-limited??????

source
From 31 to 36 FPS, that's about a 16% gain in a GPU-limited scenario; I'm now wondering how Fury scales.
I believe that DOOM on Vulkan (and OpenGL too) is a perfect example of an actual game that takes full advantage of GPUs and CPUs.

It's interesting, though, that Nvidia yet again has no gains with a low-level API, only this time the baseline is OpenGL. I was expecting *some* gain; maybe frame times/dips improved?

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would think a true DX12 or Vulkan game would take advantage of these API in such way that DX11 would not be able to handle such engine. Right now games are still build with DX11 limitations in mind and ported to DX12/Vulkan. What ever increase or now increase in performance it's all a bonus right now.


Exactly. Being allowed 100k draw calls on screen would allow for huge numbers of NPCs/players and particle effects with minimal performance penalties.


----------



## ChevChelios

Quote:


> Originally Posted by *Marios145*
> 
> Is 4K cpu-limited??????
> 
> 
> It's interesting though, that nvidia yet again has no gains with a low-level API, only this time it's opengl. I was expecting -some- gain, maybe frame times/dips improved?


Maybe the Nvidia driver isn't out yet for DOOM on Vulkan?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Marios145*
> 
> Is 4K cpu-limited??????
> 
> source
> From 31 to 36 FPS, that's about a 16% gain in a GPU-limited scenario; I'm now wondering how Fury scales.
> I believe that DOOM on Vulkan (and OpenGL too) is a perfect example of an actual game that takes full advantage of GPUs and CPUs.
> 
> It's interesting though, that nvidia yet again has no gains with a low-level API, only this time it's opengl. I was expecting -some- gain, maybe frame times/dips improved?
> Exactly, being allowed to have 100k draw calls on screen would allow for huge amounts of npcs/players and particle effects with minimal performance penalties


Well, it's OpenGL, so I have no idea. In DX11 it's probably easy to spot CPU overhead. OpenGL in its entirety probably just runs slower on AMD; that has nothing to do with the CPU but with the API itself. Vulkan is almost nothing like OpenGL, and it's based on Mantle.


----------



## sage101

Quote:


> Originally Posted by *ChevChelios*
> 
> maybe nvidia driver isnt out for Doom-Vulkan ?


Well, the devs did say they were working with Nvidia to implement async compute, so I expect to see a new driver soon.


----------



## Rabit

Quote:


> Originally Posted by *sage101*
> 
> Well the devs did say they were working with nvidia to implement Async Compute so I expect to see a new driver soon.


We heard the same story about async drivers for Maxwell a year ago, but they still haven't released them.


----------



## EightDee8D

Quote:


> Originally Posted by *sage101*
> 
> Well the devs did say they were working with nvidia to implement Async Compute so I expect to see a new driver soon.


The Oxide guys also said the same thing, and we know what happened. You can't utilize any hardware at more than 100%; that's what's happening here.


----------



## PlugSeven

Quote:


> Originally Posted by *ChevChelios*
> 
> maybe nvidia driver isnt out for Doom-Vulkan ?


Didn't Nvidia showcase DOOM running on Vulkan at their 1080 unveil? Now we're waiting for a driver, perhaps the same driver that AoS is waiting on.


----------



## ChevChelios

Quote:


> Originally Posted by *Rabit*
> 
> We heard the same story about async drivers for Maxwell a year ago, but they still haven't released them.


You *can* have gains in performance even without async.

It's shocking, I know.

...although if your OpenGL/DX11 performance was already perfect before, then maybe you can't.


----------



## TFL Replica

Quote:


> Originally Posted by *Rabit*
> 
> We heard the same story about async drivers for Maxwell a year ago, but they still haven't released them.


That was straight from id Software though.


----------



## umeng2002

Quote:


> Originally Posted by *PlugSeven*
> 
> Didn't nvidia showcase Doom running on Vulkan at their 1080 unveil? Now we waiting for a driver, perhaps the same driver that AoS is waiting on


I would say that since it performs the same in Vulkan as in OpenGL, the current Nvidia driver is already quite optimized for DOOM.

And since async compute isn't working on Nvidia, there's no way the performance would increase further.


----------



## ChevChelios

The question is how much of the increase was from the API and how much directly from async?

Because in the latest RotR patch, AMD gained some performance on DX12, but only a few percent of that was from async.

Nvidia didn't gain anything, but its DX11 FPS was already as good as AMD's newly improved DX12 FPS.


----------



## Robenger

Quote:


> Originally Posted by *the9quad*
> 
> Can you explain why DOOM is not a true Vulkan game? I don't believe it is in the same boat as The Talos Principle. Do you have insider info on how Vulkan is implemented in DOOM to where it is not a true Vulkan game? Be interested to hear this info.


Because none of them are built from the ground up as DX12 or Vulkan games. DX11 and OpenGL are both holding back the development of the new APIs, because developers have to keep parts of the old APIs intact.


----------



## Fancykiller65

Quote:


> Originally Posted by *ChevChelios*
> 
> question is how much of the increase was from the API and how much directly from Async ?
> 
> because in the latest RotR patch AMD gained some performance on DX12, but only few % of that was from async
> 
> Nvidia didnt gain anything, but its DX11 fps was already as good as AMDs newly improved DX12 fps


Quote:


> Originally Posted by *ChevChelios*
> 
> you *can* have gains in performance even without async
> 
> its shocking, I know
> 
> .. although if your OpenGL/DX11 performance was already perfect before, then maybe you cant


ChevChelios has a point. AMD's OpenGL and DX11 performance has generally been lower than, or at best equal to, Nvidia's in most games so far. The question now: will Vulkan and DX12 make Nvidia and AMD equal on a performance basis, will AMD still lag behind, or is there potential for AMD to actually pull ahead? Unfortunately, it's too early to tell with so few games released on the new APIs.


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> question is how much of the increase was from the API and how much directly from Async ?
> 
> because in the latest RotR patch AMD gained some performance on DX12, but only few % of that was from async
> 
> Nvidia didnt gain anything, but its DX11 fps was already as good as AMDs newly improved DX12 fps


It depends on which graphics effects use compute instead of the graphics pipeline, so 5-10% mostly. The other performance gains AMD gets from Vulkan/DX12 come from lower overhead and better utilization of the hardware, so async isn't the only thing helping AMD here.

Nvidia has simpler and less hardware in their GPUs, which can already be fully utilized; that's why their GPUs have better perf/watt and better DX11 performance. But once they add more hardware for future APIs, their lead will shrink.

In DOOM, TSSAA or no AA means async is active.

https://twitter.com/idSoftwareTiago/status/752590016988082180
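The "compute running alongside the graphics pipeline" idea boils down to an engine picking a dedicated compute queue. Here's a toy model in Python of that selection logic, with made-up flag constants that only mirror Vulkan's `VK_QUEUE_GRAPHICS_BIT`/`VK_QUEUE_COMPUTE_BIT`; it's a sketch of the concept, not real Vulkan calls:

```python
# Toy model of how an engine might pick a dedicated async compute queue family.
# Real Vulkan reports families via vkGetPhysicalDeviceQueueFamilyProperties;
# these constants just mimic the capability bit flags for illustration.
GRAPHICS_BIT = 0x1
COMPUTE_BIT = 0x2
TRANSFER_BIT = 0x4

def pick_async_compute_family(families):
    """Prefer a compute-capable family that is NOT the graphics family, so
    compute work can overlap rendering; fall back to any compute family."""
    for i, flags in enumerate(families):
        if flags & COMPUTE_BIT and not flags & GRAPHICS_BIT:
            return i
    for i, flags in enumerate(families):
        if flags & COMPUTE_BIT:
            return i
    return None

# Hypothetical GCN-like layout: one graphics+compute family plus a
# compute-only family -> the compute-only one (index 1) gets picked.
families = [GRAPHICS_BIT | COMPUTE_BIT | TRANSFER_BIT, COMPUTE_BIT | TRANSFER_BIT]
print(pick_async_compute_family(families))  # -> 1
```

A GPU/driver that only exposes one graphics+compute family falls through to the fallback, which is roughly the "async effectively off" case being discussed for Nvidia here.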


----------



## pengs

Quote:


> Originally Posted by *Marios145*
> 
> Is 4K cpu-limited??????
> 
> source
> From 31 to 36 FPS, that's about a 16% gain in a GPU-limited scenario; I'm now wondering how Fury scales.
> I believe that DOOM on Vulkan (and OpenGL too) is a perfect example of an actual game that takes full advantage of GPUs and CPUs.


Yeah, that 4K result is interesting and substantial.
Quote:


> It's interesting though, that nvidia yet again has no gains with a low-level API, only this time it's opengl. I was expecting -some- gain, maybe frame times/dips improved?
> Exactly, being allowed to have 100k draw calls on screen would allow for huge amounts of npcs/players and particle effects with minimal performance penalties


You're thinking of DX11. 100k was a round number that got thrown about, but 10-50k is more or less what is achieved.
DX12 is pegged at 600k (6x) by Wardell from the XBO division, which was based on consoles, but 13x seems to be the uniform result among draw-call benchmarks and analyses. It makes sense given the CPU power of PCs, though the test itself is very idealized and takes almost nothing but draw calls into account.

And of course this advantage is very similar to what Vulkan can achieve.
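The arithmetic behind those per-frame numbers is simple: divide how many draw calls the CPU can issue per second by the target frame rate. The throughput figures below are illustrative assumptions, not measurements, chosen only to land in the same ballpark as the 10-50k and ~600k figures above:

```python
# Back-of-the-envelope draw-call budget (illustrative numbers, not benchmarks).
def calls_per_frame(calls_per_second, fps):
    """Per-frame draw-call budget given CPU submission throughput."""
    return calls_per_second // fps

legacy_api = 2_000_000      # assumed DX11/OpenGL-era throughput: ~2M calls/s
low_level_api = 26_000_000  # ~13x that, in line with the benchmark ratio cited

print(calls_per_frame(legacy_api, 60))     # -> 33333 (within the 10-50k range)
print(calls_per_frame(low_level_api, 60))  # -> 433333 (near the ~600k claim)
```

The same division also shows why the gain shrinks when the GPU, not draw-call submission, is the bottleneck: a bigger per-frame budget only helps if the old budget was actually being exhausted.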


----------



## looniam

Vulkan blows chunks on Nvidia. Or should I say Nvidia blows chunks with Vulkan(?)

Switched APIs, which restarted the game, and noticed a few stutters, albeit with no OSD. After going through the blue door in Hell, I grabbed the red orb thingy and it crashed/locked at the menu to select what I wanted to upgrade.

The game had saved going through the door, and there was no crash/lock the second time I got the orb, but I noticed another stutter, jumped before I looked, and died.

However, I seem to be able to kill monsters more efficiently, so there's that.


----------



## ChevChelios

Quote:


> Originally Posted by *looniam*
> 
> vulkan blows chunks on nvidia. or should i saw nvidia blows chunks w/vulkan(?)
> 
> switched API, which restarted the game and noticed a few stutters albeit no OSD. after going through the blue door in hell, grabbed the red orb thingy and crash/locked at the menu to select what i wanted to upgrade.
> 
> game was saved going through the door and no crash/lock second time i got the orb but noticed another stutter before i jumped before i looked and died.
> 
> however i seem to be able to kill monsters more efficiently so there's that.


You should probably wait for the Nvidia DOOM-Vulkan driver first if there are crashes.

Although it's unlikely to give an FPS boost, for the reasons discussed above, so you might as well keep playing on OpenGL.


----------



## EightDee8D

I'd like to see perf/watt with Vulkan now on the 480 vs Pascal. It should be better now, but still behind Pascal, of course.


----------



## looniam

Quote:


> Originally Posted by *ChevChelios*
> 
> > Originally Posted by *looniam*
> > 
> > vulkan blows chunks on nvidia. or should i saw nvidia blows chunks w/vulkan(?)
> 
> you should probably wait for Nvidia Doom-Vulkan driver first if there's crashes
> 
> although its unlikely to give an fps boost for reasons discussed above, so might as well keep playing on OpenGL

Nvidia has been claiming they're "Vulkan ready" for a while now and even has a few demos.

The chopper demo crashes consistently except with the driver from the dev section.

It's a freaking joke. No apologists are necessary.


----------



## EightDee8D

Quote:


> Originally Posted by *looniam*
> 
> nvidia has been claiming they're "vulkan ready" for awhile now and even have a few demos.
> 
> the chopper demo crashes consistently except using the driver from the dev section.
> 
> it's a freaking joke. *no apologists are necessary*.


----------



## HMBR

Cool. I wonder if the 260X performs closer to the Xbox One version with Vulkan; with OpenGL it was far behind. Comparing consoles vs the PC OpenGL version, the game looks a lot better optimized on consoles.


----------



## infranoia

Had a 780 in my shopping cart back in 2013. Decided to take a chance on the 290x instead. It was a good call.

Now I've got a 1080 in my cart, and this comes up. Cart is empty again, but no 290x on the horizon.


----------



## ChevChelios

Quote:


> Originally Posted by *looniam*
> 
> nvidia has been claiming they're "vulkan ready" for awhile now and even have a few demos.


and it will be ready after the official driver is released


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> and it *will be ready* after the official driver is released


After volta.

*maybe*


----------



## FLCLimax

Volta will be out by the time they have Async drivers. Makes no sense to even put out such a thing now.


----------



## EightDee8D

Quote:


> Originally Posted by *FLCLimax*
> 
> Volta will be out by the time they have Async drivers. Makes no sense to even put out such a thing now.


People said the same thing about Pascal.


----------



## ChevChelios

It will be interesting to compare the 1060 on OpenGL vs the 480 on Vulkan after the 1060 comes out.


----------



## Kuivamaa

Well, Vulkan is basically Mantle. AMD has been designing their GPUs around such APIs for 5 years, hence the performance boost. Volta must finally have been designed around DX12 and Vulkan too.


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> will be interesting to compare 1060 OpenGL vs 480 Vulkan after 1060 comes out


All dx12 games too.


----------



## FLCLimax

It'll be interesting to compare 1060 Vulkan to 480 Vulkan. Ditto for DX12.


----------



## infranoia

Quote:


> Originally Posted by *FLCLimax*
> 
> Volta will be out by the time they have Async drivers. Makes no sense to even put out such a thing now.


Quote:


> *April 8th, Windows 364.91, Linux 364.16*
> 
> Updated Vulkan API to 1.0.8
> Improve pipeline creation performance and multi-threaded scaling
> Increase our maximum bound descriptor sets from 4 to 8
> Add support for asynchronous transfer queue
> ...


Not sure what the holdup is. After all, it's been in there for several driver releases now.


----------



## y2kcamaross

Funny how people are saying to just wait for the Nvidia Vulkan driver... yet at the 1080 reveal they were supposedly playing DOOM on Vulkan.


----------



## FLCLimax

Quote:


> Originally Posted by *infranoia*
> 
> Not sure what the holdup is. After all, it's been in there for several driver releases now.


It's just empty words, like the DX12 support on the Fermi spec sheet.


----------



## rcfc89

Does this have to be activated in the Options menu?


----------



## ChevChelios

Quote:


> Originally Posted by *EightDee8D*
> 
> All dx12 games too.


all 3-4 of them


----------



## infranoia

Quote:


> Originally Posted by *FLCLimax*
> 
> It's just empty words like DX12 support on the Fermi spec sheet.


I doubt Nvidia is ready to cede Doom to AMD. That's just like giving up.


----------



## infranoia

Quote:


> Originally Posted by *ChevChelios*
> 
> all 3-4 of them


Irrelevant and inaccurate. Storm's a comin', ChefCheetos.


----------



## looniam

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> nvidia has been claiming they're "vulkan ready" for awhile now and even have a few demos.
> 
> 
> 
> and it will be ready after the official driver is released
Click to expand...

you're not getting it are you?

nvidia has been talking up their support for vulkan since february and passed khronos testing BEFORE AMD. *for them to get caught with their pants down on a AAA game is utterly ridiculous.*

if i wanted to wait i would have bought AMD!


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> all 3-4 of them


https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

I know waiting for free stuff is hard; meanwhile, you can educate yourself about one or two things before spreading FUD.


----------



## FLCLimax

Quote:


> Originally Posted by *ChevChelios*
> 
> 
> 
> Spoiler: Warning: Spoiler!


lol


----------



## ChevChelios

Quote:


> Originally Posted by *infranoia*
> 
> Irrelevant. Storm's a comin', ChefCheetos.


it's been coming ever since GCN appeared

been taking its sweet time









so far it seems like the storm amounts to "require waiting months for a DX12/Vulkan patch and/or driver to catch up to DX11/OpenGL performance that Nvidia had _on launch day_ .. and then async may or may not make a difference"


----------



## rcfc89

Quote:


> Originally Posted by *infranoia*
> 
> Irrelevant. Storm's a comin', ChefCheetos.


Lol, I'm sure Nvidia is shaking in their Jimmy Choo boots while they sit on 80% of the market share. AMD loyalists are always good for a laugh.


----------



## EightDee8D

Quote:


> Originally Posted by *rcfc89*
> 
> Lol I'm sure Nvidia is shaking in their Jimmy Choo boots while they sit on 80% of the market share. Amd loyalist are always good for a laugh.


They don't have 80% anymore; 2 months ago it was 78% and going down.


----------



## rcfc89

Quote:


> Originally Posted by *EightDee8D*
> 
> They don't have 80% anymore, 2 months ago it was 78 and going down.


Yeah, I'm sure with record sales of the GTX 1080 it's probably up to 85% by now.


----------



## EightDee8D

Quote:


> Originally Posted by *rcfc89*
> 
> Yeah I'm sure with record sales of the gtx1080's its probably up to 85% by now.


----------



## infranoia

Quote:


> Originally Posted by *rcfc89*
> 
> Yeah I'm sure with record sales of the gtx1080's its probably up to 85% by now.


56.71% on the Steam Hardware Survey, which is the only metric devs really care about. Jon Peddie measures sales rates, and of course Nvidia has a higher turnover.


----------



## FLCLimax

Quote:


> Originally Posted by *rcfc89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EightDee8D*
> 
> They don't have 80% anymore, 2 months ago it was 78 and going down.
> 
> 
> 
> Yeah I'm sure with record sales of the gtx1080's its probably up to 85% by now.
Click to expand...

LMAO, yea selling through all 800 of the 1080's boosted them up quite a bit!


----------



## ZealotKi11er

Quote:


> Originally Posted by *infranoia*
> 
> Had a 780 in my shopping cart back in 2013. Decided to take a chance on the 290x instead. It was a good call.
> 
> Now I've got a 1080 in my cart, and this comes up. Cart is empty again, but no 290x on the horizon.


Trust your instincts. The GTX 780 came out before the 290X. The GTX 1080 is a pretty bad buy unless you upgrade to the fastest GPU every time one comes out.


----------



## LancerVI

Running a GTX980ti right now.

Definitely waiting for Vega. Definitely going AMD this go'round.

OT: Impressive gains with Vulkan!!! Truly impressed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LancerVI*
> 
> Running a GTX980ti right now.
> 
> Definitely waiting for Vega. Definitely going AMD this go'round.
> 
> OT: Impressive gains with Vulkan!!! Truly impressed.


The GTX 980 Ti is a very powerful DX11 card, but it's quite sad that it's only a year old and you already have to make compromises with new APIs.


----------



## XenoRad

Interesting times. I've actually just switched from an R9 290X to a GTX 1070. Good gains for AMD, but I'll still be getting more FPS with the GTX 1070 in OpenGL than I would have with the 290X + Vulkan. Also, I get to enable Nightmare Shadows & Textures without hacks and not risk crashing.

In any case, AMD does have a good advantage in DX12 and Vulkan for the moment, but otherwise they still can't compete with Nvidia on the high end. If Vega fails to take the crown or come close to the GTX 1080 in most games (not just DX12), then it still won't be good for AMD.

The majority of games are DX11, and many will continue to be for a while yet.


----------



## Greenland

I'm glad I never fell for the GTX hype. My Fury saw around a 50% fps increase during the opening sequence, and around 30% later on, running Ultra settings with TSSAA on:

http://imgur.com/a/Ah02x


----------



## FLCLimax

Quote:


> Good gains for that 3 year old GPU over there, but my brand new GPU is faster. I can continue to play old games and hope that games keep being made on old API's to make myself feel good


K.


----------



## rcfc89

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GTX980 Ti is a very powerful DX11 card but its quite sad that it's only 1 years old and you have to make compromises with new APIs.


Very sad. I'm hating life right now running every game at Ultra, @100fps, in ultrawide resolution with the butter smoothness of G-Sync. Where can I find a good rope?


----------



## nagle3092

Quote:


> Originally Posted by *rcfc89*
> 
> Very sad. I'm hating life right now with running every game in Ultra @100fps in Ultra Wide resolutions with the butter smoothness of Gsync. Where can I find a good rope?


Jen-Hsun will probably sell you one with a geforce claw on it for $100.


----------



## sugarhell

http://www.pcgameshardware.de/Doom-2016-Spiel-56369/News/Vulkan-Patch-bessere-Performance-Benchmarks-1201321/


----------



## magnek

Quote:


> Originally Posted by *y2kcamaross*
> 
> Funny how people are saying just wait for the nvidia vulkan driver...yet at the 1080 reveal they were supposedly playing doom with Vulkan


AMD *ahem* loyalists got constant flak for saying "but just wait for XYZ!". Well it seems nVidia apologists are in fact no better after all.
Quote:


> Originally Posted by *looniam*
> 
> you're not getting it are you?
> 
> nvidia has been talking up their support for vulkan since february and passed khronos testing BEFORE AMD. *for them to get caught with their pants down on a AAA game is utterly ridiculous.*
> 
> if i wanted to wait i would have bought AMD!


I'm still waiting for that magical Maxwell async driver they promised








Quote:


> Originally Posted by *rcfc89*
> 
> Lol I'm sure Nvidia is shaking in their Jimmy Choo boots while they sit on 80% of the market share. Amd loyalist are always good for a laugh.


Same with nVidia apologists.


----------



## ChevChelios

why does someone need to apologise for having this kind of performance 2+ months ago ?


----------



## sugarhell

Quote:


> Originally Posted by *ChevChelios*
> 
> why does someone need to apologise for having this kind of performance 2+ months ago ?


It's like taking your fancy new car onto the awesome new road: you expect at least the speed you had on the pothole-ridden old road, but you get less.


----------



## Greenland

2 months?

http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,16.html

and

http://www.guru3d.com/news-story/new-patch-brings-vulkan-support-to-doom.html

Look at the GTX 970: the 480 just leapfrogs it. How about the 970, any fps gain switching to Vulkan?


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> why does someone need to apologise for having this kind of performance 2+ months ago ?


Because instead of getting more performance, they're still stuck at two-month-old performance?


----------



## magnek

Quote:


> Originally Posted by *ChevChelios*
> 
> why does someone need to apologise for having this kind of performance 2+ months ago ?


Apologist
Quote:


> a person who defends or supports something (such as a religion, cause, or organization) that is being criticized or attacked by other people


----------



## Semel

I bet most of the performance increase on AMD hardware is due to eliminating or reducing the API overhead AMD had with OpenGL/DX11. That's why they get bigger gains than Nvidia: Nvidia GPUs already perform at their maximum and don't have this API overhead "weakness".

Nvidia GPU performance compared to AMD:

DX11 - win
OpenGL - win
Vulkan - either parity or win
DX12 - either parity or win










So, you see, nvidia and those who got nvidia gpus have nothing to worry about.


----------



## looniam

drop the dope yo:


----------



## LancerVI

Quote:


> Originally Posted by *rcfc89*
> 
> Very sad. I'm hating life right now with running every game in Ultra @100fps in Ultra Wide resolutions with the butter smoothness of Gsync. Where can I find a good rope?


The 980ti is indeed a good card. I have no complaints. But I believe the writing's on the wall. With DX12/Vulkan, I think (note I said think, not know) AMD is in for a good year or so with ASYNC compute and such.

Not only that, I have my son running SLI'd 580's. He's going to get my 980ti and I can grab a Vega with a freesync ultrawide. I'm done with nVidia for a while. I don't hold brand loyalty as dearly as some of you.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> GTX980 Ti is a very powerful DX11 card but its quite sad that it's only 1 years old and you have to make compromises with new APIs.


Indeed.


----------



## ChevChelios

Quote:


> Originally Posted by *Greenland*
> 
> 2 months?
> 
> http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,16.html
> 
> and
> 
> http://www.guru3d.com/news-story/new-patch-brings-vulkan-support-to-doom.html
> 
> Look at the GTX 970, the 480 just leapfrogs, how about the 970, any fps again switching to Vulkan?


970 is not its competitor

1060 is

Quote:


> bet most of performance increase on AMD hardware is due to eliminating or reducing API overhead AMD had with OpenGL\dx11. That's why they get better gains compared to nvidia. nvidia GPU already perform at their maximum and don't have this APi overhead "weakness"


DX12 patches fix AMD's DX11 performance.

Same thing in RotTR.


----------



## sugarhell

Quote:


> Originally Posted by *looniam*
> 
> drop the dope yo:


What? With TSSAA (temporal supersampling), instead of taking a performance hit vs SMAA, performance increases...


----------



## Greenland

Quote:


> Originally Posted by *ChevChelios*
> 
> 970 is not its competitor
> 
> 1060 is
> dx12 patches fix AMDs dx11 performance
> 
> same thing in RotR


You just said "why does someone need to apologise for having this kind of performance 2+ months ago ? redface.gif"

GTX 1060 didn't exist 2 months ago, what's your excuse now?
Quote:


> Originally Posted by *sugarhell*
> 
> What? With TSSAA (temporal super sample) instead of having a performance hit vs SMAA the performance increase...


That's because only no-AA and TSSAA (aka temporal SSAA) are implemented with Async Compute atm. The performance gain with Vulkan + AC is insane.


----------



## ChevChelios

Quote:


> You just said "why does someone need to apologise for having this kind of performance 2+ months ago ? redface.gif"


980Ti and Fury X, for example


----------



## LancerVI

Seriously.

Some of you should switch brands occasionally just to keep yourselves honest. In general, I try not to stick with one brand for too long unless its name is Coca-Cola.


----------



## Semel

Quote:


> Originally Posted by *LancerVI*
> 
> AMD is in for a good year or so with ASYNC compute and such..


Neither DX12/Vulkan nor async compute will change the balance of power performance-wise.

So, say, the 980 Ti (and it OCs like crazy) will still beat Furys in most if not all games.

Call me a pessimist (I've got a Fury), but I just don't believe in miracles. Especially if you consider Nvidia's pretty aggressive "teh way it's meant to stutter" policy.









So far, DX12/Vulkan/async have helped AMD close the performance gap between their GPUs and Nvidia's, but certainly not surpass Nvidia's sheer performance.


----------



## rcfc89

Quote:


> Originally Posted by *LancerVI*
> 
> The 980ti is indeed a good card. I have no complaints. But I believe the writing's on the wall. With DX12/Vulkan, I think (note I said think, not know) AMD is in for a good year or so with ASYNC compute and such.
> 
> Not only that, I have my son running SLI'd 580's. He's going to get my 980ti and I can grab a Vega with a freesync ultrawide. I'm done with nVidia for a while. I don't hold brand loyalty as dearly as some of you.
> Indeed.


FreeSync ultrawides are stuck at 75Hz though. No thanks to that, or to ever putting the trainwreck that is AMD in my rig.


----------



## LancerVI

Quote:


> Originally Posted by *Semel*
> 
> Neither dx12\vulkan nor async compute will change the balance of power performance wise.
> 
> So, say, 980ti (and it OCs like crazy) will still beat furys in most if not all games.
> 
> Call me a pessimist (I got a fury) but I just don't believe in miracles. Especially if you consider a pretty aggressive nvidias"*teh way it's meant to stutter*" policy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far dx12\vulkan have helped AMD to close the gap performance wise between their gpus and nvidias, but certainly not surpass nvidias sheer performance power.


Call me a blurry eyed optimist then. I believe DX12/Vulkan is going to take off *within* the next two years.

Also, it's not always about getting the absolute best performance for me. It has to make sense money-wise. I'm not spending $700 on this generation's GTX 680.
Quote:


> Originally Posted by *rcfc89*
> 
> FreeSync Ultra wide's are stuck at 75hz though. No thanks to that or ever putting the trainwreck that is Amd in my rig.


Maybe I'm blind or whatever, but 35 to 75Hz is plenty for me. I'm not playing CS:GO or what-have-you. I'm a flight sim / strategy wargaming guy with some Battlefield / War Thunder thrown in. 75Hz is plenty for entertainment.


----------



## tp4tissue

Here's the problem though: no one plays Doom... hahaha

Hurray for AMD.. wh00t wh00t..


----------



## LancerVI

Quote:


> Originally Posted by *tp4tissue*
> 
> Here's the problem though, no one plays doom..,. hahaha
> 
> Hurray for AMD.. wh00t wh00t..


I play Doom.

Now what?

Good thing I'm concerned with my entertainment and not everyone else's.


----------



## EightDee8D

Quote:


> Originally Posted by *LancerVI*
> 
> I play Doom.
> 
> Now what?


He is making fun of those who say no one plays game "xyz" because amd performs better.


----------



## Myst-san

I play Doom.


----------



## GorillaSceptre

Add one more DX12 benchmark to the list... oh, nvm..


----------



## Noufel

Quote:


> Originally Posted by *ChevChelios*
> 
> maybe nvidia driver isnt out for Doom-Vulkan ?


----------



## Derp

Has anyone seen Fury/Nano/FuryX Vulkan numbers? Some people are claiming +50% gains which is amazing.


----------



## Noufel

So, is anyone going to say that DOOM is another failed async benchmark à la AotS?


----------



## EightDee8D

Quote:


> Originally Posted by *Noufel*
> 
> so anyone to say that DOOM is another fail Async bechmark à la AOTS


This time it's 2 months old performance.









Can't wait for Battlefield 1, 480 will rapeon that 1060.


----------



## provost

Quote:


> Originally Posted by *LancerVI*
> 
> Call me a blurry eyed optimist then. I believe DX12/Vulkan is going to take off *within* the next two years.
> 
> Also, it's not always about getting the absolute best performance for me. It has to make sense $$$ wise. I'm not spending $700 bucks on this generation's GTX 680.
> Maybe I'm blind or whatever, but 35 to 75hz is plenty for me. I'm not playing CS:GO or what-have-you. I'm a flight sim / strategy wargaming guy with some battlefield / war thunder thrown in. 75hz is plenty for entertainment.


Well, sure, DX12 will take off at an accelerated pace. It's ironic that Nvidia seems to have forgotten the key takeaways from the "lessons learned" part of 3dfx history.. lol


----------



## magnek

Quote:


> Originally Posted by *EightDee8D*
> 
> This time it's 2 months old performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait for Battlefield 1, 480 will *rapeon* that 1060.


You owe me royalties kthx.


----------



## EightDee8D

Quote:


> Originally Posted by *magnek*
> 
> You owe me royalties kthx.


Kden


----------



## FLCLimax

Quote:


> Originally Posted by *rcfc89*
> 
> 
> 
> Spoiler: Warning: Spoiler!


----------



## helis4life

1440p ultra playable now on 290x?


----------



## Wovermars1996

RX 480



GTX 1080


----------



## Rabit

Quote:


> Originally Posted by *Derp*
> 
> Has anyone seen Fury/Nano/FuryX Vulkan numbers? Some people are claiming +50% gains which is amazing.


I found one on a Polish forum; he claims average 56% gains. His PC: i5-4460 3.2GHz | ASRock H81M | Crucial Tactical 2x8GB 1600MHz CL8 | Asus R9 Fury Strix 4GB 1050/500MHz

http://imgur.com/a/GQs6N

Source: http://forum.pclab.pl/topic/1118462-AMD-Radeon-RX-4x0-Polaris/page__st__4540


----------



## pengs

Quote:


> Originally Posted by *Semel*
> 
> Neither dx12\vulkan nor async compute will change the balance of power performance wise.


Eh, the thing is, it will. If you're pushing a higher frame rate while using the same amount of power, performance per watt goes up. This probably raises power consumption slightly (not talking vsynced), but it's still going to increase efficiency.

NM, you're talking about the hierarchy. I still don't agree.


----------



## Wovermars1996

Quote:


> Originally Posted by *Rabit*
> 
> I find one on Polish forum he claims average 56% gains * his PC i5-4460 3,2GHz | ASrock H81M | Crucial Tactical 2x8GB 1600MHz CL8 | Asus R9 Fury Strix 4GB 1050/500 MHz
> 
> http://imgur.com/a/GQs6N
> 
> Source:http://forum.pclab.pl/topic/1118462-AMD-Radeon-RX-4x0-Polaris/page__st__4540


I knew that there would be significant gains but that is amazing


----------



## infranoia

I honestly believed that an FPS would *never* benefit from async shaders, that only RTS games like AotS and Total War would benefit, for all the units onscreen.

I stand corrected.


----------



## JackCY

Beware: Async Compute is disabled with some AA modes, so you have to use a compatible AA or no AA. It's on r/AMD and Bethesda's Twitter, or wherever I saw it. So if you're doing comparisons, beware.

*Here.*
Quote:


> DOOM GL vs Vulkan on an awesome AMD 480. Heads up benchmarkers, use TSSAA or no AA (else Async Compute is disabled)


----------



## SlackerITGuy

Now more people will get to enjoy what we've been enjoying when running Battlefield 4 using Mantle since January 2014.

Congrats DOOM players.

Awesome showing by AMD cards.


----------



## Wovermars1996

Decided to test with my system. Ultra settings with motion blur and depth of field disabled with TSSAA.
Edit: If I did my math right, this is an improvement of 52%
OpenGL

Vulkan
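For anyone double-checking that kind of math: percent improvement is just (new − old) / old × 100. A quick sketch, with made-up example numbers (86 and 131 fps are purely illustrative, not the figures from the screenshots above):

```python
def percent_gain(old_fps: float, new_fps: float) -> float:
    """Percent improvement going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# Hypothetical example: 86 fps on OpenGL -> 131 fps on Vulkan
print(round(percent_gain(86, 131), 1))  # 52.3
```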


----------



## Eorzean

Hory shet.

Those are some impressive gains. Come the -snip- on Nvidia. Hopefully they start to get their -snip- together because software gains > upgrading hardware every year.


----------



## Blameless

Quote:


> Originally Posted by *ChevChelios*
> 
> 970 is not its competitor
> 
> 1060 is


Until the GTX 1060 can be bought, the GTX 970 is the most capable card NVIDIA has in roughly the same price segment as the RX 480.
Quote:


> Originally Posted by *infranoia*
> 
> I honestly believed that FPS would *never* benefit from Async Shaders-- that only RTS like AotS and Total War would benefit, for all the units onscreen.
> 
> I stand corrected.


Number of units on screen doesn't imply much of anything about the graphical complexity of a scene. Maybe the AI and physics, but these aren't typically run on the GPU.
Quote:


> Originally Posted by *Derp*
> 
> Has anyone seen Fury/Nano/FuryX Vulkan numbers? Some people are claiming +50% gains which is amazing.


I wouldn't be surprised if this was accurate. Fiji has the most imbalanced shader to front-end ratios of any GPU in recent memory and should benefit a lot from anything that gives better utilization.


----------



## sugarhell

If you convert 20-30% of your shaders to compute shaders and run them with async shaders, GCN is going to win no matter what.

Nvidia's architecture will never match GCN's shader array in performance.


----------



## JackCY

Quote:


> Originally Posted by *Blameless*
> 
> Until the GTX 1060 can be bought, the GTX 970 is the most capable card NVIDIA has in roughly the same price segment as the RX 480.
> Number of units on screen doesn't imply much of anything about the graphical complexity of a scene. Maybe the AI and physics, but these aren't typically run on the GPU.
> I wouldn't be surprised if this was accurate. Fiji has the most imbalanced shader to front-end ratios of any GPU in recent memory and should benefit a lot from anything that gives better utilization.


Actually, when a stupid, lazy method sends 1,000 units to the GPU as 1,000 separate render commands instead of using one command and instancing the same unit 1,000 times, there is a GPU performance issue. This is something they were showing ages ago with Mantle, and it's usable in RTS games but not just there; you could instance grass objects, etc. There are tons of tricks and ways to do things properly that were said not to be possible with the older high-level APIs.
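The overhead argument can be sketched with a toy cost model. The microsecond figures below are invented purely for illustration (real per-call costs vary by driver and API), but the shape of the math is the point: per-call overhead is paid N times in the naive path and once in the instanced path.

```python
CALL_OVERHEAD_US = 20   # hypothetical fixed CPU/driver cost per draw call (us)
GPU_WORK_US = 1         # hypothetical GPU time to render one unit (us)

def naive_draws(units: int) -> int:
    """One draw call per unit: the call overhead is paid `units` times."""
    return units * (CALL_OVERHEAD_US + GPU_WORK_US)

def instanced_draw(units: int) -> int:
    """A single instanced call: overhead paid once, GPU work unchanged."""
    return CALL_OVERHEAD_US + units * GPU_WORK_US

if __name__ == "__main__":
    n = 1000
    print(f"naive:     {naive_draws(n)} us")      # 21000 us
    print(f"instanced: {instanced_draw(n)} us")   # 1020 us
```

With these made-up numbers, batching 1,000 units into one instanced draw is roughly 20x cheaper, which is the kind of win lower-level APIs make easier to reach.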


----------



## infranoia

So Total War implements their DX12 render path. AMD sees big gains, Nvidia none. Nvidia is said to be working on it.

Doom releases Vulkan render path, it benefits only AMD. Nvidia is said to be working on it.

At what point do we start complaining about how awful Nvidia drivers are?


----------



## Tgrove

Just got Doom off GreenManGaming for $28.79. Downloading tonight, will play in the morning on the sig rig. Will be playing @ 4K if anyone wants numbers. Will be one Fury X, of course.

Also, I imagine MSI Afterburner won't work with it; is there any other software I can use to monitor the system with Vulkan?


----------



## PsYcHo29388

Quote:


> Originally Posted by *infranoia*
> 
> At what point do we start complaining about how awful Nvidia drivers are?


When you go back to DX11 and realize their drivers were much better suited for that API than anything AMD ever attempted.

So now it seems the tables are slowly turning, which is kinda nice.


----------



## JackCY

Quote:


> Originally Posted by *infranoia*
> 
> So Total War implements their DX12 render path. AMD sees big gains, Nvidia none. Nvidia is said to be working on it.
> 
> Doom releases Vulkan render path, it benefits only AMD. Nvidia is said to be working on it.
> 
> At what point do we start complaining about how awful Nvidia drivers are?


Never? It's not a 100% software issue; it's mostly, if not entirely, hardware. That's my humble opinion. NV already gets the maximum out of their cards, but AMD's newer architecture (GCN) was held back by high-level APIs. NV wasn't held back, or not as much, besides rare cases. Both will benefit, but not by the same amount with the current hardware archs.


----------



## FLCLimax

Quote:


> Originally Posted by *infranoia*
> 
> So Total War implements their DX12 render path. AMD sees big gains, Nvidia none. Nvidia is said to be working on it.
> 
> Doom releases Vulkan render path, it benefits only AMD. Nvidia is said to be working on it.
> 
> At what point do we start complaining about how awful Nvidia drivers are?


Just wait until Volta...or the one after that. NOBODY PLAY [insert game here] ANYWAY.


----------



## FLCLimax

Quote:


> Originally Posted by *Tgrove*
> 
> Just got doom off greenmangaming for $28.79. Downloading tonight will play in the morning on sig rig. Will be playing @ 4k if anyone wants numbers. Will be 1 fury x of course


I have a Fury running a 4K monitor but i haven't been able to get DOOM to launch in over a week. I'll see if anything is different.


----------



## JackCY

Quote:


> Originally Posted by *FLCLimax*
> 
> Just wait until Volta...or the one after that. NOBODY PLAY [insert game here] ANYWAY.


Wait 6 months... NOBODY PLAYS DOOM ANYWAY


----------



## phenom01

Oh jeesh, here we go again with the AMD camp claiming victory, only to be sorely mistaken in a couple of days when the dust settles. BUT BUT THIS one game shows dominance!!!!!!!!!... So did AotS, and victory was claimed then. We all know how that turned out. Mark my words.









*edit* Meh, 3.5, whatever. Show me a better 1080p 144Hz setup than this. http://www.3dmark.com/fs/9177137


----------



## sugarhell

Quote:


> Originally Posted by *phenom01*
> 
> o jeesh here we go again with the AMD camp claiming victory only to be sorely mistaken in a couple a days when the dust settles. BUT BUT THIS one game shows dominance!!!!!!!!!....So did AOTS and victory was claimed then. We all know how that turned out. Mark my words.


I put on this effort 3.5/4


----------



## FLCLimax

Quote:


> Originally Posted by *JackCY*
> 
> Wait 6 months... NOBODY PLAYS DOOM ANYWAY


Wait 1 year... NOBODY PLAYS DEUS EX ANYWAY...err i mean NOBODY PLAYS BATTLEFIELD ANYWAY...no i mean NOBODY PLAYS STAR CITIZEN ANYWAY...whoops err NOBODY PLAYS GEARS OF WAR 4 ANYWAY...umm NOBODY PLAYS MASS EFFECT ANYWAY!
Quote:


> It'll be released when it's ready, they're working on it


How's that?


----------



## Marios145

Quote:


> Originally Posted by *sugarhell*
> 
> I put on this effort 3.5/4



^That


----------



## EightDee8D

*Nobody plays any game where Amd is faster* than their nvidia counterparts. there i said it.


----------



## Dudewitbow

Quote:


> Originally Posted by *FLCLimax*
> 
> Wait 1 year... NOBODY PLAYS DEUS EX ANYWAY...err i mean NOBODY PLAYS BATTLEFIELD ANYWAY...no i mean NOBODY PLAYS STAR CITIZEN ANYWAY...whoops err NOBODY PLAYS GEARS OF WAR 4 ANYWAY...umm NOBODY PLAYS MASS EFFECT ANYWAY!
> 
> How's that?


potentially within the next year or so, assuming Microsoft convinces devs to make games DX12, given their announcement making these games crossplay titles:
Quote:


> NOBODY PLAYS:
> Halo Wars 2
> Forza Horizon 3
> Scalebound
> Recore
> Sea of Thieves
> State of Decay 2
> Crackdown 3


----------



## Greenland

Quote:


> Originally Posted by *Marios145*
> 
> 
> ^That


This is the perfect representation of each vendor.


----------



## FLCLimax

Quote:


> Originally Posted by *EightDee8D*
> 
> *Nobody plays any game where Amd is faster* than their nvidia counterparts. there i said it.


Everybody has a backlog of 3 - 5 year old DX11 games anyway!


----------



## dagget3450

Quote:


> Originally Posted by *EightDee8D*
> 
> *Nobody plays any game where Amd is faster* than their nvidia counterparts. there i said it.


That is because the AMD GPU blew up the motherboard from drawing too much current so they couldn't play. /s -reddit


----------



## magnek

Quote:


> Originally Posted by *Eorzean*
> 
> Hory shet.
> 
> Those are some impressive gains. Come the -snip- on Nvidia. Hopefully they start to get their -snip- together because software gains > upgrading hardware every year.


Wait for Volta.
Quote:


> Originally Posted by *Dudewitbow*
> 
> potentially within the next year or so assuming Microsoft convinces devs to make game DX12, given their announcement on making these games the crossplay titles:
> Quote:
> 
> 
> 
> NOBODY PLAYS:
> Halo Wars 2 *Halo? That's like sooooo 2003 dude*
> Forza Horizon 3 *You don't need DX12 to play a lame ass racing game
> 
> 
> 
> 
> 
> 
> 
> *
> Scalebound *Never heard of it*
> Recore *See above*
> Sea of Thieves *See above*
> State of Decay 2 *See above*
> Crackdown 3 *See above*
Click to expand...


----------



## sugarhell

When Nvidia wins a DX12/Vulkan game, everyone will claim that it's the first real DX12/Vulkan game.

Maybe we'll have to wait a while for that...


----------



## FLCLimax

When CawaDuty and WoW adopt DX12 nobody will play those either!


----------



## FLCLimax

Quote:


> Originally Posted by *sugarhell*
> 
> When nvidia wins a dx12/vulkan game all will claim that this is the first real dx12/vulkan game.
> 
> Maybe we will have to wait a lot for that...


They won Tomb Raider, but after the last patch it's no longer true DX12 and can't be compared.


----------



## Dudewitbow

Quote:


> Originally Posted by *FLCLimax*
> 
> When CawaDuty and WoW adopt DX12 nobody will play those either!


I honestly wish MMOs would openly pick up DX12. It's the one genre that needs the most CPU resources available to it.


----------



## sugarhell

Quote:


> Originally Posted by *FLCLimax*
> 
> They won Tomb Raider, but after the last patch it's no longer true DX12 and can't be compared.


Even the developers said not to use DX12 in its current form because it's still in early development. But sure, you know the first patches are the real deal.

At least I'm happy that my fellow AMD fanboys and I can now fight back without mentioning only AotS.


----------



## Blameless

Quote:


> Originally Posted by *JackCY*
> 
> Actually when they send using a stupid lazy method 1000 units to be rendered in 1000 separate commands to the GPU instead of using 1 command and copy those 1000 same units around instead there is a GPU performance issue. Which is something they were showing ages ago with Mantle and is usable in some games like RTS but not just there, you could have grass objects etc. There is tons tricks and ways to do stuff proper that were said to not be possible with older higher level APIs.


You can do the same sort of tiled rendering and copying of assets without having fully independent units in the RTS sense, and this isn't dependent on async shaders anyway.
Quote:


> Originally Posted by *JackCY*
> 
> Never? It's not 100% software reason more of a mostly if not only hardware. That's my humble opinion.


I'm certain that AMD having hardware more conducive to, and more dependent on, asynchronous compute performance is more than an opinion.

It's vaguely like SMT on CPUs. Hyperthreading a single-issue core gets you nothing, because the narrow core is easy to utilize completely. Hyperthreading a quad-issue i7 core gains quite a bit. Going further, SMT becomes positively mandatory to get satisfactory performance from something like an Itanium or POWER core.

Maxwell didn't benefit from, or need async compute because it was well utilized without it. The same is probably true of Pascal, to a lesser extent.
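That utilization argument can be put into a toy scheduling model. Everything here is hypothetical (the slot counts, the cycle costs, and the greedy scheduler itself); it only illustrates why a wide-but-underutilized design gains from async compute while an already-saturated one doesn't.

```python
import math

def cycles_needed(graphics_cycles: int, slots_used: int, width: int,
                  compute_ops: int, async_on: bool) -> int:
    """Cycles to finish a graphics pass plus a batch of compute ops.

    graphics_cycles: cycles the graphics work occupies
    slots_used:      execution slots the graphics work fills each cycle
    width:           total execution slots available per cycle
    compute_ops:     single-slot compute operations to schedule
    async_on:        if True, compute fills the slots graphics leaves idle
    """
    if not async_on:
        # Serial: compute runs strictly after graphics finishes.
        return graphics_cycles + math.ceil(compute_ops / width)
    # Async: idle slots during the graphics pass absorb compute work first.
    spare_slots = (width - slots_used) * graphics_cycles
    leftover = max(0, compute_ops - spare_slots)
    return graphics_cycles + math.ceil(leftover / width)

if __name__ == "__main__":
    # Wide but underutilized (GCN-like): big win from async.
    print(cycles_needed(100, 2, 4, 300, False))  # 175
    print(cycles_needed(100, 2, 4, 300, True))   # 125
    # Already saturated (Maxwell-like): async changes nothing.
    print(cycles_needed(100, 4, 4, 300, True))   # 175
```

In this sketch, the GPU whose graphics pass only fills half its slots finishes sooner with async on, while the fully utilized one sees no change, mirroring the SMT analogy above.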
Quote:


> Originally Posted by *FLCLimax*
> 
> Everybody has a backlog of 3 - 5 year old DX11 games anyway!


I've been playing the same _Daggerfall_ save game, on and off, for 20 years. Never beat the main story, but I did get lost in a particularly convoluted random dungeon for most of a decade.


----------



## LoLomgbbq

Quote:


> Originally Posted by *Noufel*


I don't understand...

If Nvidia needs to release a driver update for a patch in order for a game to see benefits from that patch, then isn't it nothing more than a standard driver update that could yield the same performance increase driver updates usually bring over the course of a few months?


----------



## infranoia

Just bought and installed Doom. I'm struggling to understand these numbers.

I have my 290x set to stock, 1000/1250. it's a launch-day Sapphire with an AIO on it. I get 60 to 80 FPS in the first few rooms in OpenGL. Fired up Vulkan, and I'm getting 120 to 140 FPS in the same areas in gameplay (i.e. not staring at walls).

I'm shocked.

Is there a semi-'official' benchmark procedure?

Should mention 1080p (hey, it's a 65" screen, man)
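There's no built-in benchmark mode in DOOM at this point, so the usual hand-rolled procedure is to log frame times with an overlay tool and summarize them yourself, typically as average FPS plus a "1% low". A minimal sketch of the summary step (the input numbers below are made-up example data):

```python
# Summarize logged frame times (milliseconds) into average FPS and "1% low" FPS.
# "1% low" = FPS computed over the slowest 1% of frames, a common stutter metric.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)       # slowest frames first
    slowest_1pct = worst[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest_1pct) / sum(slowest_1pct)
    return avg_fps, low_1pct_fps

# e.g. steady 8 ms frames (~125 FPS) with an occasional 16 ms spike
times = [8.0] * 99 + [16.0]
avg, low = summarize(times)
print(round(avg), round(low))  # 124 62
```

Run the same save-file section twice per API, keep the camera path consistent, and compare the two summaries; that's about as close to a "semi-official" procedure as it gets without a scripted benchmark.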


----------



## EightDee8D

Quote:


> Originally Posted by *infranoia*
> 
> Just bought and installed Doom. I'm struggling to understand these numbers.
> 
> I have my 290x set to stock, 1000/1250. it's a launch-day Sapphire with an AIO on it. I get *60 to 80 FPS* in the first few rooms in OpenGL. Fired up Vulkan, and I'm getting *120 to 140 FPS* in the same areas in gameplay (i.e. not staring at walls).
> 
> I'm shocked.
> 
> Is there a semi-'official' benchmark procedure?
> 
> Should mention 1080p (hey, it's a 65" screen, man)


Holy damn, that booost man.


----------



## Kpjoslee

I am not surprised to see AMD GPUs benefit from Vulkan more than Nvidia GPUs do, as they had a lot more to gain given their so-so OpenGL performance. Nvidia GPUs don't suffer from the CPU overhead problem, so whether a game is DX11/12 or OpenGL/Vulkan, I don't expect Nvidia to get much benefit. I guess the key will be how fast the industry moves to DX12/Vulkan-exclusive games, to even the playing field so AMD can compete better.


----------



## GorillaSceptre

Quote:


> Originally Posted by *infranoia*
> 
> Just bought and installed Doom. I'm struggling to understand these numbers.
> 
> I have my 290x set to stock, 1000/1250. it's a launch-day Sapphire with an AIO on it. I get 60 to 80 FPS in the first few rooms in OpenGL. Fired up Vulkan, and I'm getting 120 to 140 FPS in the same areas in gameplay (i.e. not staring at walls).
> 
> I'm shocked.
> 
> Is there a semi-'official' benchmark procedure?
> 
> Should mention 1080p (hey, it's a 65" screen, man)










wth? That's near a 1070.. Hawaii just won't die... lol.


----------



## Slomo4shO

WTB greater vulkan support.


----------



## Clocknut

Quote:


> Originally Posted by *Kpjoslee*
> 
> I am not surprised to see AMD gpus benefits most from Vulkan drivers more than Nvidia gpus, as they had lot more to gain from their so-so OpenGL performance. Nvidia GPUs doesn't suffer from CPU overhead problem so I expect any games that is DX11/12 or OpenGL/Vulkan, Nvidia won't get much benefits. I guess key would be how fast industry moves to Dx12/Vulkan exclusive games to make even playing field for AMD to compete better.


Still the same IMO: until DX12/Vulkan take >50% share, it's still Nvidia all the way for their DX11/OpenGL drivers.

AMD doesn't seem to understand how much their refusal to fix the DX11 overhead problem cost them.


----------



## infranoia

Quote:


> Originally Posted by *Clocknut*
> 
> Still the same IMO: until DX12/Vulkan take >50% share, it's still Nvidia all the way for their DX11/OpenGL drivers.
> 
> AMD doesn't seem to understand how much their refusal to fix the DX11 overhead problem cost them.


"Fixing" DX11 only goes so far in software. It's a serial API on parallel hardware (for AMD). Now DX12 / Vulkan is a parallel API on that same parallel hardware,* and Nvidia's architectures are still built out for DX11.

* A gross oversimplification: "parallel" here really means "asynchronous", a capability DX11 simply didn't provide.
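The footnote's serial-vs-asynchronous point can be sketched with a toy timing model. Nothing here touches a real GPU; the "passes" are just sleeps standing in for work, and the numbers are arbitrary.

```python
# Toy illustration of serial vs. asynchronous work submission (CPU-side
# simulation only). Under a "DX11-style" model the compute work waits for the
# graphics work; under an "async" model the two overlap.
import time
from concurrent.futures import ThreadPoolExecutor

def graphics_pass():
    time.sleep(0.05)   # stand-in for shadow/geometry work

def compute_pass():
    time.sleep(0.05)   # stand-in for particle/post-processing compute

# Serial submission: total time is the sum of both passes.
t0 = time.perf_counter()
graphics_pass(); compute_pass()
serial = time.perf_counter() - t0

# Asynchronous submission: the passes overlap on otherwise-idle units.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(graphics_pass)
    f2 = pool.submit(compute_pass)
    f1.result(); f2.result()
overlapped = time.perf_counter() - t0

print(serial > overlapped)  # True: overlapping hides one pass behind the other
```

The gain only materializes if the hardware actually has idle units to fill, which is the crux of the AMD-vs-Nvidia argument in this thread.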


----------



## EightDee8D

Quote:


> Originally Posted by *Clocknut*
> 
> Still the same IMO: until DX12/Vulkan take >50% share, it's still Nvidia all the way for their DX11/OpenGL drivers.
> 
> AMD doesn't seem to understand how much their refusal to fix the DX11 overhead problem cost them.


It's a hardware issue which they can't really fix via drivers. Do you really think they're withholding a fix just to push DX12? They made a whole new API, which is a lot harder than fixing a driver issue.

Polaris has somewhat less overhead under DX11, but still nowhere close to Nvidia. Although that issue only matters when you are using a CPU slower than an i5, and only in some games, not all.


----------



## Ghoxt

Quote:


> Originally Posted by *EightDee8D*
> 
> https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support
> 
> I know waiting for free stuff is hard, meanwhile you can educate yourself about 1 or 2 things before spreading fud.


Question, you don't think he meant current fully implemented AAA games? I don't think he's wrong in the quality department.

Released:

ROTR

Hitman

Quantum Break

Ashes - Do we call this a AAA game or a AA Benchmark?

Elder Scrolls Online DX12 patch is not out yet afaik.

Gears of War port necroed from death? No!

Beta:

Forza 6

Warhammer


----------



## Sammael7

Guys, this result in doom was predicted decades ago in dragon ball z.

Cell (nvidia) was engaged in an opengl beam battle with Gohan (amd), and then Vegeta delivered the Vulkan patch.

https://www.youtube.com/watch?v=I0WaiRulyoY

It all makes sense now.


----------



## EightDee8D

Quote:


> Originally Posted by *Ghoxt*
> 
> Question, you don't think he meant current fully implemented AAA games? I don't think he's wrong in the quality department.
> 
> Released:
> ROTR
> Hitman
> Quantum Break
> Ashes - Do we call this a AAA game or a AA Benchmark?
> 
> Elder Scrolls Online DX12 patch is not out yet afaik.
> Gears of War port necroed from death? No!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Beta:
> Forza 6
> Warhammer


The point is there are more than 3-4 games already on dx12.


----------



## 8-Ball

Should have been released with Vulkan to begin with...
I am going to say that the majority of people have already beaten the game.


----------



## infranoia

Quote:


> Originally Posted by *8-Ball*
> 
> Should have been released with Vulkan to begin with...
> I am going to say that the majority of people have already beaten the game.


Beat multiplayer? Who won?


----------



## Kana Chan

Any changes in Image Quality? No sacrifices anywhere?


----------



## EightDee8D

Quote:


> Originally Posted by *infranoia*
> 
> Beat multiplayer? Who won?


Ayysinc


----------



## Malinkadink

So we know Pascal still lacks Async support so I guess this generation of GPUs is going to be Nvidia brute forcing things as well as shoving gameworks down the throats of developers to put into their games to make themselves look good.

AMD will be just benefiting from Vulkan/DX12 with their Async shaders and keeping up with Nvidia or besting them in a more efficient manner. Interesting times ahead.


----------



## infranoia

Quote:


> Originally Posted by *Kana Chan*
> 
> Any changes in Image Quality? No sacrifices anywhere?


Not that I can see; if anything the 120+ FPS makes the particle trails even more spectacular.

But let's be honest, at that speed it's hard to tell if there are tradeoffs, though none are to be found in static scenes. You can look for yourself; there are YouTube videos posted above.


----------



## 8-Ball

Quote:


> Originally Posted by *infranoia*
> 
> Beat multiplayer? Who won?


Like the multiplayer is any good to begin with.


----------



## Chargeit

Quote:


> Originally Posted by *infranoia*
> 
> Beat multiplayer? Who won?


Didn't the MP die at launch? I mean, even now, with Vulkan releasing, there are less than 4,000 people playing the game on Steam. It's right under "Brawlhalla" for current players online... wow.

People need to face it. This is another example of a day late and a dollar short. Sure, AMD finally got some performance out of this game. Problem is, that performance is and always was there for Nvidia. Wow, AMD caught up. Good job. Cookies all around. You're all good consumers and make well-informed purchases that validate you as a person.


----------



## infranoia

Quote:


> Originally Posted by *Chargeit*
> 
> Didn't the MP die at launch? I mean, even now, with Vulkan releasing, there are less than 4,000 people playing the game on Steam. It's right under "Brawlhalla" for current players online... wow.
> 
> People need to face it. This is another example of a day late and a dollar short. Sure, AMD finally got some performance out of this game. Problem is, that performance is and always was there for Nvidia. Wow, AMD caught up. Good job. Cookies all around. You're all good consumers and make well-informed purchases that validate you as a person.





Spoiler: Warning: Spoiler!


----------



## SoloCamo

Quote:


> Originally Posted by *Chargeit*
> 
> Didn't the MP die at launch? I mean, even now, with Vulkan releasing, there are less than 4,000 people playing the game on Steam. It's right under "Brawlhalla" for current players online... wow.
> 
> People need to face it. This is another example of a day late and a dollar short. Sure, AMD finally got some performance out of this game. Problem is, that performance is and always was there for Nvidia. Wow, AMD caught up. Good job. Cookies all around. You're all good consumers and make well-informed purchases that validate you as a person.


I plan to buy this game... still haven't played anything other than the MP beta. Game's not even that old. Oh wait, I forgot: on this site there are only two types of buyers - those who wait for Steam sales, and those who drop 60 bucks on a brand-new game and beat it in 3 days because they literally have nothing better to do.

I can't believe the amount of salt I'm seeing from people when a new API is being pushed and showing very successful results. It's like some of you would prefer to not have progress at all.

This green vs red crap is a joke and some of you honestly need to grow up. Almost every thread turns into the same arguments... what an embarrassment...
Quote:


> Originally Posted by *Semel*
> 
> I bet most of performance increase on AMD hardware is due to eliminating or reducing API overhead AMD had with OpenGL\dx11. That's why they get better gains compared to nvidia. nvidia GPU already perform at their maximum and don't have this APi overhead "weakness"
> 
> Nvidia GPU performance compared to AMD:-
> 
> DX 11 - win
> OpenGL win
> vulkan - either parity or win
> DX12 -either parity or win
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, you see, nvidia and those who got nvidia gpus have nothing to worry about.


And this doesn't even make sense. Many AMD cards already beat their competition by a decent amount relative to their launch windows (7970 > 680, 290X > 780 Ti, etc.), and with DX12 AMD cards are ahead in many titles by good margins. So where did you get the information to conclude that in Vulkan and DX12 Nvidia cards are either exactly equal or doing better?

I wish people would do a smidgen of research before posting. I can sometimes excuse someone slipping up while speaking in person, but here on a forum you literally read what you write as you type it (actually I'm starting to question that for many here, unfortunately).


----------



## Chargeit

That's amusing. The last 20 pages of AMD trolls bashing Nvidia becomes directed at my one comment.

Nah, the AMD trolls need to stop being so damned salty. If you get a win, show some damned poise and win with dignity. Though I use win loosely since I personally couldn't care less about fps and benchmarks since my end is good.


----------



## infranoia

Quote:


> Originally Posted by *Chargeit*
> 
> That's amusing. The last 20 pages of AMD trolls bashing Nvidia becomes directed at my one comment.
> 
> Nah, the AMD trolls need to stop being so damned salty. If you get a win, show some damned poise and win with dignity. Though I use win loosely since I personally couldn't care less about fps and benchmarks since my end is good.


A strong showing by AMD means we all win. There is no "us" and "them", only competing products in a duopoly.


----------



## Bugzzz

Quote:


> Originally Posted by *Chargeit*
> 
> ... You're all good consumers and make well informed purchases that validate you as a person.


Said hypothetical person(s) can be very well informed but still unable to afford a "well informed purchase". And even assuming they could afford it, that hypothetical person might instead spend 1/4 or 1/5 of the money "consuming" a console and the other 3/4 or 4/5 on not only this DOOM game but dozens more titles, with an end result of a much higher return in entertainment per $/€ spent. That seems like something a very well informed consumer would/could do.

Then sometime later, at the end of the fiscal year, some report/study comes out again about how PC gaming is declining and doom is impending... solution? Maybe GPU prices should double again in the mid and high-end segments; that should help reverse things, right? After all, it's just about being "well informed" and having online social "validations".


----------



## SoloCamo

Quote:


> Originally Posted by *Chargeit*
> 
> That's amusing. The last 20 pages of AMD trolls bashing Nvidia becomes directed at my one comment.
> 
> Nah, the AMD trolls need to stop being so damned salty. If you get a win, show some damned poise and win with dignity. Though I use win loosely since I personally couldn't care less about fps and benchmarks since my end is good.


This post proves my point. The "green" vs "red". Who is winning? The consumer. Not team red nor team green as some have made this site to be about.
Quote:


> Originally Posted by *infranoia*
> 
> A strong showing by AMD means we all win. There is no "us" and "them", only competing products in a duopoly.


Exactly my point.


----------



## GorillaSceptre

Benchmarks/performance have nothing to do with whether or not a game is popular/beaten etc. One of the de facto benchmarks that gets thrown around is TW3... hasn't everyone beaten that by now? How about BF3, which has been posted all over the place recently?

It also has nothing to do with a "win" or "loss".. It's just giving us an idea of how games that support the new APIs will perform. According to SteamSpy, Doom has sold nearly 900k copies and has had nearly 400k unique players in the last two weeks.. It's also a great game from what I've been hearing.

Strange how within a few pages of the same thread we can go from "DX11 titles matter more because backlog", to "game doesn't count because beaten already". Sounds like a group of consumers are trying to use whatever angle they can to lessen the importance of how games are going to be made going forward.

If PC ever catches up to consoles and we start seeing 30% of engines being done in compute asynchronously, then this place is really going to get fun.


----------



## Kpjoslee

Quote:


> Originally Posted by *SoloCamo*
> 
> This post proves my point. The "green" vs "red". Who is winning? The consumer. Not team red nor team green as some have made this site to be about.
> Exactly my point.


Bu-But you are fan man with AMD logo on Avatar!


Spoiler: Warning: Spoiler!


Joke post. Don't hurt me lol.


----------



## magnek

Quote:


> Originally Posted by *GorillaSceptre*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmarks/performance have nothing to do with whether or not a game is popular/beaten etc., one of the most de facto benches that gets thrown around is TW3... hasn't anyone beaten that yet? How about BF3 that has been posted all over the place recently?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also has nothing to do with a "win" or "loss".. It's just giving us an idea of how games that support the new API's will perform. According to SteamSpy, Doom has sold nearly 900k copies and has had nearly 400k unique players in the last two weeks.. It's also a great game from what I've been hearing.
> 
> Strange how within a few pages of the same thread we can go from "DX11 titles matter more because backlog", to "game doesn't count because beaten already". Sounds like a group of consumers are trying to use whatever angle they can to lessen the importance of how games are going to be made going forward.
> 
> If PC ever catches up to consoles and we start seeing 30% of engines being done in compute asynchronously, then this place is really going to get fun *salty*.


FTFY

In fact it'll become the world's largest virtual salt mine


----------



## tweezlednutball

Just played some Doom. 7970s going strong; not sure if it's utilizing both of them, but I'm getting around 100 FPS with everything cranked to the max at 1920x1200. Seriously the most bang for the buck in terms of computer hardware ever.


----------



## GorillaSceptre

Quote:


> Originally Posted by *magnek*
> 
> FTFY
> 
> In fact it'll become the world's largest virtual salt mine


I don't doubt it..


----------



## SoloCamo

Quote:


> Originally Posted by *Kpjoslee*
> 
> Bu-But you are fan man with AMD logo on Avatar!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Joke post. Don't hurt me lol.


Ha, good point. Fan man was supposed to be about the ridiculous amount of fans I typically have in my cases and the fact I don't really watercool. The AMD logo was actually intended for expressing disappointment at AMD over the recent years.

Think I'm going to have to change these up


----------



## infranoia

Quote:


> Originally Posted by *tweezlednutball*
> 
> just played some doom. 7970's going strong, not sure if its *utilizing both of them* but im getting around 100 fps everything cranked to the max 1920x1200. seriously the most bang for the buck in terms of computer hardware ever.


Definitely not. No multi-GPU yet in Doom Vulkan.

Which means you're pulling 100 FPS at 1920x1200 on that 2012 GPU.


----------



## Clocknut

Quote:


> Originally Posted by *EightDee8D*
> 
> It's a hardware issue which they can't really fix via drivers. Do you really think they're withholding a fix just to push DX12? They made a whole new API, which is a lot harder than fixing a driver issue.
> 
> Polaris has somewhat less overhead under DX11, but still nowhere close to Nvidia. Although that issue only matters when you are using a CPU slower than an i5, and only in some games, not all.


The end result in most games is all that matters; consumers don't really care about hardware limitations. AMD bet on their architecture too early. Nvidia got around the DX11 limitation and delivered better overall performance in the most popular API of its time.

The DX11 issue happens in most games; not every DX11 game pushes parallel multi-threading like EA's Frostbite engine. The vast majority of games still rely a lot on single-threaded performance. Even an i7 won't fix this problem; we simply don't have enough single-threaded performance for all these games, and AMD's overhead problem just makes the FPS dip further.
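The single-thread bottleneck argument here is basically Amdahl's law. A toy calculation with hypothetical per-frame numbers shows why adding cores stops helping once the serial portion dominates:

```python
# Amdahl's-law style toy model of a frame's CPU work (hypothetical numbers):
# the serial part can't be split across cores, the parallel part can.

def frame_time_ms(serial_ms, parallel_ms, cores):
    # total CPU time per frame: serial portion + parallel portion / core count
    return serial_ms + parallel_ms / cores

serial, parallel = 6.0, 12.0     # made-up per-frame CPU milliseconds
for cores in (1, 2, 4, 8):
    t = frame_time_ms(serial, parallel, cores)
    print(cores, round(1000.0 / t))   # cores -> achievable FPS cap
```

With these numbers the FPS cap climbs from ~56 at 1 core toward, but never past, 1000/6 ≈ 167; any extra per-frame driver overhead raises the serial term and drags the whole curve down, which is the complaint about AMD's DX11 driver.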


----------



## Kpjoslee

Quote:


> Originally Posted by *SoloCamo*
> 
> Ha, good point. Fan man was supposed to be about the ridiculous amount of fans I typically have in my cases and the fact I don't really watercool. The AMD logo was actually intended for expressing disappointment at AMD over the recent years.
> 
> Think I'm going to have to change these up


I got you lol. I was thinking that too but it was funny regardless since you had AMD logo there also. I just wish AMD Vega happens this year, so we can stop seeing Nvidia doing those anti-consumer FE editions.


----------



## Decade

Game already ran strong on my hardware, but damn. I don't think I saw a dip below 80FPS at 1440p with all the goodies turned on and ultra-nightmare enabled.
Granted, I wasn't actively looking at the FPS counter while shotgunning demons in the face running through Lazarus Labs to wreck the cyberdemon.


----------



## Defoler

Well, they disabled async compute even though the 10 series does support it, so no wonder there are no gains on Nvidia cards.
Also, the game has a lot of crash issues with older cards even in OpenGL, so this patch is kinda... meh.
It's a nice starter for Vulkan, but id needs to work on making it run on more than just a couple of cards.


----------



## infranoia

I put the 290x at 1100 GPU / 1500 MEM and FPS goes from 150 to 180. Dips to 140 looking out over the pit in the first level, and peaks at 200 when whipping around canyons. Average appears to be in the 160s.

This card just refuses to let me upgrade.


----------



## Serios

Quote:


> Originally Posted by *rcfc89*
> 
> Yeah I'm sure with record sales of the gtx1080's its probably up to 85% by now.


There were more 480s sold in 1 day than 1080s in almost a month.


----------



## Wishmaker

Oh look! Vulkan, a tech that should have been around in 2012, but instead we got milked with expensive hardware and cheap promises of properly fixing overhead in DX11. For some, Vulkan is nothing special, just proof that AMD didn't even care to fix their products when they needed to!


----------



## ChevChelios

so right now we have 1 big Vulkan game = Doom (I can hardly count Talos Principle, its irrelevant) and 1 fully DX12 game = Ashes ..

1 + 1 = 2 games

slowly but surely


----------



## infranoia

Quote:


> Originally Posted by *Wishmaker*
> 
> Oh look! Vulkan, a tech that should have been around in 2012, but instead we got milked with expensive hardware and cheap promises of properly fixing overhead in DX11. For some, Vulkan is nothing special, just proof that AMD didn't even care to fix their products when they needed to!


Know how I know that you didn't think that through? Hint: DX12, Vulkan, and Mantle > DX11 driver-of-the-week.

AMD had a plan, they called out that plan back in the Mantle days, and this stuff was in motion back in 2013. None of this should be a surprise to anyone, least of all Nvidia.
Quote:


> Originally Posted by *ChevChelios*
> 
> so right now we have 1 big Vulkan game = Doom (I can hardly count Talos Principle, its irrelevant) and 1 full DX12 game = Ashes ..
> 
> 1 + 1 = 2 games
> 
> slowly but surely


Your lies are sounding more desperate. Are the checks drying up?


----------



## ChevChelios

Quote:


> AMD had a plan, they called out that plan back in the Mantle days, and this stuff was in motion back in 2013


so you saiyan it only took them 3 years to catch up in a few games

*slowclap*


----------



## Noufel

Quote:


> Originally Posted by *ChevChelios*
> 
> so right now we have 1 big Vulkan game = Doom (I can hardly count Talos Principle, its irrelevant) and 1 fully DX12 game = Ashes ..
> 
> 1 + 1 = 2 games
> 
> slowly but surely


Nah ashes is a benchmark not a game and doom ...... who plays doom anyway


----------



## infranoia

Quote:


> Originally Posted by *ChevChelios*
> 
> so you saiyan it only took them 3 years to produce these 2 games *APIs* that I just mentioned


FTFY


----------



## ChevChelios

Quote:


> Originally Posted by *Noufel*
> 
> Nah ashes is a benchmark not a game and doom ...... who plays doom anyway


Well, Doom MP *is* pretty dead, which also makes me sad.

But the SP is fantastic, and if you own an AMD GPU and still haven't beaten the SP, this patch has real value for you.


----------



## flippin_waffles

Isn't it true that Vulkan and DX12 are being adopted by developers at a very high rate? If any advantage NV has in DX11 disappears with modern APIs, then what? Is a 1060 going to be able to compete with an RX 480 in modern games like Doom? It won't come close in Doom judging by these results, that's for sure. It's not surprising, though, considering AMD's expertise on the subject from creating Mantle, which went on to become Vulkan. GCN should do quite well in Vulkan titles as well as DX12 titles. This puts Polaris in a powerful position in the market.


----------



## Noufel

Quote:


> Originally Posted by *ChevChelios*
> 
> well Doom MP *is* pretty dead, which also makes me sad
> 
> but the SP is fantastic, and if you own AMD GPU and still havent beaten SP then this patch has real value for you


Some of the mp mods are fantastic and very fun








On topic we need opengl and dx11 to die already


----------



## GorillaSceptre

Quote:


> Originally Posted by *infranoia*
> 
> Know how I know that you didn't think that through? Hint: DX12, Vulkan, and Mantle > DX11 driver-of-the-week.
> 
> AMD had a plan, they called out that plan back in the Mantle days, and this stuff was in motion back in 2013. None of this should be a surprise to anyone, least of all Nvidia.


Exactly..

People are completely missing the point (some of them intentionally, I'd say). DICE and other big studios, plus countless other devs who talk big, wanted an architecture like GCN for years, and still no one used it. It got so bad that they had to resort to making their own damn API (with DICE's help + others).. GCN was unfortunately far ahead of its time.

Whoever says "all this proves is how AMD didn't fix DX11" doesn't know what they're talking about.. There's nothing to fix; it's the same as expecting/telling Nvidia to "fix" Maxwell's DX12 performance. It comes down to the architectures, it's not all about drivers..

Quote:


> Originally Posted by *infranoia*
> 
> Your lies are sounding more desperate. Are the checks drying up?


Don't even waste your time, he's one of the worst trolls on this board.


----------



## ChevChelios

Quote:


> Is a 1060 going to be able to compete with an RX480 in modern games like Doom?


considering its undoubtedly excellent OpenGL performance - yes it will

Quote:


> On topic we need opengl and dx11 to die already


only *after* I upgrade to 1180 Volta


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> considering its undoubtedly *excellent* OpenGL performance - yes it will


----------



## ChevChelios

Quote:


> Originally Posted by *EightDee8D*


share with the class


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> share with the class


Nothing to share, waiting for magical driver and reviews.


----------



## GorillaSceptre

Quote:


> Originally Posted by *infranoia*
> 
> I put the 290x at 1100 GPU / 1500 MEM and FPS goes from 150 to 180. Dips to 140 looking out over the pit in the first level, and peaks at 200 when whipping around canyons. Average appears to be in the 160s.
> 
> This card just refuses to let me upgrade.


What settings are you using? And what do you mean it "dips" into the 140s..

Lol, this game is getting a ridiculously huge boost with Vulkan, what the hell did id do..


----------



## infranoia

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What settings are you using? What do you mean dips into the 140's..
> 
> 
> 
> 
> 
> 
> 
> Lol, this game is getting a ridiculously huge boost with Vulkan, what the hell did ID do..


I'm as surprised as you are; I've never seen a single game patch *OR* API show these gains. Well, maybe Gears of War...

I'm at full Ultra, everything maxed out, Nightmare stats running, 1080p. And I did see a 127 in a scrum just a bit ago; that's the bottom end I've seen so far at 1100/1500. Mostly I'm seeing mid-160s up there, at least when I can look away from the gore.


----------



## GorillaSceptre

Quote:


> Originally Posted by *infranoia*
> 
> I'm as surprised as you are, I've never seen a single game patch *OR* API show these gains. Well, maybe Gears of War...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm at full Ultra, everything maxxed out, Nightmare stats running. 1080p. And I did see a 127 in a scrum just a bit ago, that's the bottom end I've seen so far at 1100/1500. Mostly I'm seeing mid-160's up there, at least when I can look away from the gore.


Just... wow.


----------



## infranoia

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Just... wow.


A-ha! Aniso was only 8x, missed the scroll bar there. Bumped it up to 16x and shaved maybe 5fps off, now averaging high 150's low 160's. TSSAA was always on, so async shaders obviously cranking hard.


----------



## tkenietz

You have to wonder, will other devs see this and think, "we need to get on this"?
What developer wouldn't want their game to look as good as possible, maxed out, on more people's systems?

Also, if this is indicative of what Vulkan can do and not an outlier, and Vulkan catches on, could we see the 290X competing with yet another x80 GPU? Very interesting.


----------



## GorillaSceptre

Quote:


> Originally Posted by *infranoia*
> 
> A-ha! Aniso was only 8x, missed the scroll bar there. Bumped it up to 16x and shaved maybe 5fps off, now averaging high 150's low 160's. TSSAA was always on, so async shaders obviously cranking hard.


I thought I was being far too optimistic about Vulkan/DX12.. Even with those expectations I didn't think gains like these would ever happen.. Well done to id Software, I guess; they knocked it out of the park. I wonder if DICE will deliver the same with BF1.

Still quite unbelievable if I'm honest, lol..


----------



## MikeDuffy

This Vulkan patch and the new Tomb Raider DX12 patch are a nightmare for Nvidia's PR. I mean, the 1060's launch will have trouble convincing people of its superiority in the new APIs - it seems the impression of poor next-gen performance is creeping into people's minds.


----------



## Defoler

Quote:


> Originally Posted by *infranoia*
> 
> FTFY


Nvidia is in the same Khronos group as AMD, and they contributed to Vulkan just as they contributed to OpenGL. AMD didn't develop the API on their own.

Also, since you've apparently forgotten (it must have been 10 decades ago), AMD put all of their eggs into Mantle, not Vulkan. They claimed Mantle would be superior in every way. Except it took them so long, and its contribution was so low, that it got scrapped.

So basically, the ones who made the good parts of the Mantle idea successful were actually Microsoft (DX12) and Khronos (Vulkan/OpenGL).
AMD took a good idea and made it crap, worthless, and "open" but belonging only to them (open source that was never released is not open source; same with TressFX for 3 full years), and then someone else took it and actually made it successful and working.
You can pretty much say that if AMD hadn't been so headstrong with Mantle and kept it to themselves, we could have gotten Vulkan and DX12 earlier.
You can also say that because AMD kept it so secret and so proprietary for so long (until it died), Nvidia didn't get a sniff of it until recently with Vulkan and DX12. So basically AMD hindered Nvidia in the first place, a "GameWorks 2.0" act, while whining about GameWorks constantly.


----------



## Kpjoslee

Quote:


> Originally Posted by *infranoia*
> 
> A-ha! Aniso was only 8x, missed the scroll bar there. Bumped it up to 16x and shaved maybe 5fps off, now averaging high 150's low 160's. TSSAA was always on, so async shaders obviously cranking hard.


Pics? Kinda hard to believe you are getting those numbers.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Kpjoslee*
> 
> Pics? Kinda hard to believe you are getting those numbers.


Async isn't enabled there..

GamersNexus did the same thing, i wonder why id didn't make Async activation more clear cut..


----------



## Defoler

Quote:


> Originally Posted by *MikeDuffy*
> 
> This Vulkan patch and the new Tomb Raider DX12 are a nightmare for Nvidia's PR. I mean, the 1060's launch will be have trouble convincing people of their superiority in the new APIs - seems as though the impression of poor next-gen performance is creeping into people's minds.


For now.
It will even out once id reactivates async compute for the 10 series, which is the reason AMD gets a boost and Nvidia doesn't. Currently they have completely shut off async compute for Nvidia.


----------



## EightDee8D

Quote:


> Originally Posted by *Defoler*
> 
> For now.
> It will even out once ID reactivates async compute for the 10 series, which is the reason why AMD gets a boost and nvidia doesn't. Currently they completely shut down the async compute for nvidia.


Keep dreaming lol


----------



## infranoia

Quote:


> Originally Posted by *Kpjoslee*
> 
> Pics? Kinda hard to believe you are getting those numbers.


I was hoping another 290x'er could validate. I'm working on it, trying to find the right utility:
Quote:


> *While running the Vulkan API, DOOM crashes when taking a screenshot using the Steam overlay on AMD GPUs.*
> This is a known issue with Vulkan API and AMD GPUs.


Quote:


> Originally Posted by *GorillaSceptre*
> 
> Async isn't enabled there..
> 
> GamersNexus did the same thing, i wonder why id didn't make Async activation more clear cut..


Yep, look at that. SMAA is enabled. No async shader render path is active in that video. It has to be TSSAA or nothing.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Defoler*
> 
> For now.
> It will even out once ID reactivates async compute for the 10 series, which is the reason why AMD gets a boost and nvidia doesn't. Currently they completely shut down the async compute for nvidia.


Pascal/Maxwell cannot do compute + graphics in parallel...

There's a reason every game with async support disables it on Nvidia hardware...


----------



## Wovermars1996

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I thought i was being far to optimistic about Vulkan/DX12.. Even with those expectations i didn't think gains like these would ever happen.. Well done to id Software i guess, they knocked it out the park. I wonder if DICE will deliver the same with BF1.
> 
> Still quite unbelievable if I'm honest, lol..


I can't believe I'm averaging 100fps with Vulkan on the Ultra preset with a 380X when I used to only be able to get about 60fps on OpenGL.


----------



## infranoia

I can't find a tool that works to take screenshots of Doom Vulkan in exclusive fullscreen. If anyone has a lead let me know.

MSI Afterburner fails, Steam Overlay crashes, and when I go borderless window I lose ~60 FPS.

Gah, I'll have to install Raptr to test that. Please help me avoid that fate...


----------



## Greenland

You can only take a screenshot with VSYNC ON, and you lose a couple of FPS because of that (7 for me).


----------



## Tgrove

Quote:


> Originally Posted by *Defoler*
> 
> Nvidia are also on the same group as AMD in there, and they also contributed to vulkan the same as they contributed to OpenGL. AMD didn't developed the API on their own.
> 
> Also since you already forgot since it must have been 10 decades ago, AMD put all of their eggs into mantle, not vulkan. They claimed mantle will be superior in any way. Except that it took them so long, and its contribution was so low, that it got scraped.
> 
> So basically, the ones who made the good parts of mantle idea successful, were actually mirosoft (DX12) and Khronos (vulkan/opengl).
> AMD took a good idea, made it crap, worthless, and "open" but only belongs to them (an open source which was never released, is not an open source, same with tressfx for 3 full years), and then someone else took it and actually made it successful and working.
> You can pretty much say that if AMD hadn't been so head strong with mantle and kept it for themselves, we could have gotten vulkan and DX12 earlier.
> Also you can say that because AMD kept it so secretly for so long and so proprietary (until it died), nvidia didn't get a sniff of it, until lately with vulkan and DX12, so basically, AMD hindered nvidia in the first place, making it "gameworks 2.0" act, while whining about gameworks constantly.


How much more wrong could you be? DX12 and Vulkan are Mantle.

https://mobile.twitter.com/renderpipeline/status/581086347450007553

https://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019


----------



## Greenland

So Vulkan and Dx12 are AMD's Gameworks now, HAHAHAHAHAHA.


----------



## magnek

Quote:


> Originally Posted by *Defoler*
> 
> For now.
> It will even out once ID reactivates async compute for the 10 series, which is the reason why AMD gets a boost and nvidia doesn't. Currently they completely shut down the async compute for nvidia.


Async compute is only enabled if you use no AA or TSSAA, so it'd be easy enough to test how much async compute (doesn't) play a role.


----------



## infranoia

I'll have to go old-school and shoot it on camera-- even Action! fails to capture fullscreen Vulkan. Too new I guess. Tomorrow's project, if I'm not vindicated by then.


----------



## Kpjoslee

Quote:


> Originally Posted by *infranoia*
> 
> I was hoping another 290x'er could validate. I'm working on it, trying to find the right utility:
> 
> Yep, look at that. SMAA is enabled. No async shader render path is active in that video. It has to be TSSAA or nothing.


I just noticed a problem as well: you just can't take screenshots with TSSAA enabled. I have seen about 20-40fps improvements in all the videos/screenshots I have seen from AMD GPUs on Vulkan using TSSAA, so your numbers do make sense. Should be around 130-140fps average at 1080p, including all the demanding scenarios.


----------



## HackHeaven

Who gives a damn about DX11? DX12 and Vulkan are coming, it's just a matter of time, so get over it already.
That link he posted lists like 20 games that have DX12, and most of them are newish/games people play a lot.

Why do they even make DX11 games anymore? That's redundant, isn't it?


----------



## Kpjoslee

Quote:


> Originally Posted by *HackHeaven*
> 
> Who gives a dam about dx11 dx12 and vulkan will be coming its just a matter of time so get over it already
> That link he posted lists like 20 games that have dx12 and most of them are newish/games people play alot
> 
> Why do they even make dx11 games anymore thats redundant isnt it?


Because not everyone has Windows 10, lol. Games are going to be DirectX 11/12 for a while. Vulkan does show a lot of promise, so I hope more developers consider using Vulkan, and it doesn't even require Windows 10.


----------



## XenoRad

I'd like to see a new benchmark where both nVidia and AMD are tested and the highest results (whether on OpenGL or Vulkan) will be used to rate the cards.

I think we'll get a better picture of how the cards fare regarding best performance.


----------



## Randomdude

This is reminiscent of Netburst vs. more IPC/cores. AMD GPUs have more raw power but can't utilize it. I am hopeful that DX12 and Vulkan provide the means.


----------



## Kuivamaa

Quote:


> Originally Posted by *SlackerITGuy*
> 
> Now more people will get to enjoy what we've been enjoying when running Battlefield 4 using Mantle since January 2014.
> 
> Congrats DOOM players.
> 
> Awesome showing by AMD cards.


Doom's Vulkan implementation perfectly mirrors BF4 and Mantle. And yes, Radeon users have been enjoying such benefits for 2.5 years already.


----------



## infranoia

Quote:


> Originally Posted by *Kpjoslee*
> 
> I just noticed a problem also, that you just can't take screenshots with TSSAA enabled. I have seen about 20-40fps improvements from all the videos/screenshots i have seen from AMD gpus on Vulkan using TSSAA so your numbers do make sense. Should be around 130-140fps average on 1080p including all the demanding scenario.


Best I could do was run it in Windowed mode. It's about a 10 to 15 FPS difference from exclusive fullscreen, but it's still not too shabby.



Spoiler: Warning: Screenshots!


----------



## Orthello

Um, just an interruption to the storm this Vulkan patch has brewed up on here...

How do you get this thing to install? Steam says it's up to date, and yet it just loads the OpenGL 4.5 version with no advanced menu on load etc.

Is the update regionalised?


----------



## ku4eto

Quote:


> Originally Posted by *Orthello*
> 
> Um just an interuption to the storm this vulkan patch has brewed up on here ..
> 
> How do you get this thing to install ? , steam says its up to date and yet it just loads open gl 4.5 version with no advanced menu on load etc.
> 
> Is the update regionalised ?


I think it had something to do with adding -vulkan to the launch options, just like DotA 2.


----------



## Kpjoslee

Quote:


> Originally Posted by *infranoia*
> 
> Best I could do was run it in Windowed mode. It's about a 10 to 15 FPS difference from exclusive fullscreen, but it's still not too shabby.
> 
> 
> 
> Spoiler: Warning: Screenshots!










Vulkan is showing some great promise.

I think this might be the best comparison video for now. I was able to guess your average based on the 480's numbers with Vulkan.


----------



## Orthello

Quote:


> Originally Posted by *ku4eto*
> 
> I think it had something to do with adding -vulkan to the launch options, just like DotA 2.


Hmm, just tried that and still the same thing; it's like it has not updated at all. I'm in NZ, so maybe the patch is regionalised. Any guys from Australia here able to update their version?


----------



## Orthello

Hmm... got it working. It did update; it must have updated yesterday here.


----------



## GunfighterAK

Strange how some are experiencing so much trouble. I got the option to choose Vulkan or OpenGL in the advanced video section and a nice boost in FPS on my old R9 290.


----------



## Orthello

Well, it seems I'm getting about a 10-15% lift in max frame rate at 4K: a max of about 76 fps at ultra 4K settings in OpenGL for this particular part I'm in, versus a max of about 86 fps in Vulkan. Definitely faster, as the average frame rate in Vulkan is near the maximum frame rate in OpenGL; I'd put it at about 10%+ from what I'm seeing. I'll try some more areas.

Hmm, just re-ran it in OpenGL and hit what seemed like similar numbers to Vulkan; I guess I might have been wanting it to be better. More testing needed. I need a way to record min and max fps for a particular run.

Just loaded the level, stood still and turned 360 degrees (not much action happening, admittedly) and frame rates are virtually identical on my setup: 5820K @ 4.7, Titan X 1545/8050.

Still, 4K is not exactly the CPU-limited res... The RX 480 gets a nice lift at 4K though, so I was hoping.


----------



## Serios

This is one of the worst attempts at distorting facts I've seen.
Your problem is we already know better than you how these events took place.
Quote:


> Originally Posted by *Defoler*
> 
> Nvidia are also on the same group as AMD in there, and they also contributed to vulkan the same as they contributed to OpenGL. AMD didn't developed the API on their own.


And what is Nvidia's contribution? In any case, it's less meaningful than AMD's.
What is Nvidia's contribution to DX12, for example? Nvidia fans were claiming that DX12 was going to save the world from DX11 and Mantle, but these days they insist on bashing it.
Quote:


> Also since you already forgot since it must have been 10 decades ago, AMD put all of their eggs into mantle, not vulkan. They claimed mantle will be superior in any way. Except that it took them so long, and its contribution was so low, that it got scraped.


No point in continuing with Mantle when Vulkan and DX12 are basically the same thing. It makes sense, don't you think?
AMD's strategy has paid off. Mantle was a success for the company; both major APIs are now Mantle-like.
Quote:


> So basically, the ones who made the good parts of mantle idea successful, were actually mirosoft (DX12) and Khronos (vulkan/opengl).
> AMD took a good idea, made it crap, worthless, and "open" but only belongs to them (an open source which was never released, is not an open source, same with tressfx for 3 full years), and then someone else took it and actually made it successful and working.


What's with this nonsense? Both Microsoft and Khronos embraced AMD's idea and created Mantle-like APIs. The rest is history.
Quote:


> You can pretty much say that if AMD hadn't been so head strong with mantle and kept it for themselves, we could have gotten vulkan and DX12 earlier.


Nonsense. We got both Vulkan and DX12 as soon as it was possible.

Quote:


> Also you can say that because AMD kept it so secretly for so long and so proprietary (until it died), nvidia didn't get a sniff of it, until lately with vulkan and DX12, so basically, AMD hindered nvidia in the first place, making it "gameworks 2.0" act, while whining about gameworks constantly.


Funny joke. Didn't Nvidia refuse to have anything to do with Mantle? Didn't they say they weren't interested in Mantle at all and would concentrate on DX12 (which is an even funnier joke)?


----------



## _LDC_

you know guys what's really missing here? DOOM/Vulkan running on Linux....


----------



## f1LL

Quote:


> Originally Posted by *_LDC_*
> 
> you know guys what's really missing here? DOOM/Vulkan running on Linux....


Exactly my thoughts.


----------



## daviejams

Quote:


> Originally Posted by *_LDC_*
> 
> you know guys what's really missing here? DOOM/Vulkan running on Linux....


Better if they concentrate on the operating system 99% of people use, so no, nothing is missing here.


----------



## Blameless

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Lol, this game is getting a ridiculously huge boost with Vulkan, what the hell did ID do.


Possible that Vulkan is just that much more efficient than OGL on modern ATI/AMD parts.

In all the cross-API tests I've seen, OGL performs substantially worse than the DX11 renderer. If those titles had Vulkan, it would likely perform noticeably better than DX11 on GCN hardware, which would certainly leave OGL in the dust. So I'm not sure id has done anything particularly special, or if this is just a reflection of the API's intrinsic advantages on modern AMD hardware.
Quote:


> Originally Posted by *Defoler*
> 
> It will even out once ID reactivates async compute for the 10 series, which is the reason why AMD gets a boost and nvidia doesn't. Currently they completely shut down the async compute for nvidia.


Might help Pascal, but not likely to the same degree as it's helping any of the GCN parts, and it's not likely to help pre-Pascal cards at all.

My fastest NVIDIA parts are still Kepler and I'm not expecting any significant improvements in anything with any future driver on them.
Quote:


> Originally Posted by *HackHeaven*
> 
> Who gives a dam about dx11 dx12 and vulkan will be coming its just a matter of time so get over it already
> That link he posted lists like 20 games that have dx12 and most of them are newish/games people play alot


I'm still more concerned with DX11 performance, because the game I play most, and am likely to be playing most for the next several years, is DX11 and won't be switching APIs any time soon (and when it does, Vulkan is a stronger possibility than DX12).

I don't have any DX12 games and I don't expect to play any of the DX 12 games that have been announced. I'm also not at all impressed with Windows 10, so if I can't scrounge up some affordable Server 2016 licences, DX12 won't be possible for me outside of testing (I'll boot to it to run some benchmarks, but I cannot stand using that OS for long).

Vulkan performance does interest me, as it works in Windows 7/Server 2008 R2. If I have to transition away from these OSes, I'm probably going Linux, and games that have Vulkan support are much more likely to receive Linux versions.
Quote:


> Originally Posted by *HackHeaven*
> 
> Why do they even make dx11 games anymore thats redundant isnt it?


Because not everyone with DX12-capable hardware has Windows 10, and not everyone with Windows 10 has hardware capable of even the lowest DX12 feature levels.

Games that only have DX12 support automatically rule out a substantial portion of their potential market.


----------



## XenoRad

Here's how I see things:

1. nVidia has better DX11 performance;

2. AMD has an advantage in DX12 and Vulkan, but not enough to dethrone the high-end nVidia cards, though some mid-range cards may trade places in specific games;

3. DX12 and Vulkan are better suited to AMD hardware, as AMD had the most input in creating them;

4. For the moment all APIs (OpenGL, Vulkan, DX11 and DX12) offer the same graphics at different performance levels depending on the game and video card brand.

All in all, DX12 and Vulkan performance may improve on nVidia or it may not. At the end of the day I think it's perfectly viable to use DX12 and Vulkan if you have AMD, and DX11 if you have nVidia. Whatever gets you the best performance in any specific game - use that.

If Vulkan works great for AMD, then excellent. If nVidia doesn't need it to produce the same or better results - then great as well.


----------



## ChevChelios

Quote:


> Originally Posted by *Kuivamaa*
> 
> Doom Vulkan implementation perfectly mirrors *BF4 and mantle. And yes, Radeon users have been enjoying such benefits for 2,5 years already*.


agreed









http://forums.videocardz.com/topic/1513-amd-crimson-edition-1672-hotfix/
Quote:


> Know Issues:
> 
> Battlefield™ 4 may experience crashes when using Mantle. As a work around users are suggested to switch to DirectX®11.


----------



## Unkzilla

Coming from someone who played through Doom on a 390, then replayed most of it on a Fury X: the performance was below where it should have been, so it's good to see this get some gains. And hopefully triple buffering now works, as that was an absolute joke.









On my 1080, Vulkan seems to help @ 4K. With SMAA I haven't seen many dips under 60FPS now; it used to dip a lot more often. A bit hard to judge without a specific built-in benchmark, but no complaints here.


----------



## Hanjin

Whoa, I'm getting close to 175fps with my i3 6100 and R9 390 at 1080p on Ultimate and Nightmare settings.

Even getting 80-90fps at 1440p using VSR.


----------



## ku4eto

Quote:


> Originally Posted by *ChevChelios*
> 
> agreed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.videocardz.com/topic/1513-amd-crimson-edition-1672-hotfix/


The game was super buggy the last time I played: random crashes on both DX11 and Mantle for no reason, invisible walls, AI getting stuck, falling through textures and such. It's not an AMD issue, it's the game.


----------



## JackCY

Quote:


> Originally Posted by *FLCLimax*
> 
> When CawaDuty and WoW adopt DX12 nobody will play those either!


LOL. Might actually be true with WoW, that crap is dying thanks to Blizzard themselves killing their own community, killing mods and servers.


----------



## ChevChelios

Quote:


> Originally Posted by *ku4eto*
> 
> The game was super buggy the last time i played. Random crashes on both DX11 and Mantle for no reason. Invisible walls, AI getting stuck, falling through textures and such. Its not an AMD issue, its the game.


never had any serious issues with BF4 DX11 on Nvidia card/drivers

and definitely not crashes


----------



## Kuivamaa

Quote:


> Originally Posted by *ChevChelios*
> 
> agreed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.videocardz.com/topic/1513-amd-crimson-edition-1672-hotfix/


You can troll better than that. Mantle was awesome early on, through 2014 and about half of 2015. BF4, DA:I, Thief and Sniper Elite III became very enjoyable experiences using Mantle. That particular render path has been abandoned in recent drivers since last year, but DX12/Vulkan are already here (I am already enjoying Hitman and Doom), and DX11 Radeon drivers are very good these days on BF4 too (and DA:I).


----------



## JackCY

Quote:


> Originally Posted by *ChevChelios*
> 
> never had any serious issues with BF4 DX11 on Nvidia card/drivers
> 
> and definitely not crashes


Not even stutter? *cough* Paxwell.


----------



## ChevChelios

Quote:


> Originally Posted by *JackCY*
> 
> LOL. Might actually be true with WoW, that crap is dying thanks to Blizzard themselves killing their own community, killing mods and servers.


WoW has been "dying" for the last 8-10 years now according to some

how much longer can it take









will probably have 9+ mil subs at Legion launch, then slowly decline down to 6-7 until the first content patch

not bad for a subscription MMO in 2016


----------



## ChevChelios

Quote:


> Originally Posted by *JackCY*
> 
> Not even stutter? *cough* Paxwell.


butter smooth


----------



## ku4eto

Quote:


> Originally Posted by *ChevChelios*
> 
> WoW has been "dying" for the last 8-10 years now according to some
> 
> how much longer can it take
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will probably have 9+ mil subs at Legion launch, then slowly decline down to 6-7 until the first content patch
> 
> not bad for a subscription MMO in 2016


Lol, not even close. Cata was 9 million at launch and 6-7 million after that. Pandaria and afterwards went even lower, but no numbers are available because Blizzard decided to go silent about it. Legion will probably have 6-7 million at launch and at most 5 million three months later. The game is nothing like it used to be; they completely disregarded the community's opinion. A game that still runs on DX9 and barely utilizes 4 threads in 2016 (prior to WotLK it was only 2-threaded), when there are games that can use 8.


----------



## ChevChelios

Quote:


> Originally Posted by *ku4eto*
> 
> Lol, not close. Cata was 9mils at launch and 6-7 after that. Pandaria and afterwards - it has even lower, but no numbers are available, because Blizzard decided to go silent about this. Legion will have probably 6-7 mils at launch and tops 5 mils 3 months later. The game is nothing it used to be, they completely disregarded the community opinion. A game that still runs on DX9 and barely utilizes 4 threads in 2016 (prior to WoTLK it was only 2 threaded) when there are games that can use 8.


WoD had 10 million at launch but lost a lot after that. Legion looks to have much better lasting content, so likely a bit less than 10 at launch and then a slower decline than WoD had.

Also, WoW has a DX11 option.


----------



## Particle

I tried it last night. I can't say it felt any smoother, but that is likely a CPU limitation as I had had the audacity to play a YT video in the background. There are only so many cycles to go around on a 3.7 GHz quad core.


----------



## Robenger

Quote:


> Originally Posted by *Particle*
> 
> I tried it last night. I can't say it felt any smoother, but that is likely a CPU limitation as I had had the audacity to play a YT video in the background. There are only so many cycles to go around on a 3.7 GHz quad core.


What CPU do you have?


----------



## Particle

Quote:


> Originally Posted by *Robenger*
> 
> What CPU do you have?


Athlon X4 845 (Excavator)


----------



## Rabit

Let's theoretically calculate 1060 performance in Doom:
(1060 GFLOPS / 1070 GFLOPS) × 150 FPS ≈ 100 FPS
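As a rough sanity check of that scaling, here's a sketch of the arithmetic. It assumes reference-spec figures (1280 shaders at ~1708 MHz boost for the 1060, 1920 shaders at ~1683 MHz for the 1070; these are assumptions, not measured in-game clocks) and perfectly linear FPS scaling with FP32 throughput:

```python
# Back-of-the-envelope estimate: assumes FPS scales linearly with
# FP32 throughput. Shader counts and boost clocks are reference-spec
# assumptions, not measured game clocks.
def gflops(shader_count, boost_mhz):
    # FP32 GFLOPS = shaders * clock (GHz) * 2 ops per cycle (FMA)
    return shader_count * boost_mhz / 1000 * 2

gtx1070 = gflops(1920, 1683)   # ~6463 GFLOPS
gtx1060 = gflops(1280, 1708)   # ~4372 GFLOPS

estimated_fps = gtx1060 / gtx1070 * 150   # 150 FPS observed on the 1070
print(round(estimated_fps))               # ~101
```

In practice memory bandwidth and front-end limits mean real results won't scale this cleanly, so treat ~100 FPS as an upper-bound guess rather than a prediction.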


----------



## SoloCamo

Quote:


> Originally Posted by *ChevChelios*
> 
> agreed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.videocardz.com/topic/1513-amd-crimson-edition-1672-hotfix/


Do you literally have nothing better to do? First of all, that known issue is that it "may" cause crashes; I've still been able to use it for the most part without issue. Secondly, and more importantly, this issue only came up in the last two or three BETA driver releases, which came out within a very short period of each other, as AMD has been pumping out drivers/hotfixes like no tomorrow recently.

It has not been an issue at all since its inception otherwise. And again, it's not like it doesn't still work even on the known-issue beta drivers. The only issue I've really run into is that it sometimes crashes on launch, but usually just relaunching works immediately, and it doesn't affect actual gameplay.


----------



## 7850K

Saw this German article posted on Reddit: http://www.pcgameshardware.de/Doom-2016-Spiel-56369/News/Vulkan-Patch-bessere-Performance-Benchmarks-1201321/
They tested the FX-8350 in DOOM with Vulkan and were able to keep 60fps even with the CPU at 1.8 GHz in power-saver mode.


----------



## looniam

Though I still don't find it acceptable to be locking/freezing with Nvidia (it happened again while doing "nothing"), _it does appear my FPS went up due to the CPU rendering time_:

OpenGL max: 12.4(something) ms
Vulkan max: 7.68 ms

I don't remember the avg FPS ATM, but overall it went from just over ~105 fps (OGL) to pretty steady over ~120ish (Vulkan).

So maybe that's why I can kill monsters more efficiently.
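For anyone converting those render times, the frame rate a given per-frame time allows is just its reciprocal. A minimal sketch (note these are worst-case maximums, which is why the averages quoted above are higher):

```python
# Convert a per-frame render time in milliseconds to the frame rate
# that time alone would permit, assuming it is the sole bottleneck.
def fps_cap(frametime_ms):
    return 1000.0 / frametime_ms

print(round(fps_cap(12.4), 1))   # OpenGL worst frame: ~80.6 FPS
print(round(fps_cap(7.68), 1))   # Vulkan worst frame: ~130.2 FPS
```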


----------



## iRUSH

I got a big FPS boost but it seems like the graphics took a slight hit. I always felt that Mantle did the same thing. Everything looks a little washed out. Anyone else notice this?


----------



## infranoia

Quote:


> Originally Posted by *iRUSH*
> 
> I got a big FPS boost but it seems like the graphics took a slight hit. I always felt that Mantle did the same thing. Everything looks a little washed out. Anyone else notice this?


I don't see any difference in IQ on my 290x.

Hardly proof due to video compression, but there's nothing obvious here:


----------



## Shadowarez

What preset are you using? I've cranked it to Nightmare/Ultra and it's still beautiful, including the fluid fire. I'm using a Titan X though, with a G-Sync monitor.


----------



## pengs

Overclocking responds and scales really well using Vulkan: 1000/1250 gives 120fps vs 1100/1500 giving 140fps, and the minimum changes from 65fps to 75fps - and that's a really basic OC. That's what happens when the game is no longer CPU dependent.
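Putting those reported numbers side by side (a quick sketch using only the figures in this post): a 10% core and 20% memory overclock yielding roughly a 17% average-FPS gain does suggest the GPU, not the CPU, is now the limiter:

```python
# Percentage gains implied by the overclock figures reported above.
def pct_gain(before, after):
    return (after - before) / before * 100

print(round(pct_gain(1000, 1100), 1))   # core clock:   +10.0 %
print(round(pct_gain(1250, 1500), 1))   # memory clock: +20.0 %
print(round(pct_gain(120, 140), 1))     # average FPS:  +16.7 %
print(round(pct_gain(65, 75), 1))       # minimum FPS:  +15.4 %
```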
Quote:


> Originally Posted by *looniam*
> 
> though still not finding it acceptable to be locking/freezing w/nvidia, happened again doing "nothing", but _it does appear my FPS went up due to the CPU rendering time_:
> 
> openGL max: 12.4(something)ms
> vulkan max: 7.68ms
> 
> don't remember the avg FPS ATM but overall from just over ~105 fps(OGL) to pretty steady over ~120ish(vulkan).
> 
> so maybe that's why i can kill monsters more efficiently.


Some people on mid-range/low-end NVIDIA cards who are running 30-45fps are also commenting that Vulkan is more playable even at a lower frame rate vs. OGL.

Probably something to do with the frame timing. I don't doubt that Vulkan was able to reduce mouse lag at lower frame rates too.


----------



## _LDC_

Quote:


> Originally Posted by *daviejams*
> 
> Better if they concentrate on the operating system 99% of people use , so no nothing is missing here


"If I had asked people what they wanted, they would have said faster horses." (Henry Ford)


----------



## Eorzean

Quote:


> Originally Posted by *_LDC_*
> 
> "If I had asked people what they wanted, they would have said faster horses." (Henry Ford)


Amazing quote. Lol!


----------



## looniam

Quote:


> Originally Posted by *pengs*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Overclocking responds and scales really well using Vulkan. 1000/1250 120fps vs 1100//1500 140fps, minimum changes from 65fps to 75fps and that's a really basic OC. That's what happens when the game is no longer CPU dependent.
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> though still not finding it acceptable to be locking/freezing w/nvidia, happened again doing "nothing", but _it does appear my FPS went up due to the CPU rendering time_:
> 
> openGL max: 12.4(something)ms
> vulkan max: 7.68ms
> 
> don't remember the avg FPS ATM but overall from just over ~105 fps(OGL) to pretty steady over ~120ish(vulkan).
> 
> so maybe that's why i can kill monsters more efficiently.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some people on mid range/low end NVIDIA cards who are running 30-45fps are also commenting that Vulkan is more playable even at a lower frame rate vs. OGL.
> 
> Probably something to do with the frame timing. I don't doubt that the Vulkan was able to reduce mouse lag also at lower frame rates.

that's good to hear for the budget guys.









i froze again since my last post, and though i didn't want my day off this week to be spent screwing with drivers, i may just give the vulkan dev drivers a spin.

EDIT:
*nevermind, those dev drivers just redirect to what i already have... what a freaking JOKE!*

EDIT II:
found what i was looking for: the 356.45 - vulkan only...


Spoiler: Warning: Spoiler!


----------



## magnek

Who the hell is V. Konly? That sounds suspiciously like malware to me.


----------



## daviejams

Quote:


> Originally Posted by *_LDC_*
> 
> "If I had asked people what they wanted, they would have said faster horses." (Henry Ford)


irrelevant

Why waste time and money making a game run on an operating system 20 people use when you can spend that time and money improving it for the people who actually buy your games


----------



## looniam

Quote:


> Originally Posted by *magnek*
> 
> Who the hell is V. Konly? That sounds suspiciously like malware to me.


it was NV's first vulkan driver. however, nevermind:


Spoiler: Warning: Spoiler!



Code:


Callstack Function(desc)                       Line Bytes File       Process       Address
---------------------------                    ---- ----- ----       -------       -------
vkWaitForFences()                              ...  +     0xea9ddcf0 vulkan-1.dll  
GetGameSystemInterface()                       ...  +     0x5a5af2da DOOMx64vk.exe 
GetGameSystemInterface()                       ...  +     0x5d784ff6 DOOMx64vk.exe 
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x4d093280 DOOMx64vk.exe 
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x00000286 ?             
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x00000202 ?             
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x095cf670 ?             
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x095d04a0 ?             
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x59b92878 DOOMx64vk.exe 
GetGameSystemInterface()                       ...  +     0x5b818b06 DOOMx64vk.exe 
GetGameSystemInterface()                       ...  +     0x5b6ca790 DOOMx64vk.exe 
GetGameSystemInterface()                       ...  +     0x5b6ab3a8 DOOMx64vk.exe 
RtlRestoreContext()                            ...  +     0x7718cd51 ntdll.dll     
** UNKNOWN **(** FUNC_PARAM_ERROR **)          ...  +     0x59b8dead DOOMx64vk.exe 
GetGameSystemInterface()                       ...  +     0x5b2eb8d6 DOOMx64vk.exe 
BaseThreadInitThunk()                          ...  +     0x770359ed kernel32.dll  
RtlUserThreadStart()                           ...  +     0x7716b371 ntdll.dll

Register Info                
---------------------------  
EDI:    0x00000000095D04A0  ESI: 0x00000000095D1D70  EAX:   0x0000000141E2FAD8
EBX:    0x0000000144B5A7A0  ECX: 0x0000000000000000  EDX:   0x0000000000000002
EIP:    0x000007FEEA9DDCF0  EBP: 0x00000000095DBC70  SegCs: 0x0000000000000033
EFlags: 0x0000000000010206  ESP: 0x00000000095CF628  SegSs: 0x000000000000002B

Exception Info               
---------------------------  
ExpCode:          0xC0000005 (Access Violation)
ExpFlags:         0         
ExpAddress:       0x000007FEEA9DDCF0

Build & Runtime Info        
--------------------------- 
User:             LoonIam        
Version:          20160706-141600-denim-ginger        
File Path:        G:\Program Files (x86)\Steam\steamapps\common\DOOM\DOOMx64vk.exe        
System Time:      7/12/2016 12:44:10        
Build String:     20160706-141600-denim-ginger        
VT File Path:             
VMTR Override:    generated/pagefiles        
Launch Command:   "G:\Program Files (x86)\Steam\steamapps\common\DOOM\DOOMx64vk.exe" G:\Program Files (x86)\Steam\steamapps\common\DOOM\DOOMx64.exe +r_renderAPI -2

Memory Info                 
--------------------------- 
In Use:           17%      
MB Physical RAM:  16323        
MB Physical Free: 13533        
MB Paging File:   18369        
MB Paging Free:   14838        
MB User Address:  8388608        
MB User Free:     8387714

CPU Info                    
--------------------------- 
Num Packages:     1        
Num Cores:        4        
Num Logical:      8        
CPU ID:           Generic        
CPU MHz:          3400


----------



## Evil Penguin

Quote:


> Originally Posted by *daviejams*
> 
> irrelevant
> 
> Why waste time and money making a game run on an operating system 20 people use when you can spend that time and money improving it for the people who actually buy your games


It's about giving people options and potentially cashing in on a relatively new market.

The main reason I still boot into Windows is for gaming purposes.

I'd like to rid myself of MS's proprietary and invasive operating system as much as possible.

Bringing more quality games over to Linux helps me do that.


----------



## raghu78

https://www.computerbase.de/2016-07/doom-vulkan-benchmarks-amd-nvidia/#diagramm-doom-mit-vulkan-2560-1440

Massive perf increases for AMD GPUs with Vulkan on Doom. 40-50% across the board for all AMD GPUs. btw Async compute works with TSSAA or no AA only. So this is the best case result. Fury X is thumping GTX 1070 and 980 Ti at 1440p by 25-30%. Surprisingly Fury X perf gain at 4K is very low at just above 10%.

Rx 480 is the real star of the show. It provides 90% of the GTX 1070 performance at 1440p and 1080p. AMD 's gamble with Mantle has paid off with DX12 / Vulkan finally exploiting GCN to its fullest.


----------



## Blameless

Quote:


> Originally Posted by *raghu78*
> 
> Fury X is thumping GTX 1070 and 980 Ti at 1440p by 25-30%. Surprisingly Fury X perf gain at 4K is very low at just above 10%.


Seems like it's either running out of VRAM or becoming fill-rate limited at 4K.


----------



## Derp

Holy crap... That computerbase.de result. The Fury X is finally revealing its true form.


----------



## Ha-Nocri

Is Fury X faster than 1080? Well, that's surprising. Makes one wonder what Vega will do


----------



## bossie2000

What kind of sorcery is this?


----------



## ku4eto

Quote:


> Originally Posted by *Derp*
> 
> Holy crap... That computerbase.de result. The FuryX is finally revealing it's true form.


Huehuehe, this is fun. 60% increase. Makes you wonder where the issue was: driver overhead or unused resources.


----------



## Randomdude

Mother of God. That's what I'm talking about, MOVE THE INDUSTRY FORWARD.

Someone edit the title to include that benchmarks are added.


----------



## xxela

Fury goes FULL ****** on this one.


----------



## zealord

damn that Fury X


----------



## arearverdairchi

My 980 Ti performs noticeably worse with Vulkan for some reason. I get about 10-20 FPS higher with OpenGL.


----------



## Ha-Nocri

Quote:


> Originally Posted by *arearverdairchi*
> 
> My 980 ti performs noticeably worse with vulkan for some reason. I get about 10-20 fps higher with opengl.


what resolution? From what I've seen Paxwell gains performance @ lower resolutions in situation where it is CPU bottlenecked.


----------



## arearverdairchi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> what resolution? From what I've seen Paxwell gains performance @ lower resolutions in situation where it is CPU bottlenecked.


1440p. I wasn't expecting huge gains or anything but I also wasn't expecting performance to be worse.


----------



## ZealotKi11er

RX 480 vs GTX 1060 is already a Won Battle @ DX12/Vulkan.


----------



## SoloCamo

Quote:


> Originally Posted by *Derp*
> 
> Holy crap... That computerbase.de result. The FuryX is finally revealing it's true form.


**Grins at my 290x yet again as the best video card purchase I've ever made**


----------



## OneB1t

i can just agree with you

thats all i have to say about this


----------



## f1LL

Quote:


> Originally Posted by *SoloCamo*
> 
> **Grins at my 290x yet again as the best video card purchase I've ever made**


Quote:


> Originally Posted by *OneB1t*
> 
> i can just agree with you
> 
> 
> 
> 
> 
> 
> 
> thats all i have to say about this


And every day I regret more and more getting a 970 instead of a 390, only because of power consumption.


----------



## GunfighterAK

Devs need to jump on this Vulkan business


----------



## Blameless

Quote:


> Originally Posted by *ku4eto*
> 
> Makes you wonder where the issue was - driver overhead or not used resources.


Probably both.


----------



## dir_d

Quote:


> Originally Posted by *Blameless*
> 
> Probably both.


Fury X just wanted to be fed... Feed me Seymour


----------



## Evil Penguin

Damn that's sexy. 

OpenGL was never really AMD's strong suit.


----------



## huzzug

Quote:


> Originally Posted by *ChevChelios*
> 
> Quote:
> 
> 
> 
> AMD had a plan, they called out that plan back in the Mantle days, and this stuff was in motion back in 2013
> 
> 
> 
> so you saiyan it only took them 3 years to catch up in a few games
> 
> *slowclap*

I gotta ask, back when you decided to join, did you consider "ILeakStuff" as a username before deciding on ChevChelios. You guys seem to be cut from the same cloth, so don't mind me asking.


----------






## ChevChelios

Quote:


> Originally Posted by *huzzug*
> 
> I gotta ask, back when you decided to join, did you consider "ILeakStuff" as a username before deciding on ChevChelios. You guys seem to be cut from the same cloth, so don't mind me asking.


nah

but my point there stands - 3 years for this 1 patch to come out

when is the next Vulkan game?


----------



## pengs

Quote:


> Originally Posted by *SoloCamo*
> 
> **Grins at my 290x yet again as the best video card purchase I've ever made**


Based on that benchmark the 290X is most likely 5-8% off a 980 Ti and 1070.

Another interesting thing: Hawaii (Hawaii/Grenada) started out competing with the GTX 780. There were plenty of titles where the 970 was faster than the 390X/290X, yet here those cards are 35-40% faster than the 970 and easily beat the 980 as well.
The 290 and 290X originally competed with the 780, and if this type of thing keeps up... 780 < 780 Ti/970 < 980 < 980 Ti/1070

Hawaii five


----------



## Ha-Nocri

You should leave the guy alone. At first I didn't like his posts one bit, he was even the 1st (and only) person I blocked on this forum b/c I think he often doesn't see reason, but AMD threads would be boring w/o him. Who would we discuss and argue stuff with?!


----------



## Greenland

Quote:


> Originally Posted by *ChevChelios*
> 
> nah
> 
> but my point there stands - 3 years for this 1 patch to come out
> 
> when is the next Vulkan game ?


3 years and still getting love, unlike a certain someone... *cough* Kepler, Maxwell.


----------



## iRUSH

Hmmm, this makes the Fury X on newegg for $399 an interesting solution.

Should I do it? Who decides my fate?


----------



## spyshagg

It's just unprecedented. My 290X is almost 3 years old.

The shill catchphrase of the moment ("it took 3 years for AMD cards to perform") doesn't really apply. The 290X always performed, and keeps performing. Look at those graphs. It's insane.


----------



## gamervivek

Quote:


> Originally Posted by *Evil Penguin*
> 
> Damn that's sexy.
> 
> OpenGL was never really AMD's strong suit.


Actually it was.

http://forums.anandtech.com/showthread.php?p=37391200#post37391200


----------



## Pintek

Didn't read through all the posts, but I'm really blown away by how drastic an FPS boost I got on my system after Vulkan. I'm seeing peaks of 100 FPS and lows of 70 FPS on a system that before was only seeing around a stable 30-60 throughout the game. Settings are on ultra at 1920x1200, but jesus this just melts my brain o.o

AMD Phenom II 1045T OC'd to 3.2GHz
Asus M4A88T-I Deluxe mini-ITX
16GB Corsair Vengeance 1600MHz
R9 Fury Nano Asus White


----------



## OneB1t

Quote:


> Originally Posted by *spyshagg*
> 
> Its just unprecedented. My 290x is almost 3 years old.
> 
> The shill catch-frase of the moment (it took 3 years for AMd cards to perform) doesn't really apply. The 290x always performed. And keeps performing. Look at those graphs. Its insane.


yep 290X is all the way up there with 1070/980ti


----------



## Assirra

Quote:


> Originally Posted by *Greenland*
> 
> 3 years and still getting love, unlike certain someone "cough" Kepler, Maxwell.


My 980 still performs great, so I have no idea what you are talking about.


----------



## iRUSH

Quote:


> Originally Posted by *Assirra*
> 
> My 980 still preforms great, so i have no idea what you are talking about.


I'm sure it does! But, the 290x started off life behind the 970 and now hangs with a 980.


----------



## Greenland

Quote:


> Originally Posted by *Assirra*
> 
> My 980 still preforms great, so i have no idea what you are talking about.


I didn't say your 980 didn't perform. I'm saying it's left behind now. Like a first wife when the husband marries a second one. Younger and hotter.

Can't say the same about the 290X, performs even better over time.

Will you stay faithful to your wife? or will you abandon her once you find a younger and hotter one?


----------



## OneB1t

started life as a 780 opponent







then overtook the 780 and also the 780 Ti, after that the 970, and then the 980; now looking for the 980 Ti


----------



## bossie2000

Quote:


> Can't say the same about the 290X, performs even better over time.


And it's hotter also!


----------



## OneB1t

A reference 290X design with a non-reference cooler is about the same power draw (±30W) as a reference 980, so not that big a deal.


But a lot of reviews use a non-reference 290X with increased TDP, or a reference 290X that heated itself up to 95C, so power usage is much higher than with a non-reference cooler.


----------



## Ha-Nocri

Vulkan + Async 480 (reference) vs 970 vs 1070

480 ~20% in front of the 970:


----------



## looniam

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Assirra*
> 
> My 980 still preforms great, so i have no idea what you are talking about.
> 
> 
> 
> I'm sure it does! But, *the 290x started off life behind the 970* and now hangs with a 980.

depends what resolution you're looking at. *1080p the 970 had a small lead but at 1440 and above the 290x had a slightly greater advantage*. and there wasn't *that* much of a performance difference between the 970/980.
Quote:


> Originally Posted by *OneB1t*
> 
> started life as 780 oponent
> 
> 
> 
> 
> 
> 
> 
> then raped 780 and also 780ti after that 970 and then 980 now looking for 980ti


NOPE

the 290x was released to *"ridicule the titan,"* which it did.

and held its own against the 780 Ti at 4K already.

fwiw I am NOT saying anything against the 290x; it's a great card with extraordinary longevity. however, let's forgo the revisionist histories.


----------



## gamervivek

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Vulkan + Async 480 (reference) vs 970 vs 1070
> 
> 480 ~20% in-front of 970:


He uses a 970 Strix in his reviews, so the difference is even greater. computerbase have the 480 just shy of the 980 Ti/1070.


----------



## ILoveHighDPI

Wow, I'm only getting an extra 10 FPS with a 980 Ti (maxed settings go from 45 FPS to 55 FPS in my test spot on the Crucible level), but hearing about all the awesome boosts on AMD is great news too; in the long term I'm hoping to keep an AMD system alongside my Intel/Nvidia system.


----------



## Chaython

70% higher framerate on AMD


----------



## ZealotKi11er

Just one game. DX11 is a lot more important.


----------



## OneB1t

in fact it isn't







in half a year's time all AAA graphics-intensive titles will be either DX12 or Vulkan, remember my words


----------



## Greenland

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just one game. DX11 is a lot more important.


Just one car, horse is a lot more important.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Greenland*
> 
> Just one car, horse is a lot more important.


Are you saying GTX980 Ti is a Horse? You know fast horses cost more than cars right?


----------



## helis4life

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Are you saying GTX980 Ti is a Horse? You know fast horses cost more than cars right?


Horse comparison is quite relevant seeing as youre flogging a dead horse


----------



## rcfc89

Wait, so I have to run FXAA or no AA to get the benefits from Vulkan? Basically the ones that give the poorest visual fidelity. I'll stick to TSSAA (8TX); it makes the game look fantastic. I'm easily holding over 80 FPS at ultrawide resolutions with one 980 Ti anyway.


----------



## helis4life

Quote:


> Originally Posted by *rcfc89*
> 
> Wait so I have to run the Fxaa or No AA to get the benefits from Vulcan? Basically the one that gives the poorest visual fidelity. I'll stick to TSSAA (8 TX) it makes the game look fantastic. I'm easily holding over 80fps in Ultra-Wide resolutions with one 980Ti anyways.


No, you reap the benefits of Vulkan as soon as you select it. However, if you also want to reap the benefits of async compute on AMD hardware, you currently need TSSAA or no AA selected.

Edit: Just saw you're on a 980 Ti..... Sorry


----------



## rcfc89

Quote:


> Originally Posted by *helis4life*
> 
> No, you reap the benefits of vulkan as soon as you select it. However if you want to also reap the benefits of asyc compute on AMD hardware you need TSSAA or no AA selected currently
> 
> Edit: Just saw youre on a 980ti..... Sorry


Thanks. I need to load the game back up from Steam and give it a go.


----------



## pengs

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just one game. DX11 is a lot more important.


Oh come on Zealot. It's more than one... car per Greenland.
Quote:


> Originally Posted by *rcfc89*
> 
> Wait so I have to run the Fxaa or No AA to get the benefits from Vulcan? Basically the one that gives the poorest visual fidelity. I'll stick to TSSAA (8 TX) it makes the game look fantastic. I'm easily holding over 80fps in Ultra-Wide resolutions with one 980Ti anyways.


No. TSSAA enables asynchronous compute. You benefit from Vulkan by running Vulkan; async adds performance on top of it. It doesn't matter for you though, as your cards will not see an improvement with async.


----------






## looniam

so yeah, noticed I was using 368.39 instead of the "official" vulkan driver, 368.69. nice, so now I freeze even faster.

GG nvidia.









if that gets fixed - i can't complain with a ~30% increase in minimum frame rate.


----------



## tajoh111

This update should give Nvidia plenty of concern. Strategy games are a genre Nvidia can ignore. Not FPS. That is their bread and butter.

Not only is this game a big title, it is an FPS, and async compute improvements of this size are atypical for the genre.


----------



## DesertRat

Well, I can play the game maxed out @ 1440p on my 4.5GHz i7-3770K and R9 290X now. Prior to this, I was having performance issues even on all low. Especially with FreeSync, it's so buttery smooth.


----------



## czin125

What if the 7970 had this level of performance in 2012? What does the 680 get in Doom 2016?


----------



## flippin_waffles

Wasn't the 580 the competitor to a 7970?


----------



## Gungnir

Quote:


> Originally Posted by *flippin_waffles*
> 
> Wasn't the 580 the competitor to a 7970?


For about 2 months until the 680 released. The 580's competitor was the 6970.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gungnir*
> 
> For about 2 months until the 680 released. The 580's competitor was the 6970.


It had no competition. The HD 6970's competitor was the GTX 570.


----------



## Gungnir

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It had no competition. HD 6970 competitor was GTX 570.


I know that, but the 580 and 6970 were the single-GPU flagships of that generation, and that's what I was referring to.


----------



## mtcn77

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It had no competition. HD 6970 competitor was GTX 570.


HD6870x2 was placed in competition with it.


----------



## Tgrove

Seeing a 20-30 frame increase on one Fury X with my sig rig with Vulkan @ 4K. Framerate average seems to be over 70 FPS. Anyone know a frame limiter that works with Vulkan?


----------



## iRUSH

Ok so Vulkan is clearly amazing for newer AMD hardware.

Can we expect this kind of jump with Nvidia in an upcoming driver?

Be honest please. I and many others would love to see this gap stay but if it's unlikely, I'd like to know.

Science this sucker out for me OCN ?


----------



## mtcn77

Quote:


> Originally Posted by *iRUSH*
> 
> Ok so Vulkan is clearly amazing for newer AMD hardware.
> 
> Can we expect this kind of jump with Nvidia in an upcoming driver?
> 
> Be honest please. I and many others would love to see this gap stay but if it's unlikely, I'd like to know.
> 
> Science this sucker out for me OCN ?


You can bet the chairman will not hold back. The only regulation they have is for fair-play, the rest is up to the vendors for suggestion.


----------



## Slomo4shO

Anyone that owns a Fury X, how flexible is the AIO tubing?


----------



## raghu78

id software has done probably the best performance optimizations for AMD GPUs possible in a modern AAA game. Doom running on Vulkan benefits from 3 major performance optimizations.
1. Async compute
2. Shader intrinsics (exposed through GPUOpen)
3. Frame flip optimizations

https://community.bethesda.net/thread/54585?start=0&tstart=0
http://radeon.com/doom-vulkan/

AMD needs to do more such work with developers to exploit their GPUs potential to the fullest by promoting DX12 / Vulkan and GPUOpen. The next few months are going to see huge AAA titles aligned with AMD Gaming Evolved such as Deus Ex Mankind Divided, Battlefield 1, Watch Dogs 2. There are other DX12 titles too releasing in 2016 such as Gears of War 4, Forza Horizon 3 which should perform well on AMD GPUs .
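None of us have id's source, but the async compute win from that list is easy to picture with a toy model: some graphics passes leave shader ALUs idle, and a second queue fills that idle time with compute work. A minimal sketch, with all numbers invented for illustration (this is not id's actual renderer):

```python
# Toy model of why async compute shortens a frame (illustrative numbers,
# not id's actual renderer). Some graphics passes (e.g. shadow maps) are
# rasterizer-bound and leave shader ALUs idle; an async compute queue can
# fill that idle ALU time instead of running compute work afterwards.

def frame_time_serial(graphics_ms: float, compute_ms: float) -> float:
    """One queue: compute runs after graphics, so the times simply add."""
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms: float, compute_ms: float,
                     idle_alu_ms: float) -> float:
    """Async queue hides compute inside the graphics passes' idle ALU time."""
    hidden = min(compute_ms, idle_alu_ms)
    return graphics_ms + (compute_ms - hidden)

print(frame_time_serial(12.0, 4.0))      # 16.0 ms
print(frame_time_async(12.0, 4.0, 5.0))  # 12.0 ms: all compute hidden
print(frame_time_async(12.0, 4.0, 2.0))  # 14.0 ms: only partly hidden
```

The hidden-time term is also why the gains vary per card: the more idle ALU time an architecture leaves during graphics passes, the more an async queue has to fill.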


----------



## infranoia

Quote:


> Originally Posted by *raghu78*
> 
> id software has done probably the best performance optimizations for AMD GPUs possible in a modern AAA game. Doom running on Vulkan benefits from 3 major performance optimizations.
> 1. Async compute
> 2. Shader intrinsics (exposed through GPUOpen)
> 3. Frame flip optimizations
> 
> https://community.bethesda.net/thread/54585?start=0&tstart=0
> http://radeon.com/doom-vulkan/
> 
> AMD needs to do more such work with developers to exploit their GPUs potential to the fullest by promoting DX12 / Vulkan and GPUOpen. The next few months are going to see huge AAA titles aligned with AMD Gaming Evolved such as Deus Ex Mankind Divided, Battlefield 1, Watch Dogs 2. There are other DX12 titles too releasing in 2016 such as Gears of War 4, Forza Horizon 3 which should perform well on AMD GPUs .


AMD's been a bit of a roller coaster ride for you, eh raghu?


----------



## raghu78

Quote:


> Originally Posted by *infranoia*
> 
> AMD's been a bit of a roller coaster ride for you, eh raghu?


I am disappointed with the RX 480's perf/watt in general. I also feel the GF 14LPP process is not in good shape and there is scope for improvement. There seem to be plans for a second revision of the Polaris chips. Those should launch next year when GF works out the process issues. Still, AMD has an uphill task with Vega and Zen in trying to get close to their competition in perf/watt.


----------



## deepor

Quote:


> Originally Posted by *iRUSH*
> 
> Ok so Vulkan is clearly amazing for newer AMD hardware.
> 
> Can we expect this kind of jump with Nvidia in an upcoming driver?
> 
> Be honest please. I and many others would love to see this gap stay but if it's unlikely, I'd like to know.
> 
> Science this sucker out for me OCN ?


I think what's happening is that AMD's OpenGL driver was pretty bad. Using Vulkan, you can now see what AMD's hardware can actually do.

NVIDIA's OpenGL driver is apparently very good. It likely already got pretty close to what's theoretically possible with the hardware. You won't ever see a big improvement from Vulkan on NVIDIA if that's true.

The DirectX 11 vs. 12 situation seems to be somewhat similar, though with AMD's DX11 driver being a lot better than their OpenGL driver, so there's less of an improvement there.

Another hint about all of this can perhaps be seen in the Linux situation where there's an open source effort of some random nerds writing an OpenGL driver. The open source OpenGL driver there can compete pretty well with AMD's official OpenGL driver for things like fps in Valve games, but a similar open source OpenGL driver for NVIDIA produces a lot lower fps compared to NVIDIA's official driver. This could perhaps be proof that NVIDIA's drivers are a lot closer to what the hardware can do when everything's perfect, meaning you can't expect great improvements from Vulkan.
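That driver-overhead point can be sketched with a toy cost model (all numbers invented, not real driver measurements): a GL-style driver pays a validation cost on every draw call of every frame, so a slow driver multiplies its waste by the draw count, while a Vulkan-style app pays validation once at command-buffer record time and only a cheap submission cost per frame:

```python
# Toy illustration (invented numbers) of driver overhead under an
# immediate-mode API vs. an explicit one. Under GL, the driver validates
# state per draw call, per frame; under Vulkan, command buffers are
# recorded and validated once, then cheaply resubmitted each frame.

DRAWS_PER_FRAME = 5000

def cpu_time_gl(validate_cost_us: float) -> float:
    """Immediate mode: driver validates every draw, every frame (us/frame)."""
    return DRAWS_PER_FRAME * validate_cost_us

def cpu_time_vulkan(validate_cost_us: float, submit_cost_us: float = 0.05) -> float:
    """Explicit API: validation amortized at record time; per-frame cost
    is just resubmitting prerecorded command buffers (us/frame)."""
    return DRAWS_PER_FRAME * submit_cost_us

# A "slow" driver (2 us/draw) vs a "fast" one (0.5 us/draw):
print(cpu_time_gl(2.0), cpu_time_gl(0.5))          # 10000.0 2500.0
print(cpu_time_vulkan(2.0), cpu_time_vulkan(0.5))  # 250.0 250.0
```

Note how the Vulkan-style numbers come out identical for the slow and fast driver: that is the whole point of the explanation above, the explicit API leaves the driver almost no per-frame work to be slow at.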


----------



## infranoia

Quote:


> Originally Posted by *raghu78*
> 
> I am disappointed with Rx 480's perf/watt in general. I also feel the GF 14LPP process is not in good shape and there is scope for improvement. There seem to be plans for a second revision of the Polaris chips. Those should launch next year when GF works out the process issues. Still AMD has an uphill task with Vega and Zen in trying to get close to their competition in perf/watt.


As far as I can tell, that perf/watt has been figured for DX11. It would be interesting to re-measure it with a modern API.


----------



## zipper17

Is there any benchmark/video of Fury X vs GTX 1080, apples to apples, on Doom Vulkan?


----------



## mtcn77

Quote:


> Originally Posted by *zipper17*
> 
> IS there any benchmark/video Fury X vs GTX 1080 apples to apples on Doom Vulkan?


There's this:




Spoiler: Fury X, 2K 8x TSSAA* (video)

Spoiler: GTX 1080, 2K 1x SMAA (video)

Spoiler: Puny RX 480, 1K 8x TSSAA (video)
*not crossfire, thus not Pro Duo; add 15 FPS recording cutback.


----------



## junkman

Quote:


> Originally Posted by *Slomo4shO*
> 
> Anyone that owns a Fury X, how flexible is the AIO tubing?


I'd say it's pretty bendable. It's got braided mesh around it, which makes it more robust.

I can say this with experience because I fit mine into an Ncase M1 with a regular ATX PSU


----------



## Defoler

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Pascal/Maxwell cannot do compute+graphics in parallel..
> 
> There's a reason every game with Async support disable it on Nvidia hardware..


Both can. Maxwell via driver, pascal via hardware.
AOTS for example runs async on nvidia, and the pascal gains have been very very good.

So please, don't make stuff up.

The fact is that id intentionally disabled async on Nvidia: when they developed it, they did so without Pascal in mind. They also stated that they are working with Nvidia to enable it. So laughing and making stuff up now is irrelevant, really.


----------



## Pyrotagonist

Quote:


> Originally Posted by *Defoler*
> 
> Both can. Maxwell via driver, pascal via hardware.
> *AOTS for example runs async on nvidia, and the pascal gains have been very very good.*.


Last I heard, async is still disabled on 1080/1070 in Ashes. Do you have a link proving otherwise?


----------



## EightDee8D

Quote:


> Originally Posted by *Defoler*
> 
> Both can. Maxwell via driver, pascal via hardware.
> AOTS for example runs async on nvidia, and the pascal gains have been very very good.
> 
> So please, don't make stuff up.
> 
> The fact is that ID intentionally disabled async on nvidia since when they developed it, they did so without pascal in mind. They also stated that they are working with nvidia to enable it. So laughing and making stuff up now, is irrelevant really.


You are the one making stuff up here. Maxwell and Pascal both can't run graphics+compute at the same time, but Pascal has preemption so it doesn't take the performance hit Maxwell does. So please educate yourself and stop spreading FUD, or show a game running with async off/on on Nvidia and gaining performance from it.


----------



## Defoler

Quote:


> Originally Posted by *iRUSH*
> 
> Ok so Vulkan is clearly amazing for newer AMD hardware.
> 
> Can we expect this kind of jump with Nvidia in an upcoming driver?
> 
> Be honest please. I and many others would love to see this gap stay but if it's unlikely, I'd like to know.
> 
> Science this sucker out for me OCN ?


If they are going to enable async compute on Pascal, then we should see an increase in performance for Nvidia's new cards with a new driver update and a game update.


----------



## infranoia

Quote:


> Originally Posted by *Defoler*
> 
> Both can [run compute+graphics in parallel]. Maxwell via driver, pascal via hardware.


I understand your intent but this sounds absurd. If you don't have a parallel engine to run both compute+graphics _in parallel in hardware_, then it's called _software preemption_. And that's not parallel. It's cleverly scheduled by the driver to minimize latency, which provides the benefits of async compute, but it's not parallel and doesn't run compute+graphics at the exact same time.

Pedantic, since async compute then competes with AMD's implementation thanks to Nvidia's faster overall architecture, but it's not compute+graphics in parallel.

But maybe not so pedantic, if Nvidia doesn't come through with a Doom or Total War update soon.
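The parallel-vs-preemption distinction can be put in scheduling terms with a toy makespan model (invented numbers; a sketch, not how either vendor's hardware actually schedules work):

```python
# Toy makespan model of parallel engines vs. software preemption
# (invented numbers; not either vendor's actual scheduler).

def makespan_parallel(graphics_ms: float, compute_ms: float) -> float:
    """Separate hardware queues: total time is the longer of the two."""
    return max(graphics_ms, compute_ms)

def makespan_preempted(graphics_ms: float, compute_ms: float,
                       switch_cost_ms: float = 0.5, slices: int = 4) -> float:
    """One engine, time-sliced: the work serializes, plus context-switch
    overhead. Latency-critical jobs can jump the queue, but nothing overlaps."""
    return graphics_ms + compute_ms + switch_cost_ms * slices

print(makespan_parallel(10.0, 3.0))   # 10.0 ms: compute fully overlapped
print(makespan_preempted(10.0, 3.0))  # 15.0 ms: serialized plus switch cost
```

Preemption can still reduce the latency of an individual job (it runs sooner), which is why it helps things like time-warp, but as the model shows it cannot shrink the total frame the way true concurrent execution can.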


----------



## magnek

Quote:


> Originally Posted by *Defoler*
> 
> Both can. Maxwell via driver, pascal via hardware.
> *AOTS for example runs async on nvidia, and the pascal gains have been very very good.*
> 
> So please, don't make stuff up.
> 
> The fact is that ID intentionally disabled async on nvidia since when they developed it, they did so without pascal in mind. They also stated that they are working with nvidia to enable it. So laughing and making stuff up now, is irrelevant really.


You call a *2.8% gain* "very very good"?



And if I really wanted to cherry pick:



Spoiler: This is just embarrassing


----------



## Defoler

Quote:


> Originally Posted by *Pyrotagonist*
> 
> Last I heard, async is still disabled on 1080/1070 in Ashes. Do you have a link proving otherwise?


You can enable it if you wish.


Quote:


> Originally Posted by *EightDee8D*
> 
> You are the one making stuff up here, maxwell and pascal both can't run gfx+compute at same time. but pascal has preemption so it doesn't take performance hit as maxwell do. so please educate your self and stop spreading fud. or show a game running off/on async on nvidia and gaining performance from it.


It can still run async compute, regardless of whether it uses preemption or not.
So I'm not making stuff up. I'm just watching you try to fanboy-troll everything, which I've started to find funny and adorable


----------



## EightDee8D

Quote:


> Originally Posted by *Defoler*
> 
> 
> It still can run async compute. Regardless of its preemption or not.
> So, not making stuff up. Just looking at your trying to fan-boy-trolling at everything. Which I started to find it funny and adorable


It's not from async, since that path is still disabled for Pascal. lol, and it's quite hypocritical to say that given how you make stuff up yourself and call others out for actually correcting you. I know I'm adorable


----------



## magnek

Man, who gives a damn if async is running or not. You gain a whopping 2.8%, or 1.8 FPS, in the best-case scenario.









So if async was actually running, that's just embarrassing lol.


----------



## infranoia

Quote:


> Originally Posted by *magnek*
> 
> Man who gives a damn if async is running or not. You gain a whooping 2.8% or 1.8 FPS in the best case scenario.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So if async was actually running, that's just even worse lol.


Sounds like preemption. Lipstick on a pig.

Is that the only async compute patch that Nvidia has in the wild?


----------



## EightDee8D

Quote:


> Originally Posted by *infranoia*
> 
> Sounds like preemption. Lipstick on a pig.
> 
> Is that the only async compute patch that Nvidia has in the wild?


Async path is still disabled for Pascal in AotS.


----------



## infranoia

Quote:


> Originally Posted by *EightDee8D*
> 
> Async path is still disabled for Pascal in AotS.


So wait-- there are no active Async Compute render paths for Nvidia in any DX12 or Vulkan title, anywhere?


----------



## EightDee8D

Quote:


> Originally Posted by *infranoia*
> 
> So wait-- there are no active Async Compute render paths for Nvidia in any DX12 or Vulkan title, anywhere?


yep, because they don't support it. isn't that obvious by now lol


----------



## magnek

I find it extremely funny and ironic it's now nVidia's turn to play the "b-b-but just wait for async" game.









and yes I'm veeeeery slightly salty I now need a 30% OC on my 980 Ti to match a stock Fury X


----------



## infranoia

OK, slightly OT but on-topic to the current discussion, now I'm really starting to wonder if Total War's Async Compute render path is actually *enabled* on Nvidia.

This video? It would make sense. Preemption in the software compute+graphics scheduler could cause that frame timing stutter.






I'd love to be a fly on the wall at Nvidia labs... Maybe their Async Compute preemption strategy is causing them some pain-- who knows?


----------



## Slomo4shO

Quote:


> Originally Posted by *junkman*
> 
> I'd say it's pretty bendable. It's got braided mesh around it, which makes it more robust.
> 
> I can say this with experience because I fit mine into an Ncase M1 with a regular ATX PSU


I am thinking of putting it in a Silverstone SG05BB-LITE for my new build. This case is 1.8 L smaller in volume and doesn't officially have any watercooling support. I imagine it would be a tighter fit.


----------



## Pyrotagonist

Quote:


> Originally Posted by *magnek*
> 
> I find it extremely funny and ironic it's now nVidia's turn to play the "b-b-but just wait for async" game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and yes I'm veeeeery slightly salty I now need a 30% OC on my 980 Ti to match a stock Fury X


The sad part is that the Fury X is now only really equaling the full power of a 980 Ti. I guess with its own overclocking it would end up substantially ahead. But a 30% overclock is something a 980 Ti is actually capable of.


Spoiler: Warning: Spoiler!







34.4% to be exact (relative to stock).


----------



## EightDee8D

Quote:


> Originally Posted by *Pyrotagonist*
> 
> The sad part is that the Fury X is now only really equaling the full power of a 980 Ti. I guess with its own overclocking it would end up substantially ahead. But a 30% overclock is something a 980 Ti is actually capable of.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 34.4% to be exact (relative to stock).


one game =/= average oc performance for every other game.


----------



## Pyrotagonist

Quote:


> Originally Posted by *EightDee8D*
> 
> one game =/= average oc performance for every other game.


No, but it is capable of that much.

Here's a whole slew of benches comparing an OC'ed (1497MHz) 980 Ti vs other cards: http://www.overclockersclub.com/reviews/msi_gtx_980_ti_lightning/4.htm

(All 1440p)

Metro: 41.8%
Far Cry 4: 26.8%
Crysis 3: 24.5%
Bioshock Infinite: 21.2%
Battlefield 3 (as we've seen): 34.4% (1517Mhz)
Battlefield 4: 20.5%
Ass Creed Unity: 23.3%

So you're right, not up to 30% in most games, average is 27.5%.
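A quick arithmetic check of that 27.5% figure, using the per-game gains listed above:

```python
# Average OC gain across the games listed above (1440p, OC'ed 980 Ti).
gains = {
    "Metro": 41.8, "Far Cry 4": 26.8, "Crysis 3": 24.5,
    "Bioshock Infinite": 21.2, "Battlefield 3": 34.4,
    "Battlefield 4": 20.5, "Assassin's Creed Unity": 23.3,
}
avg = sum(gains.values()) / len(gains)
print(round(avg, 1))  # 27.5
```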

Still, can't deny that the 980 Ti is competitive with the Fury X, even for future prospects, and it's helped along by the extra VRAM.


----------



## EightDee8D

Yep, but the Fury X can also OC at least 10%, so it's still something. But I agree that 4GB of VRAM will be a limiting factor if they stop RAM optimizations.


----------



## infranoia

Quote:


> Originally Posted by *Pyrotagonist*
> 
> Still, can't deny that the 980 Ti is competitive with the Fury X, even for future prospects, and it's helped along by the extra VRAM.


I feel like I'm in bizarro world reading that sentence. As if there would ever be any question of the 980Ti's dominance...

If DX12 gets here in time, Vega will be very interesting.


----------



## Pyrotagonist

Yeah, it is absolutely incredible that a stock Fury X may eventually end up beating an overclocked aftermarket 980 Ti. I hope this trend continues; it would be a huge, and I think well-deserved break for AMD.


----------



## Kana Chan

Is Vulkan lower level than DX12?


----------



## OneB1t

It's about the same, but Vulkan should have some improvements for the GCN architecture, as it was Mantle before.


----------



## SoloCamo

Quote:


> Originally Posted by *Pyrotagonist*
> 
> Yeah, it is absolutely incredible that a stock Fury X may eventually end up beating an overclocked aftermarket 980 Ti. I hope this trend continues; it would be a huge, and I think well-deserved break for AMD.


Seems like it may just follow the same trend the 7970 and 290x have...

Though the Fury X is limited by 4GB of VRAM, which will hurt its long-term value.


----------



## Blameless

Quote:


> Originally Posted by *SoloCamo*
> 
> Seems like it may just follow the same trend the 7970 and 290x have...
> 
> Though the Fury X is limited by 4GB of VRAM, which will hurt its long-term value.


I've been hitting my head on the 4GiB limit of my 290Xes ever since _Elite: Dangerous_ went 64-bit. Performance is solid with my CFX setup, but I have zero headroom to increase resolution or texture quality without introducing major stuttering/hitching during certain scenes when assets have to be evicted from VRAM.

This is far from the only game that can do this, and as far as I am concerned, 4GiB is quite dead at the high-end.


----------



## LancerVI

Quote:


> Originally Posted by *magnek*
> 
> I find it extremely funny and ironic it's now nVidia's turn to play the "b-b-but just wait for async" game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and yes I'm veeeeery slightly salty I now need a 30% OC on my 980 Ti to match a stock Fury X


I feel that. In exactly the same boat.

I'm actually thinking about putting my two R9 290s w/waterblocks back in just to see.

Anyone want a MSI GTX 980Ti 6G Gaming?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> I've been hitting my head on the 4GiB limit of my 290Xes ever since _Elite: Dangerous_ went 64-bit. Performance is solid with my CFX setup, but I have zero headroom to increase resolution or texture quality without introducing major stuttering/hitching during certain scenes when assets have to be evicted from VRAM.
> 
> This is far from the only game that can do this, and as far as I am concerned, 4GiB is quite dead at the high-end.


Even 6GB is dead at the high end right now. 8GB is the very minimum.


----------



## SoloCamo

Quote:


> Originally Posted by *Blameless*
> 
> I've been hitting my head on the 4GiB limit of my 290Xes ever since _Elite: Dangerous_ went 64-bit. Performance is solid with my CFX setup, but I have zero headroom to increase resolution or texture quality without introducing major stuttering/hitching during certain scenes when assets have to be evicted from VRAM.
> 
> This is far from the only game that can do this, and as far as I am concerned, 4GiB is quite dead at the high-end.


I can agree to that. I hit the 4gb limit on GTA V as is with a mix of settings at 4k. Aside from the general dual gpu issues, this is the main reason I never bothered grabbing another 290x even though I've been at 4k for just shy of two years now. Next card is going to be at least 8gb.

Can't wait to see what 8GB+ of HBM2 will do.

Digital Foundry review is up for this as well..






Frame times are certainly improved quite a bit too


----------



## jprovido

i knew it! it really felt like I had a boost with vulkan on my 1080


----------



## Robenger

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> 
> 
> i knew it! it really felt like I had a boost with vulkan on my 1080


Congrats on your 5% boost...


----------



## infranoia

Quote:


> Originally Posted by *Robenger*
> 
> Congrats on your 5% boost...


Don't be too salty, every little bit helps. If Vulkan isn't seen as a net win for over half the discrete gamers out there, then it won't gain traction for another generation of GPUs.


----------



## ChevChelios

Quote:


> Originally Posted by *Robenger*
> 
> Congrats on your 5% boost...


he had better fps to begin with on OpenGL, thus the lower boost (besides the lack of async shaders), since there wasn't an overhead for Vulkan to improve upon

so there's that too

that can be verified from the video above in that Fury X now beats stock 980Ti by up to 11-12% (or less in one of their runs) despite having had gains of 40%+ overall from Vulkan


----------



## Robenger

Quote:


> Originally Posted by *infranoia*
> 
> Don't be too salty, every little bit helps. If Vulkan isn't seen as a net win for over half the discrete gamers out there, then it won't gain traction for another generation of GPUs.


I was teasing him due to how excited he seemed.


----------



## jprovido

Quote:


> Originally Posted by *Robenger*
> 
> I was teasing him due to how excited he seemed.


I'm pretty happy tbh. I get constant 144fps at 1440p with opengl with minor dips here and there. with vulkan I barely see dips just solid 144fps 99% of the time. feels like I'm playing cs:go it runs so smooth

edit:
and video says it's 6% - 19% boost depends on the scene. that's a good boost if you ask me


----------



## ChevChelios

yeah DX12 and Vulkan may not do much for Nvidia in terms of raw fps for reasons stated (maybe ~5-10% gains at times compared to Nvidia's DX11/OpenGL, sometimes 0-1%)

but what it does do well is improve minimum fps

saw the same story in RotR, min fps went up in benches in DX12 more so than the avg score did


----------



## looniam

so yeah, it seems AMD supports the latest version of Vulkan (1.0.17), whereas Nvidia is behind with ver 1.0.8, and trying to manually update is fruitless.

have i mentioned how annoying the freezing is?


----------



## Robenger

Quote:


> Originally Posted by *looniam*
> 
> so yeah, it seems AMD supports the latest version of Vulkan (1.0.17), whereas Nvidia is behind with ver 1.0.8, and trying to manually update is fruitless.
> 
> have i mentioned how annoying the freezing is?


#GreenFeelingsMatter


----------



## jprovido

Quote:


> Originally Posted by *ChevChelios*
> 
> yeah DX12 and Vulkan may not do much for Nvidia in terms of raw fps for reasons stated (maybe ~5-10% gains at times compared to Nvidias DX11/OpenGL, sometimes 0-1%)
> 
> but what it does do well is improve minimal fps
> 
> saw the same story in RotR, min fps went up in benches in DX12 more so than the avg score did


It does help. My CPU is already a 6-core at 4.7GHz and my RAM is overclocked, but Vulkan still helped. I don't think it's just CPU overhead
Quote:


> Originally Posted by *Robenger*
> 
> #GreenFeelingsMatter


I'm sick and tired of these green vs. red posts. Some people spend more time defending their purchases and trolling forums instead of actually playing games, smh


----------



## ChevChelios

what I like in Nvidia now is that they get excellent DX11 performance _on day 1 without having to rely on or wait for DX12/async patches_ (note how RotR came out in Jan 2016, the first DX12 patch in March, but the async patch only a few days ago, almost 6 months after launch)

but if such a patch does come out - they can still sometimes get a bit of an increase in avg fps out of it (Warhammer's current beta implementation of DX12 notwithstanding), as well as get a decent minimum fps increase due to DX12 handling the CPU better .. and min fps is pretty important all things considered

so its: "good from the start and _can_ get better later"


----------



## Robenger

Quote:


> Originally Posted by *jprovido*
> 
> I'm sick and tired of these green vs. red posts. Some people spend more time defending their purchases and trolling forums instead of actually playing games, smh


You read into that a little too much. /hug


----------



## Evil Penguin

Quote:


> Originally Posted by *looniam*
> 
> so yeah, it seems AMD supports the latest version of Vulkan (1.0.17), whereas Nvidia is behind with ver 1.0.8, and trying to manually update is fruitless.
> 
> have i mentioned how annoying the freezing is?


Should be noted that the minor version numbers consist primarily of documentation updates and new/enhanced conformance tests.

There shouldn't be a practical difference is what I'm getting at in terms of features and compatibility.
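(For context on why the .8 vs .17 is mostly cosmetic: Vulkan versions are packed as major.minor.patch, and patch releases are spec clarifications and conformance-test updates. A rough Python sketch mirroring the bit layout of the spec's VK_MAKE_VERSION macro:)

```python
def vk_make_version(major: int, minor: int, patch: int) -> int:
    """Mirror of Vulkan's VK_MAKE_VERSION packing:
    10 bits major, 10 bits minor, 12 bits patch."""
    return (major << 22) | (minor << 12) | patch

v1_0_8 = vk_make_version(1, 0, 8)
v1_0_17 = vk_make_version(1, 0, 17)

# Both encode the same API major.minor; only the patch field differs.
same_api = (v1_0_8 >> 12) == (v1_0_17 >> 12)
print(same_api)  # → True
```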


----------



## looniam

Quote:


> Originally Posted by *Evil Penguin*
> 
> Should be noted that the minor version numbers consist primarily of documentation updates and new/enhanced conformance tests.
> 
> There shouldn't be a practical difference is what I'm getting at in terms of features and compatibility.

well, vulkaninfo17.exe (which comes with 1.0.17) reported that a handful of extensions it looked for were not there...


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> It does help. My CPU is already a 6-core at 4.7GHz and my RAM is overclocked, but Vulkan still helped. I don't think it's just CPU overhead
> I'm sick and tired of these green vs. red posts. Some people spend more time defending their purchases and trolling forums instead of actually playing games, smh


Overloading one core can cause those lower fps. Vulkan and DX12 do better balancing of core usage.


----------



## looniam

Quote:


> Originally Posted by *Robenger*
> 
> #GreenFeelingsMatter


----------



## pengs

Quote:


> Originally Posted by *raghu78*
> 
> I am disappointed with Rx 480's perf/watt in general. I also feel the GF 14LPP process is not in good shape and there is scope for improvement. There seem to be plans for a second revision of the Polaris chips. Those should launch next year when GF works out the process issues. Still AMD has an uphill task with Vega and Zen in trying to get close to their competition in perf/watt.


My take on this is that the power target in general and stock VDDC on the 480 have been overshot to compensate for the wide range of quality the process is yielding atm. When a few random articles' authors are able to undervolt the 480 by 87mV and 50mV (and 50mV) without any complaints, retaining the same clocks, _gaining_ performance _and_ dropping 25w - 40w, there is surely a discrepancy happening on GF's side - AMD increasing the stock VDDC to compensate.

A mature 14LPP should allow the voltage threshold and, in effect, the power target to drop significantly if these undervolts are an indication of its efficiency potential.


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Overloading one core can cause those lower fps. Vulkan and DX12 do better balancing of core usage.


gpu load has gone down too. seems like it is easier now to get to 144fps compared to before with opengl. I'm seeing it at high 80's to low 90's. before it was always at the high 90's - max load.


----------



## pengs

Quote:


> Originally Posted by *pengs*
> 
> My take on this is that the power target in general and stock VDDC on the 480 have been overshot to compensate for the wide range of quality the process is yielding atm. When a few random articles' authors are able to undervolt the 480 by 87mV and 50mV (and 50mV) without any complaints, retaining the same clocks, _gaining_ performance _and_ dropping 25w - 40w, there is surely a discrepancy happening on GF's side - AMD increasing the stock VDDC to compensate.
> 
> A mature 14LPP should allow the voltage threshold and, in effect, the power target to drop significantly if these undervolts are an indication of its efficiency potential.


To complement this:




The RX480 becomes more efficient than the 970 using Vulkan, by about 8-9%, and about 33% more efficient than when the 480 runs OpenGL.
Basically the RX480 gets a 33% gain in efficiency when utilized to its potential, and who's to say that id squeezed everything out of GCN? Obviously applicable to everything GCN.

Full video
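(Perf/watt claims like these boil down to a simple frames-per-watt ratio. A sketch with purely hypothetical fps and power numbers - these are NOT measurements from the video, just an illustration of the arithmetic:)

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Efficiency as average frames per second per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical figures chosen only to show the ratio, not real data:
rx480_opengl = perf_per_watt(90.0, 150.0)   # 0.60 fps/W
rx480_vulkan = perf_per_watt(120.0, 150.0)  # 0.80 fps/W

gain = (rx480_vulkan / rx480_opengl - 1) * 100
print(f"Efficiency gain: {gain:.0f}%")  # → 33%
```

At equal board power, a 33% efficiency gain is just 33% more fps - which is why a big API-side fps jump shows up directly as a perf/watt jump.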


----------



## Ha-Nocri

I would like to see 480's core clock. I'm pretty sure it is throttling even more under Vulkan as it is being fully utilized, like it is throttling @4k under OpenGL/DX11. I expect even more gains from AIB cards


----------



## infranoia

Quote:


> Originally Posted by *infranoia*
> 
> As far as I can tell that perf/watt has been figured for DX11. It would be interesting to refactor it with a modern API.


Quote:


> Originally Posted by *pengs*
> 
> To complement this:
> 
> 
> 
> 
> The RX480 becomes more efficient than the 970 using Vulkan, about 8-9% and about 33% more efficient than when the 480 runs OpenGL.
> Basically the RX480 gets a 33% gain in efficiency when utilized to it's potential, and who's to say that ID squeezed everything out of GCN? Obviously applicable to everything GCN.
> 
> Full video


Bingo.

So the perf/watt that everyone has been whinging about is with DX11. GCN is a more efficient architecture than ~~Pascal~~ Maxwell with bare-metal APIs.

At least, until any potential driver response from Nvidia on the whole Async Compute front.


----------



## ChevChelios

Quote:


> GCN is a more efficient architecture than Pascal with bare-metal APIs.


That's impossible, since a 480 still draws about the same or even a bit more than a 1070, and even with Vulkan it's a good 20+ fps behind the 1070 in Doom





but yeah, the perf/w of GCN obviously increases a lot when gains like these appear


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> what I like in Nvidia now is that they get excellent DX11 performance _on day 1 without having to rely on or wait for a DX12/async patches_ (note how RotR came out in Jan 2016, the first DX12 patch in March, but the async patch - only a few days ago, almost 6 months after launch)


But according to your unbiased logic, people already beat Doom, a 2-month-old title, so obviously they have finished all other games too, right? and upcoming big games are mostly dx12/vlk, so it's going to be awesome for amd since by your logic dx11 is now irrelevant







only fury/x had to wait 2 months to beat the latest 1070; other amd cards were performing great after 4-5 days, unlike nvidia, where they don't even bother to fix performance on 3-4 other dx12 games.

and stop with the fud. dx11 performance is pretty much the same level on amd *(excluding fury/x - that gpu is bottlenecked and scales badly even over other gcn cards)* when you have an i5 or higher cpu with a 1080p or higher res monitor. otherwise the 390x wouldn't beat the 980 at 2k across many games.


----------



## infranoia

Quote:


> Originally Posted by *ChevChelios*
> 
> That's impossible, since a 480 still draws about the same or even a bit more than a 1070, and even with Vulkan it's a good 20+ fps behind the 1070 in Doom
> 
> 
> 
> 
> 
> but yeah, the perf/w of GCN obviously increases a lot when gains like these appear


Yep, you're right-- I said Pascal when I meant Maxwell. The test was against 970.


----------



## ChevChelios

Quote:


> and upcoming big games are mostly dx/*vlk*


why don't we wait for them to come out first and see how they perform and how their DX12 is/works









and there are also plenty of future big games missing from that wikipedia DX12 list you so like to link

I'd also like you to name me 1 upcoming confirmed AAA Vulkan game









Quote:


> other amd cards were performing great after 4-5 days.


Quote:


> dx11 performance is pretty much same lvl on amd


----------



## pengs

Quote:


> Originally Posted by *ChevChelios*
> 
> That's impossible, since a 480 still draws about the same or even a bit more than a 1070, and even with Vulkan it's a good 20+ fps behind the 1070 in Doom
> 
> 
> 
> 
> 
> but yeah, the perf/w of GCN obviously increases a lot when gains like these appear


33% on top of a, what seems to be easy to achieve, 50mV undervolt if the AIBs can get a hold on the power constraints and AMD and GF can tighten up the process.



124w to 112w, which is, what, another 10% if my maths are competent. Remember, most samples are able to be undervolted to large levels while gaining performance, which is indicative of voltage and power target overshoot.

If it doesn't have Pascal's efficiency yet it should either come extremely close or match it as the process matures and the quality of sampling goes up and DX12/Vulkan and asynchronous compute becomes more commonly used.
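(Checking the "another 10%" figure above - the 124w → 112w drop works out to just under 10%:)

```python
# Power figures from the post above
before_w, after_w = 124, 112

reduction_pct = (before_w - after_w) / before_w * 100
print(f"Power reduction: {reduction_pct:.1f}%")  # → 9.7%
```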


----------



## EightDee8D

Quote:


> Originally Posted by *ChevChelios*
> 
> why don't we wait for them to come out first and see how they perform and how their DX12 is/works
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and there are also plenty of future big games missing from that wikipedia DX12 list you so like to link


So you only looked at vlk? hah, i also said dx12. and dx12/vlk doesn't matter - it's the same story for both, since both are derived from mantle









now look at this and laugh again



You can't because you forgot to read what i said - *excluding fury/x (that gpu is bottlenecked and scales badly even over other gcn cards) when you have an i5 or higher cpu with a 1080p or higher res monitor. otherwise 390x won't beat 980 on 2k across many games.* context is important


----------



## OneB1t

under DX12 there is pretty simple situation

quantum break AMD
doom AMD
total war warhammer AMD
hitman AMD
ashes of singularity AMD
tomb raider patch 7 AMD
forza 6 AMD
gears of war NVIDIA

i think that i see pattern here


----------



## EightDee8D

Quote:


> Originally Posted by *OneB1t*
> 
> under DX12 there is pretty simple situation
> 
> quantum break AMD
> doom AMD
> total war warhammer AMD
> hitman AMD
> ashes of singularity AMD
> tomb raider patch 7 AMD
> forza 6 AMD
> gears of war ~~NVIDIA~~ AMD
> 
> i think that i see pattern here


FTFY


----------



## OneB1t

oh new patch fixed amd issues under gears of war

so now its correct thanks

what's funny is that even a game under NVIDIA's command (Rise of the Tomb Raider) works better for AMD now


----------



## Remij

Quote:


> Originally Posted by *OneB1t*
> 
> oh new patch fixed amd issues under gears of war
> 
> so now its correct thanks
> 
> what's funny is that even a game under NVIDIA's command (Rise of the Tomb Raider) works better for AMD now


What's funny is that all of those games perform best on Nvidia. Every single one of them. Call me when AMD brings out something to combat the 1080 let alone the 1080ti that will be coming soon enough.


----------



## OneB1t

vega will shred those mainstream overpriced NVIDIA cards to pieces







i think you also know that


----------



## Dudewitbow

Quote:


> Originally Posted by *Remij*
> 
> What's funny is that all of those games perform best on Nvidia. Every single one of them. Call me when AMD brings out something to combat the 1080 let alone the 1080ti that will be coming soon enough.


the problem with this logic is that the top-performing card doesn't carry through to all the lower cards' performance. Just cause one company's gpu is at the top doesn't mean everything under it is superior. performance of an architecture/game is based on similarly compared/priced cards


----------



## Remij

Quote:


> Originally Posted by *OneB1t*
> 
> vega will shred that mainstream overpriced NVIDIA cards to pieces
> 
> 
> 
> 
> 
> 
> 
> i think you also know that


LOL no. Vega will be bottlenecked like every flagship AMD gpu these last few gens.

I gotta admit though, it's nice to see AMD fans happy for a change.
Quote:


> Originally Posted by *Dudewitbow*
> 
> the problem about this logic is that top performing card doesn't carry through with all lower cards performance. Just cause one companies gpu is at top, doesn't mean everything under it is superior. performance of architecture/game is based on similarly compared/priced cards


I really couldn't care less about midrange cards.


----------



## OneB1t

amd flagship is maybe bottlenecked but nvidia flagships from last generations are just useless








remember 780ti/titan or GTX680? total fail now


----------



## Glottis

Eurogamer Digital Foundry's results are a bit more down to earth. No questionable surreal Fury X lead like seen on the computerbase.de website. Once Vulkan is enabled, the Fury X reaches 980 Ti performance levels, as it should.


----------



## Kpjoslee

Quote:


> Originally Posted by *OneB1t*
> 
> vega will shred that mainstream overpriced NVIDIA cards to pieces
> 
> 
> 
> 
> 
> 
> 
> i think you also know that


Vega will definitely compete with the 1080; the big question is whether that will be enough to make Nvidia come out with a 1080 Ti.


----------



## Remij

Quote:


> Originally Posted by *OneB1t*
> 
> amd flagship is maybe bottlenecked but nvidia flagships from last generations are just useless
> 
> 
> 
> 
> 
> 
> 
> 
> remember 780ti/titan or GTX680? total fail now


YES, 3 years later. I'm glad I suffered with janky stuttery AMD performance and drivers for those years... it was worth the wait. Now they're matching midrange Nvidia parts. YAY!


----------



## OneB1t

let me tell you a secret:
NVIDIA also sometimes releases their cards with delay...

also there is no suffering - the 390X (290X) outperforms the GTX 980 under DX11







and now closing in on the 1070/980 Ti under DX12


----------



## Dudewitbow

Quote:


> Originally Posted by *Remij*
> 
> LOL no. Vega will be bottlenecked like every flagship AMD gpu these last few gens.
> 
> I gotta admit though, it's nice to see AMD fans happy for a change.
> I really couldn't care less about midrange cards.


Congratulations, You aren't part of AMD's current target audience(mainstream). Great to talk about something they aren't focusing on, almost like Nvidia and their upcoming Async driver /s


----------



## Remij

Quote:


> Originally Posted by *Dudewitbow*
> 
> Congratulations, You aren't part of AMD's current target audience(mainstream). Great to talk about something they aren't focusing on, almost like Nvidia and their upcoming Async driver /s


Yea, I guess that's why AMD hasn't appealed to me since the 9700/9800 Pro days.


----------



## Glottis

I looked at all the major tech sites and none of them can reproduce computerbase.de's results for the Fury X. Anyone curious what's up with that?

to recap here's computerbase.de claim


and here's everyone else (that i could find with fury x)
digital foundry:

gamegpu:


something's fishy here


----------



## EightDee8D

Quote:


> Originally Posted by *Glottis*
> 
> I looked on all major tech sites and none of them can reproduce computerbase.de results for fury x. anyone curious what's up with that?
> 
> to recap here's computerbase.de claim
> 
> 
> and here's everyone else (that i could find with fury x)
> digital foundry:
> 
> gamegpu:
> 
> 
> something's fishy here


nothing fishy there. No AA and TSSAA are the only options where async is enabled, so that's why there are huge gains where TSSAA is used. And cb.de used that; gamegpu.ru didn't.


----------



## sugarhell

Gamegpu doesn't even use TSSAA...


----------



## infranoia

Quote:


> Originally Posted by *Glottis*
> 
> I looked on all major tech sites and none of them can reproduce computerbase.de results for fury x. anyone curious what's up with that?
> 
> to recap here's computerbase.de claim
> 
> 
> and here's everyone else (that i could find with fury x)
> digital foundry:
> 
> gamegpu:
> 
> 
> something's fishy here


The GameGPU tests were done with SMAA. You can throw those out.


----------



## sugarhell

And DF calculated the performance based on the minimums


----------



## Remij

I love it. You can throw those tests out because they don't allow AMD to shine in the best possible light. Where have I heard this before? ...oh yea, gameworks games.

Did anyone bother asking why Async only works for certain settings?

But at least reviewers should know for the future. If Async isn't enabled on DX12/Vulkan games/benchmarks then it either doesn't count, or it's Nvidia's fault


----------



## GorillaSceptre

Quote:


> Originally Posted by *Remij*
> 
> I love it. You can throw those tests out because they don't allow AMD to shine in the best possible light. Where have I heard this before? ...oh yea, gameworks games.
> 
> Did anyone bother asking why Async only works for certain settings?
> 
> But at least reviewers should know for the future. If Async isn't enabled on DX12/Vulkan games/benchmarks then it either doesn't count, or it's Nvidia's fault


So people pointing out that there's nothing "fishy", or no conspiracy because the others weren't using Async somehow turns into a GW Nvidia victim post?

Logic.


----------



## infranoia

Quote:


> Originally Posted by *Remij*
> 
> I love it. You can throw those tests out because they don't allow AMD to shine in the best possible light. Where have I heard this before? ...oh yea, gameworks games.
> 
> Did anyone bother asking why Async only works for certain settings?
> 
> But at least reviewers should know for the future. If Async isn't enabled on DX12/Vulkan games/benchmarks then it either doesn't count, or it's Nvidia's fault


So bitter. As if it's AMD's fault that id limited Async Shaders to TSSAA for now.

https://twitter.com/idSoftwareTiago/status/752590343963422720


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> I love it. You can throw those tests out because they don't allow AMD to shine in the best possible light. Where have I heard this before? ...oh yea, gameworks games.
> 
> Did anyone bother asking why Async only works for certain settings?
> 
> But at least reviewers should know for the future. If Async isn't enabled on DX12/Vulkan games/benchmarks then it either doesn't count, or it's Nvidia's fault


https://twitter.com/idSoftwareTiago/status/752590016988082180

yes, those reviews are invalid because those options are there to show gains. and if you disable them, what's the point?


----------



## Remij

Quote:


> Originally Posted by *EightDee8D*
> 
> https://twitter.com/idSoftwareTiago/status/752590016988082180
> 
> yes those reviews are invalid because those options are there to show gains. and if you disabled them what's the point ?


The point is that DX12/Vulkan =/= Async.

But I guess none of these benchmarks are valid because ID themselves have stated they are working with Nvidia on implementing Async on Nvidia cards. Until we have async enabled on Nvidia gpus, this is all pointless...


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> The point is that DX12/Vulkan =/= Async.
> 
> But I guess none of these benchmarks are valid because ID themselves have stated they are working with Nvidia on implementing Async on Nvidia cards. Until we have async enabled on Nvidia gpus, this is all pointless...


1 year and waiting for the magical async driver. that's not coming till Volta, i'm afraid.


----------



## Remij

Quote:


> Originally Posted by *EightDee8D*
> 
> 1 year and waiting for the magical async driver. that's not coming till Volta, i'm afraid.


1 year is nothing for AMD peeps.

Worst comes to worst, I guess we could always just stick to OpenGL for Doom.. which still provides better framerates according to Eurogamer. But why do that when Vulkan gives 15fps over OpenGL and 25fps over the Fury X?


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> 1 year is nothing for AMD peeps.
> 
> Worst comes to worst, I guess we could always just stick to OpenGL for Doom.. which still provides better framerates according to Eurogamer. But why do that when Vulkan gives 15fps over OpenGL and 25fps over the Fury X?


wat, you know the Fury X's replacement isn't here, right? it's a 1-year-old gpu. compare it to its competitor, the 980 Ti, and the Fury X beats it by 7.5%. lol nobody buys a new gpu for only 1 game.

what a way to compare things.


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> What do you mean the FuryX's replacement isn't here. It's called the Nvidia GTX1080.


Ok, keep selling it, maybe nvidia will find enough money to fund async drivers for it.


----------



## magnek

Quote:


> Originally Posted by *Remij*
> 
> The point is that DX12/Vulkan =/= Async.
> 
> But I guess none of these benchmarks are valid because ID themselves have stated *they are working with Nvidia on implementing Async on Nvidia cards. Until we have async enabled on Nvidia gpus, this is all pointless...
> 
> 
> 
> 
> 
> 
> 
> *


Where have I heard this line before?

Also, care to guess how much nVidia gained with async in Aots with 1080?


----------



## Remij

Quote:


> Originally Posted by *EightDee8D*
> 
> Ok, keep selling it, maybe nvidia will find enough money to fund async drivers for it.


They don't even need to. Look at the numbers.


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> They don't even need to. Look at the numbers.


0 where i live.


----------



## Remij

Quote:


> Originally Posted by *magnek*
> 
> Where have I heard this line before?
> 
> Also, care to guess how much nVidia gained with async in Aots with 1080?


I dunno, ask ID software. They're the ones who are saying it.









It's funny how they can say that Async is coming for other AA options in the future.. and it's YAY. But when they say they are working with Nvidia on implementing Async, it's.. yeah right.









Care to guess which card has the highest performance in AotS?

Bu but lower settings am I right? The engine isn't rendering everything properly on Nvidia..









It's always something isn't it?


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> I dunno, ask ID software. They're the ones who are saying it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's funny how they can say that Async is coming for other AA options in the future.. and it's YAY. But when they say they are working with Nvidia on implementing Async, it's.. yeah right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Care to guess which card has the highest performance in AotS?
> 
> Bu but lower settings


Forget the 1080/1070 and then talk. they don't have competition yet, so ofc they will be faster.


----------



## f1LL

Sometimes I get the feeling that some of the nVidia high end card owners *actually* pay through the nose, though not in blood but in brain cells...


----------



## Remij

Quote:


> Originally Posted by *EightDee8D*
> 
> Forget 1080/1070 and then talk. they don't have competition yet. so ofc they will be faster.


I get that... but the 1080 and 1070 are here now. When AMD gets their (four stars) together they'll be dealing with the next Nvidia gpus.


----------



## Remij

Quote:


> Originally Posted by *f1LL*
> 
> Sometimes I get the feeling that some of the nVidia high end card owners *actually* pay through the nose, though not in blood but in brain cells...


Nah, that's buying an AMD card and getting performance you should have got day 1 on day 1095...


----------



## the9quad

Funny to watch Remij get all worked up over Nvidia versus AMD. Dude, you are borderline hitting weirdo territory with how upset you get. No one cares about these internet arguments; go play your games and be happy with your card.


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> I get that... but the 1080 and 1070 are here now. When AMD gets their (four stars) together they'll be dealing with the next Nvidia gpus.


And they will have answers for both the 1080 and 1080 Ti.


----------



## sugarhell

I don't even know why you argue. Some results were with async compute on and TSSAA, while the gamegpu one was with SMAA, and some users argued that the computerbase results look fishy because of the difference in performance.

I don't care if the Fury X wins by 50%, 100% or 10%. Just compare the same things and stop fighting like fanboys.


----------



## magnek

Quote:


> Originally Posted by *Remij*
> 
> I dunno, ask ID software. They're the ones who are saying it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's funny how they can say that Async is coming for other AA options in the future.. and it's YAY. But when they say they are working with Nvidia on implementing Async, it's.. yeah right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Care to guess which card has the highest performance in AotS?
> 
> Bu but lower settings am I right? The engine isn't rendering everything properly on Nvidia..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's always something isn't it?


Oxide also said the same thing about working with nVidia to implement async for Maxwell. Almost one year later I'm still waiting for that magical Maxwell async driver, so yeah.



1080 gained a whopping *2.8%* from async in AotS, and sadly this is with me cherry-picking the best case scenario.

1080 is 13.5% faster than Fury X in AotS, but that's like saying 680 runs circles around 6970, which really isn't saying much.


----------



## Remij

Quote:


> Originally Posted by *magnek*
> 
> Oxide also said the same thing about working with nVidia to implement async for Maxwell. Almost one year later I'm still waiting for that magical Maxwell async driver, so yeah.
> 
> 
> 
> 1080 gained a whopping *2.8%* from async in AotS, and sadly this is with me cherry-picking the best case scenario.
> 
> 1080 is 13.5% faster than Fury X in AotS, but that's like saying 680 runs circles around 6970, which really isn't saying much.


AotS is such an AMD biased title I'm not surprised.

But the reason the 1080 gains such a small percentage is that it's already well utilized.

But really, we should start asking these developers why they say they are working with Nvidia on async when (I guess) they aren't?


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> But really we should start asking NVIDIA , where's async driver


Asking real questions here.


----------



## Robenger

Quote:


> Originally Posted by *Remij*
> 
> AotS is such an AMD biased title I'm not surprised.
> 
> But really we should start asking these developers why they say they are working with Nvidia on async when they aren't (I guess)?


It's not AMD biased; it's using a standard DX12 feature called Asynchronous Compute. Perhaps you should ask Nvidia why they didn't have the foresight to support it.

Lots of developers work with both AMD and Nvidia, but you already know AMD stands to benefit much more because of GCN.


----------



## Remij

Quote:


> Originally Posted by *EightDee8D*
> 
> Asking real questions here.


What happens if Nvidia actually does release an async driver that increases performance drastically??

I would love that, just to see the look on everyone's face.


----------



## magnek

Quote:


> Originally Posted by *Remij*
> 
> AotS is such an AMD biased title I'm not surprised.
> 
> But the reason why 1080 gains such small percent is because it's already well utilized.
> 
> But really we should start asking these developers why they say they are working with Nvidia on async when they aren't (I guess)?


To put that in perspective, Fury X also "only" gained around 10% with async compute across all resolutions, so it's not as if they're deliberately hindering nvidia's performance.

The better question would be, why do people even hold out hope for async with nvidia? Maxwell gained nothing from async compute, and Pascal had trivial gains with async compute, partly because it's already well utilized. Given nVidia's OpenGL results in DOOM, why is there an expectation that once async is enabled for nVidia under Vulkan they'd also suddenly start seeing 10%+ improvements?


----------



## EightDee8D

Quote:


> Originally Posted by *Remij*
> 
> What happens if Nvidia actually does release an async driver that increases performance drastically??
> 
> I would love that, just to see the look on everyone's face.


I really want to know that too, since according to many devs on B3D they don't really have the hardware for it, which means they won't gain any performance at all. That's why Nvidia is so silent on the issue. Even Tom Petersen said "no comment" when he was asked by PCPer.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Remij*
> 
> AotS is such an AMD biased title I'm not surprised.
> 
> *But the reason why 1080 gains such small percent is because it's already well utilized.*


The 1080 doesn't even execute all the terrain shaders with async, so it gaining 2% is meaningless.

Quote:


> Originally Posted by *Remij*
> 
> But really we should start asking these developers why they say they are working with Nvidia on async when they aren't (I guess)?


That's an obvious answer, isn't it? PR. As you and others like to point out, Nvidia has a huge marketshare advantage; do you think it would be better for them to say "Maxwell/Pascal is utterly incapable of async"?

It's nearly been a year that we've been waiting to see Maxwell's async driver; at this point it's clearly all PR hogwash.


----------



## nagle3092

Quote:


> Originally Posted by *Remij*
> 
> What happens if Nvidia actually does release an async driver that increases performance drastically??
> 
> I would love that, just to see the look on everyone's face.


You'll get that update with the fermi dx12 driver.


----------



## infranoia

Quote:


> Originally Posted by *GorillaSceptre*
> 
> The 1080 doesn't even execute all the terrain shaders with async, so it gaining 2% is meaningless.
> That's an obvious answer isn't it? PR.. As you and others like to point out, Nvidia has a huge marketshare advantage, you think it would be better for them to say "Maxwell/Pascal is utterly incapable of async"?
> 
> Nearly been a year that we've been waiting to see Maxwell's Async driver, at this point it's clearly all PR hogwash.


It's telling that DX12 has been out for about a year, that in that time Nvidia's been "promising" an async compute update, and that 3DMark continues to stonewall their release, which is normally in lock-step with a new DX launch.

I'm no conspiracy theorist, but they're certainly making it easy.


----------



## Robenger

Quote:


> Originally Posted by *Remij*
> 
> What happens if Nvidia actually does release an async driver that increases performance drastically??
> 
> I would love that, just to see the look on everyone's face.


They can't; there won't be an update. Nvidia's current architecture does not support Async Compute.


----------



## magnek

Quote:


> Originally Posted by *Remij*
> 
> What happens if Nvidia actually does release an async driver that increases performance drastically??
> 
> I would love that, just to see the look on everyone's face.


I'll be first in line for the à la carte crow buffet.

But until then, I'll continue in my belief that nVidia lied repeatedly about Maxwell's (lack of) async compute ability.


----------



## Remij

Quote:


> Originally Posted by *GorillaSceptre*
> 
> The 1080 doesn't even execute all the terrain shaders with async, so it gaining 2% is meaningless.


I keep hearing this... but looking at all the latest video benchmarks posted of AotS, there are ZERO differences in the graphics between AMD and Nvidia.

https://www.youtube.com/watch?v=VS49pIQeiqE

Ironically, the only vid that I can recall seeing a clear difference was in that one AMD posted.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Remij*
> 
> I keep hearing this.. but looking at all the latest video benchmarks posted of AotS.. and there are ZERO differences in the graphics between AMD and Nvidia.
> 
> https://www.youtube.com/watch?v=VS49pIQeiqE
> 
> Ironically, the only vid that I can recall seeing a clear difference was in that one AMD posted.


Not going down this rabbit hole again.. It's been discussed all over the place more than enough times already.


----------



## Greenland

Quote:


> Originally Posted by *Remij*
> 
> What's funny is that all of those games perform best on Nvidia. Every single one of them. Call me when AMD brings out something to combat the 1080 let alone the 1080ti that will be coming soon enough.


Quote:


> Originally Posted by *Remij*
> 
> 1 year is nothing for AMD peeps.


But it's a lifetime for nVidia peeps am I right lmao.


----------



## Kpjoslee

Looking at all this Green vs Red, these were good times:









http://www.anandtech.com/show/1795


----------



## huzzug

Quote:


> Originally Posted by *Remij*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EightDee8D*
> 
> Asking real questions here.
> 
> 
> 
> What happens if Nvidia actually does release an async driver that increases performance drastically??
> 
> I would love that, just to see the look on everyone's face.

Nah, we'll just call them a day late and a penny short. Nvidia needs to get their heads out of their rear. They're letting all of their fans down by not being on top.


----------



## dagget3450

Quote:


> Originally Posted by *huzzug*
> 
> Nah, we'll just call then a day late and a penny short. Nvidia need to get their heads out of their rear. They're letting all of their fans down by not being on top


I don't think they are letting any of the fans down. Look at when the 970 VRAM issue was brought up: they all rewarded Nvidia by selling it and buying a GTX 980. The fans are dead set on the branding alone; they will never consider an alternative. The only way they would consider it is if AMD dominated Nvidia in performance, and that is a BIG maybe. You see it all the time here on OCN. People say they want AMD to compete so they'll get cheaper or faster Nvidia cards sooner. So I think Nvidia would have to constantly mess up for a long time before they lose the fans.

I know because I used to be a hardcore Nvidia fan, and I would only consider them at the time. Prices were my main motivator. I also didn't like things like SLI mainboards that cost more just because of SLI, until someone made a patch and proved SLI was easily circumvented because it was a software lock, not hardware. Or things like locking out PhysX when it used to be an open market with an add-on card. I could go on, but Nvidia just really pushed me away with price in the end.


----------



## the9quad

I didn't read any reviews really, so... any info on what the 480 cards offer as far as other abilities? Like DSR (AMD calls it VSR): can they do 4K? Video streaming: can they do 1080p/60fps using VCE? Has AMD done anything to actually make VCE better for streaming/game recording, since it is completely terrible right now? Terrible on OBS and XSplit, and not even available in Gameshow since those devs said it is too hard to implement. This is just another area where Nvidia's solution works, AMD introduces their version, and then completely wrecks it through neglect.

Anyway, happy with DOOM; Vulkan gained me about 40 fps, so AMD is doing something good there.


----------



## dagget3450

Quote:


> Originally Posted by *the9quad*
> 
> I didn't read any reviews really so....Any info on what is in the 480 cards as far as other abilities? Like DSR, they do 4k? Video streaming can they do 1080p/60fps using VCE? Has AMD done anything with these and VCE to actually make VCE better for streaming/game recording since it is completely terrible right now? Terrible on OBS and X-Split, and not even available on Gameshow since those devs said it is too hard to implement. This is just another area where Nvidias solution works, AMD introduces their version, and then completely wrecks it through neglect.
> 
> Anyway, happy with Doom and vulkan gained about 40 fps, so AMD is doing something good there.


The 4K results I saw in FSU didn't impress me. Of course now we have Time Spy, and I don't think anyone has done 4K yet. Don't know about 4K VSR or the other stuff, but I don't see them not allowing 4K VSR.


----------



## SuperZan

The 32 ROPs put the 480 a bit out of its element at 4K, I think. Some Time Spy 390 vs 480 4K results would be handy to see some of the specifics.


----------



## Ha-Nocri

I saw some clock speed graphs for the 480; it is really throttling @ 4K, as the CPU bottleneck is completely removed at that resolution. It's probably the case with DOOM on Vulkan too, even @ 1080p. We'll have to wait for AIB cards to see the real performance.


----------



## Orthello

Quote:


> Originally Posted by *SuperZan*
> 
> The 32 ROP's put the 480 a bit out of its element at 4k, I think. Some Time Spy 390 vs 480 4k results would be handy to see some of the specifics.


It's not all the ROPs' fault; the card is throttling at this res to the tune of about 150 MHz off the default boost rate in the reviews I've seen. Fix that throttling and then it will be interesting to see what difference the ROPs are making on their own. AIB cards should have this well sorted.


----------



## looniam

fyi on the green front:

used the Vulkan SDK and then installed the runtime for ver. 1.0.17, so vulkaninfo reads all extensions as supported, but DOOM is still using NV's ver. (1.0.8). however it seems not to crash . . yet. and boy, temps just skyrocket! (mid 60s C to low 70s C)

that is all.


----------



## SuperZan

Quote:


> Originally Posted by *Orthello*
> 
> Its not all the ROPs fault, the card is throttling in this res to the tune of about 150 mhz off default boost rate from the reviews i've seen. Fix that throttling and then it will be interesting to see what difference the rops are making on their own. AIB cards should have this well sorted.


Agree 100%. The reference model isn't doing the GPU any favours. The AIB partner cards should be more impressive on all performance fronts.


----------



## magnek

Quote:


> Originally Posted by *looniam*
> 
> fyi on the green front:
> 
> used the Vulkan SDK and then installed the runtime for ver. 1.0.17, so vulkaninfo reads all extensions as supported, but DOOM is still using NV's ver. (1.0.8). however it seems not to crash . . yet. and boy, temps just skyrocket! (mid 60s C to low 70s C)
> 
> that is all.


Sounds like increased GPU utilization if temps increased that much. That's always a good sign... I think.


----------



## pengs

Quote:


> Originally Posted by *looniam*
> 
> fyi on the green front:
> 
> used the Vulkan SDK and then installed the runtime for ver. 1.0.17, so vulkaninfo reads all extensions as supported, but DOOM is still using NV's ver. (1.0.8). however it seems not to crash . . yet. and boy, temps just skyrocket! (mid 60s C to low 70s C)
> 
> that is all.


Quote:


> Originally Posted by *magnek*
> 
> Sounds like increased GPU utilization if temps increased that much. That's always a good sign... I think.


Yeah, that's GPU utilization (or rather the lack of under-utilization, for lack of a better word). There's absolutely no CPU bottleneck now, no point where the CPU is limiting the GPU's performance, so the GPU is running at 100% load all the time. Higher temps.

But that's normal... or should be normal, though it hasn't been the norm for a long time. GPU cooling solutions will see a bit of a heyday with these low-level APIs, because there's now absolutely nothing stopping the GPU from screaming (besides vsync).
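As a back-of-the-envelope illustration of that bottleneck shift (all numbers below are invented, not from any real benchmark): frame time is set by whichever of CPU or GPU is slower, so cutting the CPU cost pushes the GPU to full load.

```python
# Toy model (invented numbers): each frame waits on whichever side
# is slower, so a CPU bottleneck leaves the GPU partly idle.

def frame_ms(cpu_ms, gpu_ms):
    # Wall-clock time per frame.
    return max(cpu_ms, gpu_ms)

def gpu_load(cpu_ms, gpu_ms):
    # Fraction of the frame the GPU spends actually working.
    return gpu_ms / frame_ms(cpu_ms, gpu_ms)

# OpenGL-ish: heavy driver/CPU overhead caps the framerate.
print(gpu_load(cpu_ms=12.0, gpu_ms=8.0))  # ~0.67, GPU idles a third of the time
# Vulkan-ish: CPU cost drops, GPU pegged at 100% (and runs hotter).
print(gpu_load(cpu_ms=5.0, gpu_ms=8.0))   # 1.0
```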


----------



## Butthurt Beluga

I apologize for the off-topic question, but with these next-gen APIs like Vulkan and DX12 it got me wondering: is asynchronous compute an AMD technology?
I always see it associated with AMD and I was trying to find out exactly who made it, and I haven't been able to find anything useful.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I apologize for the off-topic question, but with these next-gen APIs like Vulkan and DX12 it got me wondering: is asynchronous compute an AMD technology?
> I always see it associated with AMD and I was trying to find out exactly who made it, and I haven't been able to find anything useful.


It's a DX12/Vulkan feature that's been used in consoles, enabling games to perform great on aging hardware. Only AMD hardware fully supports it on PC atm.


----------



## Marios145

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I apologize for the off-topic question, but with these next-gen APIs like Vulkan and DX12 it got me wondering: is asynchronous compute an AMD technology?
> I always see it associated with AMD and I was trying to find out exactly who made it, and I haven't been able to find anything useful.


Async *Compute* is a DX12 feature for switching between the graphics and compute queues; AMD's implementation, called Async *Shaders*, allows compute and graphics work to run in parallel without the need to switch queues.
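For what it's worth, the queue-switching vs parallel-execution difference can be sketched with a toy timing model (the pass lengths and the 30%-busy figure below are made up purely for illustration):

```python
# Toy model (invented numbers) of serialized vs async queue execution.

def serial_ms(gfx_ms, compute_ms):
    # Queue switching: compute starts only after graphics finishes.
    return gfx_ms + compute_ms

def async_ms(gfx_ms, gfx_shader_busy, compute_ms):
    # Parallel queues: compute soaks up the shader-idle time left by
    # fixed-function-heavy graphics work; any leftover runs after.
    idle_ms = gfx_ms * (1.0 - gfx_shader_busy)
    leftover_ms = max(0.0, compute_ms - idle_ms)
    return gfx_ms + leftover_ms

# e.g. a shadow-map-style pass keeping shaders only ~30% busy lets
# async compute hide most of a 1.5 ms compute pass:
print(serial_ms(2.0, 1.5))      # 3.5 ms
print(async_ms(2.0, 0.3, 1.5))  # ~2.1 ms
```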


----------



## mtcn77

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I apologize for the off-topic question, but with these next-gen APIs like Vulkan and DX12 it got me wondering: is asynchronous compute an AMD technology?
> I always see it associated with AMD and I was trying to find out exactly who made it, and I haven't been able to find anything useful.


I think it started sometime in 2006:


Spoiler: PcPer



Quote:


> In November 2006, AMD's website stated they started the "GPGPU revolution" with the introduction of "*Close To Metal*", the first iteration of their GPGPU technology that has now evolved into ATI Stream.


[Source]


One other thing: it's standard business practice for a small company with a big incentive to merge with a bigger, well-funded company to set forth on the new venture. Such happened with Google Earth (_thanks, Nvidia!_): Nvidia paid for the kickstart until the software company caught the attention of Google, and the rest is history...
ATi's big goal was just as huge, substituting mainstream GPGPU software libraries for the CPU, which we can say has been making some progress, at least recently, under AMD's supervision.


----------



## looniam

Quote:


> Originally Posted by *magnek*
> 
> Sounds like increased GPU utilization if temps increased that much. That's always a good sign... I think.


Quote:


> Originally Posted by *pengs*
> 
> Yeah, that's GPU utilization/the lack of under(?) utilization, for lack of a better word. There's absolutely no CPU bottleneck now, no area where the CPU is limiting the GPU's performance so the GPU is running at 100% load all the time. Higher temps.
> 
> But that's normal... or should be normal, but it hasn't been the norm for a long time. GPU cooling solutions will see a bit of a hay-day with these low level API's because there's now absolutely nothing stopping the GPU from screaming (besides vsync).


pretty much yep.

@ 1080p while "looking at a wall" i see 5 ms across the board (gpu/cpu)

when fighting a buttload, the max cpu hit taps ~7.48 ms while everything else is ~6.xx ms

so yeah, IF (and i do hope) vulkan takes off - i see water in my future.


----------



## magnek

DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want *need* Vulkan to take off.


----------



## mtcn77

Quote:


> Originally Posted by *magnek*
> 
> DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want need Vulkan to take off.


DX12 is relevant to you as well; you only miss out on the CPU abstraction benefits with "11.3". The master-thread recorder brings CPU overhead, playing back all the commands generated by the rest of the CPU cores to the GPU.
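A rough sketch of that master-thread cost (the 2 µs per draw call is an invented figure): with DX11-style recording one thread pays for every draw, while DX12/Vulkan-style command lists spread the recording across cores.

```python
import math

# Toy model (invented cost): each draw call takes ~2 us of CPU time
# to validate and record into a command stream.

def dx11_style_ms(draws, cost_us=2.0):
    # One master/driver thread records (and replays) every command.
    return draws * cost_us / 1000.0

def dx12_style_ms(draws, threads, cost_us=2.0):
    # Command lists are built on worker threads in parallel and then
    # submitted; the busiest thread sets the pace.
    per_thread = math.ceil(draws / threads)
    return per_thread * cost_us / 1000.0

print(dx11_style_ms(10_000))             # 20.0 ms on one thread
print(dx12_style_ms(10_000, threads=4))  # 5.0 ms across four threads
```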


----------



## 7850K

Quote:


> Originally Posted by *magnek*
> 
> DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want need Vulkan to take off.


Yes, I was thinking the same. I was going to wait until all support was dropped for 8.1 to bite the bullet anyway.


----------



## iRUSH

Quote:


> Originally Posted by *magnek*
> 
> DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want need Vulkan to take off.


Why are you stuck on W7?


----------



## looniam

i see the floodgates opening. . . .


----------



## Kpjoslee

Quote:


> Originally Posted by *iRUSH*
> 
> Why are you stuck on W7?


Some don't want to upgrade to Windows 10 for... privacy issues.


----------



## Eorzean

My GTX 1070 is on its way back to NE, was going to use the store credit and upgrade to a 1080, but now that Vulkan and its performance is real, and Nvidia and their async drivers are nowhere in sight, I'm going to wait till October and see what the 490 is all about. I'm not a fan of upgrading on a yearly basis, which is what Nvidia seemingly wants their customers to do.


----------



## Fancykiller65

Quote:


> Originally Posted by *Kpjoslee*
> 
> Some doesn't want to upgrade to Windows 10 for.....privacy issues.


You know that Microsoft patched the same stuff into 8 and 7, right? On that note, some people don't want to learn a new OS.

Like me.


----------



## iRUSH

Quote:


> Originally Posted by *Eorzean*
> 
> My GTX 1070 is on its way back to NE, was going to use the store credit and upgrade to a 1080, but now that Vulkan and its performance is real, and Nvidia and their async drivers are nowhere in sight, I'm going to wait till October and see what the 490 is all about. I'm not a fan of upgrading on a yearly basis, which is what Nvidia seemingly wants their customers to do.


It's why I moved. I changed monitors too. Even though I do hardware swaps often, Pascal rubbed me the wrong way across the board. It was enough for me to switch to red with the intention of staying here for the foreseeable future.


----------



## Kpjoslee

Quote:


> Originally Posted by *Eorzean*
> 
> My GTX 1070 is on its way back to NE, was going to use the store credit and upgrade to a 1080, but now that Vulkan and its performance is real, and Nvidia and their async drivers are nowhere in sight, I'm going to wait till October and see what the 490 is all about. I'm not a fan of upgrading on a yearly basis, which is what Nvidia seemingly wants their customers to do.


I would rather wait for Vega, especially if the 490 is indeed a dual-card solution as rumored. I am not a big fan of the 1070/1080 as they stand alone in the market as of now. I prefer when competing GPUs from both vendors are available.


----------



## magnek

Quote:


> Originally Posted by *iRUSH*
> 
> Why are you stuck on W7?


Don't wanna deal with no MS Win 10 as a service nonsense. That's just one reason but it's enough of a reason.
Quote:


> Originally Posted by *looniam*
> 
> i see the floodgates opening. . . .


It's ok I ordered the damn dam be sealed shut.


----------



## Eorzean

Quote:


> Originally Posted by *iRUSH*
> 
> It's why I moved. I changed monitors too. Even though I do hardware swaps often, Pascal rubbed me the wrong way across the board. It was enough for me to switch to red with the intention of staying here for the foreseeable future.


Yup, can pick up a nice ultrawide without the gsync/nvidia tax either (eyeing the LG UC88).
Quote:


> Originally Posted by *Kpjoslee*
> 
> I would rather wait for Vega especially if 490 is indeed dual card solution as rumored. I am not a big fan of 1070/1080 as they stand alone in the market as of now. I prefer when competing GPUs from both vendors are available.


Haven't kept up with the latest rumors, but definitely want something that's at least on the 1070's level. I'll most likely be waiting for Vega, though. Guess I'll go back to my original plan with being preoccupied with my PS4.


----------



## Orthello

It's probably not going to happen, but here's what I'd like to see happen re the 490.

The 490 is dual Polaris, full chips, no cut-downs, binned for low leakage etc. AMD works with as many AAA developers as possible to get multi-GPU DX12 patches into their game engines.

Then you would have a card that for most purposes should equal the street price of the 1070 with 40-50% more performance...

That's a fairy tale though. I think the card itself might happen, but the rest would take a lot of work and money on AMD's behalf.


----------



## looniam

Quote:


> Originally Posted by *magnek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iRUSH*
> 
> Why are you stuck on W7?
> 
> 
> 
> *Don't* wanna deal with *no* MS Win 10 as a service nonsense. That's just one reason but it's enough of a reason.
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> i see the floodgates opening. . . .
> 
> 
> It's ok I ordered the damn dam be sealed shut.

too late, the double negatives are flowing.


----------



## magnek

Damn logicians.


----------



## rcfc89

I'm cool with Vulkan. I was hovering around 80-90 fps with everything maxed out in OpenGL. With Vulkan I never drop below 100 fps.


----------



## huzzug

Quote:


> Originally Posted by *Orthello*
> 
> It's probably not going to happen, but here's what I'd like to see happen re the 490.
> 
> The 490 is dual Polaris, full chips, no cut-downs, binned for low leakage etc. AMD works with as many AAA developers as possible to get multi-GPU DX12 patches into their game engines.
> 
> Then you would have a card that for most purposes should equal the street price of the 1070 with 40-50% more performance...
> 
> That's a fairy tale though. I think the card itself might happen, but the rest would take a lot of work and money on AMD's behalf.


Why would a 490 have a dual Polaris chip? The x90 of each gen since the beginning has been a single chip for AMD, and it stays the same. If they want to put out a dual-GPU solution, they can use something much better than Polaris.


----------



## iRUSH

Quote:


> Originally Posted by *rcfc89*
> 
> I'm cool with Vulkan. I was hovering around 80-90 fps with everything maxed out in OpenGL. With Vulkan I never drop below 100 fps.


That looks like fun! How do you like that monitor?


----------



## rcfc89

Quote:


> Originally Posted by *iRUSH*
> 
> That looks like fun! How do you like that monitor?


Nothing short of spectacular. I've had everything from 1080p 144hz / 1440p 144hz/ 4K 60hz. Nothing touches the feeling this monitor gives you. It's the ultimate gaming experience.


----------



## Orthello

Quote:


> Originally Posted by *rcfc89*
> 
> Nothing short of spectacular. I've had everything from 1080p 144hz / 1440p 144hz/ 4K 60hz. Nothing touches the feeling this monitor gives you. It's the ultimate gaming experience.


Hmm, I might have to take a look. What is that monitor, a Predator? I have the monitors you list above (1440p 144 Hz, 4K 60 Hz), so if you reckon this is better I might check it out.


----------



## Blameless

Anyone have any RX 480 benches with the throttling disabled?
Quote:


> Originally Posted by *Fancykiller65*
> 
> You know that Microsoft patched the same stuff into 8 and 7 right?


Not to the same degree and not if you didn't apply the relevant patches.
Quote:


> Originally Posted by *Fancykiller65*
> 
> On that note, some people don't want to learn the new OS.


Windows 10 is one of those things where the better I get to know it, the less I like it.


----------



## SSBrain

Quote:


> Originally Posted by *Blameless*
> 
> Anyone have any RX 480 benches with the throttling disabled?


Check out benchmarks with the RX480 "uber" (+50% Power Limit, increased temperature limit) here:
http://www.hardware.fr/articles/951-30/overclocking-quelle-marge-rx-480.html


----------



## PyroTechNiK

Quote:


> Originally Posted by *Eorzean*
> 
> My GTX 1070 is on its way back to NE, was going to use the store credit and upgrade to a 1080, but now that Vulkan and its performance is real, and Nvidia and their async drivers are nowhere in sight, I'm going to wait till October and see what the 490 is all about. I'm not a fan of upgrading on a yearly basis, which is what Nvidia seemingly wants their customers to do.


I was going to go for a 1070 next month, but reading this thread has convinced me to wait for AMD's next offerings.


----------



## Rabit

Quote:


> Originally Posted by *rcfc89*
> 
> Nothing short of spectacular. I've had everything from 1080p 144hz / 1440p 144hz/ 4K 60hz. Nothing touches the feeling this monitor gives you. It's the ultimate gaming experience.


But still, VR is a better experience than any monitor.


----------



## flopper

Quote:


> Originally Posted by *Rabit*
> 
> But still VR is better experience than any monitor


When it's ready. It's not yet.


----------



## Shadowarez

Have you seen some of the vehicle games? They look like they're from the early-to-late '90s. Hope those aren't actual games, just one-hour coded test pieces to show they got it working in-engine. Stuff I've seen being released looked like Dragon's Lair from the arcade.


----------



## Blameless

Quote:


> Originally Posted by *SSBrain*
> 
> Check out benchmarks with the RX480 "uber" (+50% Power Limit, increased temperature limit) here:
> http://www.hardware.fr/articles/951-30/overclocking-quelle-marge-rx-480.html


Thanks.


----------



## junkman

Gonna take some screens of the GTX 980 TI vs RX 480 with the same settings/same locations. Any requests?


----------



## JackCY

Quote:


> Originally Posted by *Fancykiller65*
> 
> You know that Microsoft patched the same stuff into 8 and 7 right? On that note, some people don't want to learn the new OS.
> 
> Like me.


There is nothing new in Win 10 that differs majorly from previous Windows versions. The only real difference between OSes is when you switch to a *nix-based OS and need to do administrative stuff; then you are majorly prodded up your rear by the *nix, because you won't find a more unfriendly, chaotic and bug-ridden set of OS and apps to deal with. But for a normal user, gaming and whatnot, ha, it's all the same if the apps/games are supported on the OS.
I've toyed with Win 10 and most of the stuff still has its Win95- and NT-like equivalent windows and dialogs to find things in. Sure, it has even more Metro-like new widgets to set things with, but you don't have to use them; it's just easier to use them when you want to disable the $ware. And the same app to disable $ware on Win 10 works for 8.1 and 7.


----------



## looniam

So yeah, after moaning and complaining about Vulkan crashing on my green card, it seems I need to tone down the OC some.


----------



## Horsemama1956

Quote:


> Originally Posted by *magnek*
> 
> DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want need Vulkan to take off.


Has OpenGL taken off? Vulkan won't be used all that much more than regular OpenGL.


----------



## magnek

OpenGL had... issues, lots of issues. Vulkan is a clean slate and fresh new start, so I'm hopeful.


----------



## looniam

Quote:


> Originally Posted by *magnek*
> 
> OpenGL had... issues, lots of issues. Vulkan is a clean slate and fresh new start, so I'm hopeful.


come to the abyss:


----------



## Assirra

Besides a couple of minor issues, I don't mind the "abyss" for my personal PC though.
Work PC? Hell no; the fact that you have no permissions in some folders like Program Files kills that off completely.


----------



## Tgrove

Quote:


> Originally Posted by *magnek*
> 
> DX12 is completely irrelevant to me since I'll be stuck on Win7 for the foreseeable future. So yeah, definitely want need Vulkan to take off.


I'm in the same boat. I really want Vulkan to win so I can stay on W7. To the people questioning why: I've tried Win 8, 8.1, and 10 and hated them all.


----------



## Clocknut

It can be difficult. Sony isn't using Vulkan for the PS4, Apple isn't supporting it on macOS either, and Microsoft is using DirectX 12 on both PC and Xbox.


----------



## Samuris

Hi guys, I have a problem with Doom on Vulkan when Crossfire is enabled: I can't launch Doom with Crossfire enabled in the AMD settings. I have the latest AMD drivers, but nothing works. If I disable Crossfire in the general AMD settings it works, but then only my second card is running Doom.


----------



## Blameless

Quote:


> Originally Posted by *Samuris*
> 
> Hi guys, I have a problem with Doom on Vulkan when Crossfire is enabled: I can't launch Doom with Crossfire enabled in the AMD settings. I have the latest AMD drivers, but nothing works. If I disable Crossfire in the general AMD settings it works, but then only my second card is running Doom.


There is no Crossfire support for Doom.


----------



## Samuris

They said Crossfire works on Doom with Vulkan, *** ?


----------



## Wovermars1996

Quote:


> Originally Posted by *Samuris*
> 
> They said Crossfire works on Doom with Vulkan, *** ?


They said that SLI and Crossfire support would be added, but it's not in the game at the moment.


----------



## mouacyk

The Doom demo just got a Vulkan patch, if anyone wants to try it without buying the full game. My 980 Ti went from 165 fps to 161 fps at 2560x1080 Ultra. Yay...


----------



## dagget3450

Quote:


> Originally Posted by *mouacyk*
> 
> Doom demo now has a Vulkan patch just landed, if anyone wants to try without buying the full game. My 980 TI went from 165fps to 161fps at 2560x1080 Ultra. Yay...


Less fps? Is this a new patch or the one that has been out for a bit?

I almost bought a 980ti but backed out after thinking and looking over dx12 /vulkan performance. I did still want one for benching but ill just wait.


----------



## Wovermars1996

Quote:


> Originally Posted by *dagget3450*
> 
> Less fps? Is this a new patch or the one that has been out for a bit?
> 
> I almost bought a 980ti but backed out after thinking and looking over dx12 /vulkan performance. I did still want one for benching but ill just wait.


I don't think that there's been a patch since the Vulkan one.
If you have an Nvidia card I'd recommend you keep OpenGL on instead of Vulkan, since Maxwell lacks the async compute capabilities of AMD cards.


----------



## t1337dude

Quote:


> Originally Posted by *mouacyk*
> 
> Doom demo now has a Vulkan patch just landed, if anyone wants to try without buying the full game. My 980 TI went from 165fps to 161fps at 2560x1080 Ultra. Yay...


How exactly are you measuring this? Unless there's a benchmark in the demo I'm not quite sure how you compared the FPS.


----------



## ChevChelios

Pretty weird how the 1080 has gains from Vulkan, but the 1060 and 1070 don't.


----------



## SSBrain

Most of the gains are in CPU-limited scenarios.


----------



## NightAntilli

Quote:


> Originally Posted by *ChevChelios*
> 
> pretty weird how 1080 has gains from Vulkan, but 1060 & 1070 dont


It means the 1080 is CPU-limited under OpenGL and the 1070 and 1060 are not. Nothing weird about it. Bump the res to 4K and I bet no gains will be seen.
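The bottleneck logic can be sketched with a toy model (the millisecond costs below are made-up illustrations, not measured Doom numbers): a frame is ready only when the slower of CPU submission work and GPU rendering work finishes, so lowering CPU overhead only raises fps while the CPU is the wall.

```python
# Toy bottleneck model (assumed, illustrative millisecond costs):
# frame time is dominated by whichever of CPU or GPU work is slower.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# With a fast GPU at a modest resolution, the API's CPU overhead is the wall,
# so cheaper submission (Vulkan) lifts the cap:
opengl_fast_gpu = fps(cpu_ms=9.0, gpu_ms=6.0)   # CPU-bound
vulkan_fast_gpu = fps(cpu_ms=5.0, gpu_ms=6.0)   # now GPU-bound

# At 4K the GPU takes far longer per frame, so trimming CPU time changes nothing:
opengl_4k = fps(cpu_ms=9.0, gpu_ms=16.0)
vulkan_4k = fps(cpu_ms=5.0, gpu_ms=16.0)

assert vulkan_fast_gpu > opengl_fast_gpu
assert vulkan_4k == opengl_4k
```

This matches the pattern in the benchmarks: the bigger the GPU relative to the scene load, the more the API overhead matters.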


----------



## akaTRAP

Quote:


> Originally Posted by *mouacyk*
> 
> Doom demo now has a Vulkan patch just landed, if anyone wants to try without buying the full game. My 980 TI went from 165fps to 161fps at 2560x1080 Ultra. Yay...


I also lose frames on my GTX 860M, though my laptop can barely run the game anyway.







1366x768 on low just to hit 60fps.


----------



## mouacyk

Quote:


> Originally Posted by *t1337dude*
> 
> How exactly are you measuring this? Unless there's a benchmark in the demo I'm not quite sure how you compared the FPS.


I use a save game where I get consistent frame rates.


----------



## Mattbag

Hmmmm, buy Doom on PC for 30 bucks or borrow it from my brother on PS4.....


----------



## iRUSH

Quote:


> Originally Posted by *Mattbag*
> 
> Hmmmm, buy Doom on PC for 30 bucks or borrow it from my brother on PS4.....


Just borrow it for the PS4. I have both, and if you're in it just for the campaign then enjoy it for free.


----------



## mouacyk

Just to pat id on the back for a great Vulkan patch: where is it for $30? I was waiting for the patch and a sale.


----------



## f1LL

Quote:


> Originally Posted by *Mattbag*
> 
> Hmmmm, buy Doom on PC for 30 bucks or borrow it from my brother on PS4.....


Can you play it with keyboard and mouse on a PS4, or are you doomed to use a controller?


----------



## Mattbag

Quote:


> Originally Posted by *f1LL*
> 
> Can you play it with keyboard and mouse on a PS4 or are doomed to use a controller?


If I bought it on PC I would use my 360 controller anyway; keyboard and mouse is way too uncomfortable. I prefer to lay back and enjoy my games, not sit at a desk like I do half my day at work.

Saw it at both GameStop and Amazon for 30 bucks for the physical copy of the game; otherwise it's 60 for digital.


----------



## Mattbag

Quote:


> Originally Posted by *iRUSH*
> 
> Just borrow it for the ps4. I have both and if you're in it for just the campaign then just enjoy it for free.


I like the way you think


----------



## f1LL

Quote:


> Originally Posted by *Mattbag*
> 
> If I bought it on PC I would use my 360 controller anyway; keyboard and mouse is way too uncomfortable. I prefer to lay back and enjoy my games, not sit at a desk like I do half my day at work.
> 
> Saw it at both GameStop and Amazon for 30 bucks for the physical copy of the game; otherwise it's 60 for digital.


In that case I agree with iRUSH.

But still, COULD you attach mouse and keyboard to play on a PS4?


----------



## iRUSH

Quote:


> Originally Posted by *f1LL*
> 
> In that case agree with iRUSH.
> 
> But still, COULD you attach mouse and keyboard to play on a PS4?


I don't think M/KB support is available in most PS4 titles.


----------



## MrKoala

You can. The signal is converted to the controller's protocol.

http://www.eurogamer.net/articles/2015-07-27-ps4-keyboard-mouse-controller-replicates-pc-style-gaming

http://hackaday.com/2013/12/12/modifying-a-ps4-dualshock4-controller-to-use-a-mouse-and-keyboard/


----------



## Butthurt Beluga

I first bought Doom before the Vulkan patch and refunded it because I thought it was boring.
I downloaded the demo because I wanted to see my Fury X's performance with Vulkan (it was amazing) and to give the game another shot.

I don't like Doom 4. Horribly boring: you spam F the entire game, the guns feel lame and doinky, and there are too many cutscene animations.
I wanted to like the game, but I just couldn't.


----------



## ToTheSun!

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Don't like the Doom4. Horribly boring. Spam F entire game, guns feel lame and doinky, too many cutscene animations.
> I wanted to like the game, but I just couldn't.


F is the new metagame. Didn't you see BF1 Alpha?


----------



## t1337dude

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I first bought Doom before the Vulkan patch, refunded because I thought it was boring.
> Downloaded the demo because I wanted to see my Fury X performance with Vulkan (it was amazing) and to give the game another shot.
> 
> Don't like the Doom4. Horribly boring. Spam F entire game, guns feel lame and doinky, too many cutscene animations.
> I wanted to like the game, but I just couldn't.


Yeah, I'm having a hard time getting in line with the praise for this game. I love these types of games, but this one just doesn't seem worth full price (going by the demo, at least). Maybe when it's $10. I'm also currently in the middle of Shadow Warrior, and I'm enjoying that a bit more.


----------



## JasmineJas

I'm not sure how much better Vulkan will be in THIS game in particular; it already runs really well (for me anyway) and normally in the early days the performance increases aren't major. It'll be different in a few years when more games are built from the ground up to take advantage of Vulkan and it's been perfected a bit more.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *t1337dude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Butthurt Beluga*
> 
> I first bought Doom before the Vulkan patch, refunded because I thought it was boring.
> Downloaded the demo because I wanted to see my Fury X performance with Vulkan (it was amazing) and to give the game another shot.
> 
> Don't like the Doom4. Horribly boring. Spam F entire game, guns feel lame and doinky, too many cutscene animations.
> I wanted to like the game, but I just couldn't.
> 
> 
> 
> Yea, I'm having a hard time getting in line with the praise for this game. I love these types of games but this one is just not seeming worth the full price (or at least the demo). Maybe when it's $10. I'm also currently in the middle of Shadow Warrior, and I'm enjoying that a bit more.
Click to expand...

It is a bit ironic that DOOM basically follows the pattern laid out by Shadow Warrior. I think I played through that game twice (can't wait for Shadow Warrior 2), but I've played DOOM for almost 100 hours now, mostly practicing to get through an Ultra-Nightmare run, and I'm still a ways off from getting tired of it.
Out of my thousands of deaths I don't think more than a handful of them felt cheap. I screw up plenty, but I can always see what I did wrong and how to do better. Beating a level when you get no second chances is still a huge rush and it really feels like I've improved my playstyle by taking on the challenge.

Shadow Warrior is a good game, and maybe the gunplay is better, but the level design in DOOM is utterly excellent.
I keep comparing it with Metroid because it just feels like these environments have been crafted with a level of variety and sense of history that I haven't seen since Metroid Prime. I think it's safe to say that not a single room or hallway is a duplicate of any other.

If you haven't played it on Nightmare, I highly suggest going back and getting familiar with how the game plays when you're a couple of hits from death at any moment.


----------



## Rabit

Testing Doom with Vulkan on my cheap rig:
GPU: Club 3D Series 13 7790 1GB @ 1345/1700, ~1.290V (same chip as the R7 260X), ASIC 74.7%
CPU: X4 860K @ 4.4GHz, 1.495V

Recording decreases FPS by 10-20%.









Settings are for max detail with minimum VRAM consumption, but the game still consumes 1089MB.









https://www.youtube.com/watch?v=5Ivaz0IUXNo

My cheap CPU gives a laggy video, sorry.


----------



## iRUSH

Quote:


> Originally Posted by *Rabit*
> 
> Testing Doom with Vukan on my cheap rig
> GPU: Club 3d Series 13 7790 1GB @1345 ~1,290V / 1700 * ( this same chip have R7 260X) ** ASIC 74.7%
> CPU: X4 860K @ 4.4GHz 1,495V
> 
> Recoding decrease 10-20% FPS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Settings for max detail with min Vrma consumption, but still game consume 1089MB
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.youtube.com/watch?v=5Ivaz0IUXNo
> 
> My cheap cpu give laggy Video sorry


Awesome!

Try using OBS Studio and set it up to have the GPU do the leg work for the recording. If you have any questions about it just ask. I use it to record daily.


----------



## Rabit

Quote:


> Originally Posted by *iRUSH*
> 
> Awesome!
> 
> Try using OBS Studio and set it up to have the GPU do the leg work for the recording. If you have any questions about it just ask. I use it to record daily.


Recording: OBS VCE encoder (AMD GPU)

GPU: Asus R7 260X DC II 2GB @ 1100/1625 (default). If the core goes above 63°C it starts showing artifacts.







I have to bump the fan to keep it cool.

Doom Ultra detail preset 1080P

https://youtu.be/oYYan5aHeZE


----------



## iRUSH

Quote:


> Originally Posted by *Rabit*
> 
> Recording: OBS VCE Enconder * AMD GPU
> 
> GPU: Asus R7 260X DC II 2GB 1100/1625 * Deafault if more than 63C on core start show artefacts
> 
> 
> 
> 
> 
> 
> 
> I must bump fan to keep it cool
> 
> Doom Ultra detail preset 1080P
> 
> https://youtu.be/oYYan5aHeZE


That looks good to me!


----------



## figgie

Doom reminds me of the original run-and-gun Doom, Duke Nukem, and Serious Sam franchises, which said it best: all man, no cover.

Lots of monsters, no health regen, carry all weapons at the same time, and let me choose how to dispatch the Imps, Revenants, Hell Knights, and Barons of Hell.

I enjoyed Doom. It is not just run and gun; it is a ballet of bullets, rockets, and monsters.

I run Doom under an Asus Ares III.







Smooth as silk under OpenGL; now I'll have to try Vulkan!


----------



## Rabit

CPU: A10-7850K @ 4.2GHz, default voltages
GPU: Asus R7 260X DC @ 1210/1300. I lowered the memory clock and flashed a BIOS from an XFX R7 (1.35V for the memory modules) to solve the VRAM artifact issue, plus added cooling on the hottest modules.
RAM: 4GB 1866MHz CL10, single channel. To run with this amount of memory I need to turn off all background software and even close Windows Explorer.








SSD: Adata SP550 240GB
HDD: Samsung HD502HJ 500GB
Recording: OBS VCE encoder (AMD GPU)

https://www.youtube.com/watch?v=PozEZY0_rQY


----------



## prjindigo

Quote:


> Originally Posted by *ChevChelios*


We know the GTX 680/690 are garbage; it's why I gave one away to a friend who constantly gets hardware stolen, yet kept my 850MHz 3-slot 7950 cards...


----------



## davidelite10

Guess I'll have to give this a try when I get home.
I haven't played Doom 4 since I first beat it.

Would love to see how it runs since I upgraded my GTX 780s to GTX 1080s and from 1440p to 4k.

I might make a video for it.


----------



## ToTheSun!

Quote:


> Originally Posted by *davidelite10*
> 
> Would love to see how it runs


It set a new standard for optimization, IMO. Looks great and runs very smoothly, even on lower-tier hardware.


----------



## wholeeo

Was playing it upscaled to 4K with DSR and everything maxed last night on my 1080 and was amazed I was still getting over 100 fps. Vulkan voodoo.


----------



## davidelite10

Quote:


> Originally Posted by *ToTheSun!*
> 
> It set a new standard for optimization, IMO. Looks great and runs very smoothly, even on lower-tier hardware.


Agreed, was very impressed how my 780s held up against it at 1440p.
Would love to see this game fully maxed out at 4k tonight!!


----------



## plyr

My card whines on the game's loading screen (not the map loads), maybe due to the extremely high fps on that screen. Also, the game needs a multiplayer-only launcher.


----------



## ronnin426850

Just to chime in that this is the best FPS game in the entire long sad history of games. Bravo.


----------



## Mand12

Any word yet on multi-GPU support?


----------



## ronnin426850

An Athlon 750K @ 4.5GHz and an RX 470 OC give me 80-120 FPS at 1080p Ultra.


----------



## davidelite10

Forgot to try this last night; I was having some weird stuttering issues with my 1080s for some reason. It wasn't happening from Friday until yesterday, all on the same drivers.

Might have to look into better HDMI cables, and maybe a better SLI bridge; I'm currently using two ribbon cables.


----------



## vodkapl

Quote:


> Originally Posted by *Mand12*
> 
> Any word yet on multi-GPU support?


The first major update to Vulkan, "Vulkan Next", will bring better multi-GPU support. I don't know what that means for DOOM (I'm a dummy), but if the developers need to update DOOM to take advantage of the new functionality, and they do, then you'll get your wish granted.


----------



## davidelite10

UPDATE:

Vulkan crashed the game immediately and left a green overlay on my screen afterwards.
A restart was required to fix it.

SLI on or off made no difference.
OpenGL worked just fine for me, sadly.

Hmmm


----------



## moustang

Quote:


> Originally Posted by *plyr*
> 
> My card whines on the loading screen of the game, not the map, maybe due to extreme high fps on that screen, also, the game need a multiplayer-only launcher.


Many cards do that when frame rates suddenly jump well above 200fps. It's almost always limited to menus, although you can get it in some older games if you play with vsync disabled.
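A rough sketch of why menus are the usual culprit (a generic render-loop model, not id's actual code): a menu frame costs almost nothing to draw, so without a cap the loop spins at whatever rate the hardware allows, and coil whine tracks that rate. Any limiter that enforces a minimum per-frame time bounds it.

```python
# Simulated render loop on a virtual clock (illustrative numbers only).
def frames_rendered(frame_cost_ms, duration_ms, cap_hz=None):
    """frame_cost_ms: how long one frame takes to draw.
    A cap forces each frame to consume at least 1000/cap_hz ms
    (the limiter sleeps away the difference)."""
    budget_ms = 1000.0 / cap_hz if cap_hz else 0.0
    per_frame_ms = max(frame_cost_ms, budget_ms)
    return int(duration_ms / per_frame_ms)

# A menu frame that costs 1 ms renders at ~1000 fps uncapped, which is
# where the coil whine shows up...
uncapped = frames_rendered(1, 1000)            # frames drawn in one second
# ...while a 125 fps cap keeps the loop from spinning flat out.
capped = frames_rendered(1, 1000, cap_hz=125)

assert uncapped == 1000
assert capped == 125
```

Vsync is effectively one such limiter (budget tied to the refresh rate), which is why the whine often disappears the moment it's enabled.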


----------



## plyr

Quote:


> Originally Posted by *moustang*
> 
> Many cards do that when frame rates suddenly jump well above 200fps. It's almost always limited to menus, although you can get that in some older games if you play with vsynch disabled.


On the CS:GO loading screen the card jumps to 900fps and there's no whine at all; it's only on Doom's initial loading screen.


----------



## Blameless

Can't even play the demo without Steam?


----------



## Catscratch

Quote:


> Originally Posted by *ronnin426850*
> 
> Athlon 750K @4.5Ghz and Rx470 OC on 1080p Ultra give me 80-120 FPS


Yeah, DX12 and Vulkan put the work back on the GPU. I hope single-threaded games die out with this era.


----------



## TFL Replica

Tried the demo on my 970

Vulkan: very long loading times and severe micro stuttering

OpenGL: very short loading times and perfect silky smooth performance

Not sure if there's a fix for the Vulkan issues, but it's unplayable for me.


----------



## ToTheSun!

Quote:


> Originally Posted by *TFL Replica*
> 
> Tried the demo on my 970
> 
> Vulkan: very long loading times and severe micro stuttering
> OpenGL: very short loading times and perfect silky smooth performance
> 
> Not sure if there's a fix for the Vulkan issues, but it's unplayable for me.


Sounds more like 970 issues (or the driver version, anyway). Vulkan Doom runs flawlessly for me. If only all games ran this smoothly and optimally.


----------



## Robenger

Quote:


> Originally Posted by *ToTheSun!*
> 
> Sounds more like 970 issues. Vulkan Doom runs flawlessly for me.


I've also seen a few texture loading issues on 970s. There is a delay in the LOD system where it doesn't load the new up-close texture fast enough, so for a few seconds you end up staring at a distant-LOD texture that looks good from far, but far from good.


----------



## huzzug

Quote:


> Originally Posted by *Robenger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> Sounds more like 970 issues. Vulkan Doom runs flawlessly for me.
> 
> 
> 
> I've also have seen a few texture loading issues on the 970's. There is a delay with the LOD system where they don't load the new up close texture fast enough so you end up staring at some LOD texture that looks good from far, but far from good a for a few seconds.
Click to expand...

Could this be due to the VRAM limitation of the 970? Also, does the other Vulkan game (Talos) show similar symptoms, and does DX12 have similar problems with the 970?


----------



## TFL Replica

Quote:


> Originally Posted by *ToTheSun!*
> 
> Sounds more like 970 issues (or drivers version, anyway). Vulkan Doom runs flawlessly for me. If only all games ran this smoothly and optimally.


Edit: You were absolutely right. Updated from 369.09 to 372.70. That fixed the stuttering and the long loading times. Input lag still feels slightly lower on OpenGL, but at least Vulkan mode is playable now.


----------



## ronnin426850

Quote:


> Originally Posted by *TFL Replica*
> 
> Edit: You were absolutely right. Updated from 369.09 to 372.70. That fixed the stuttering and the long loading times. Input lag still feels slightly lower on OpenGL, but at least Vulkan mode is playable now.


Input lag? Do you have vsync on? Definitely disable that, shooter with input lag is a joke.


----------



## TFL Replica

Quote:


> Originally Posted by *ronnin426850*
> 
> Input lag? Do you have vsync on? Definitely disable that, shooter with input lag is a joke.


Yep, vsync was on. With vsync off, the input lag is pretty much gone, and the framerates are very similar (~120). Wish Doom had a built-in benchmark tool. With vsync on, OpenGL definitely feels smoother than Vulkan. Guess I'll test Vulkan with vsync off for now.


----------



## Newbie2009

Quote:


> Originally Posted by *TFL Replica*
> 
> Yep, vsync was on. With vsync off, the input lag is pretty much gone, and the framerates are very similar (~120). Wish Doom had a built-in benchmark tool. With vsync on, OpenGL definitely feels smoother than Vulkan. Guess I'll test Vulkan with vsync off for now.


One thing on Vulkan: I use adaptive vsync. It shows fps in the mid 70s on a 60Hz monitor, but there's no tearing and it runs like butter; it's probably just reporting fps wrong. With regular vsync on, it stutters.


----------



## ToTheSun!

Quote:


> Originally Posted by *TFL Replica*
> 
> Wish Doom had a built-in benchmark tool.


Well, it has a performance monitor. And it's pretty comprehensive. Makes for a decent "eyeball" bench.


----------



## Rabit

I tried Doom on a G645 @ 3045MHz with 8GB DDR3 CL8 1640MHz and an R7 260X 2GB @ 1210/1350MHz.

The game ran from a minimum of 19 FPS up to the 60 FPS cap in the menus and some areas; the average was 25-30 on Ultra settings. But because the CPU was too weak, OBS couldn't record video during gameplay, even on its lowest settings ;(

The first thing I hit was a game freeze.

To run the new Doom on the Vulkan API with a dual-core CPU, you need to add this command in Steam's advanced launch options: +jobs_numthreads 2

Fixing the freeze in Doom's Vulkan API on a dual-core CPU:

https://www.youtube.com/watch?v=9RianeXtj54


----------



## artemis2307

Vulkan even runs smoother and at higher FPS than OpenGL on my GTX 670.


----------



## TopicClocker

Vulkan in Doom is one of the best implementations of a low-level API I've ever seen in a PC game.

I saw a video yesterday on YouTube of Doom running on a system with dual Xeon E5-2670 CPUs; these are 8-core, 16-thread Sandy Bridge-EP processors with a base clock of 2.6GHz.

The performance gains under the Vulkan API are absolutely incredible; the GTX 1080 is utilized fairly well by the dual CPUs throughout most of the video.

If we can get gains similar to this from Vulkan or DX12 in the future, it will be amazing!


----------



## sixor

How do you take screenshots with Vulkan on AMD?

Steam doesn't work, nor do Fraps or Mirillis, lol.


----------



## 12Cores

I hope all future game engines run on Vulkan. I am running Doom with VSR at a resolution of 3200x1800, which is ridiculous for a game that throws so much at you. Vulkan should be the future; so far it has shown a lot more promise than DirectX 12.


----------



## Newbie2009

Quote:


> Originally Posted by *sixor*
> 
> how to take screenshots with vulkan on amd??????????
> 
> steam does not work, fraps, mirillis, lol


Steam should work fine.


----------



## Mand12

Any word on whether any API is going to support multi-GPU setups ever?


----------



## Rabit

Quote:


> Originally Posted by *Mand12*
> 
> Any word on whether any API is going to support multi-GPU setups ever?


DX12 supports it, but only one game uses it so far: Ashes of the Singularity.
https://www.youtube.com/watch?v=XrpTwUJTVCQ


----------



## gertruude

Quote:


> Originally Posted by *Newbie2009*
> 
> Steam should work fine.


Doom seems to crash when using F12 for Steam screenshots on Vulkan.

It's OK on OpenGL.


----------



## Newbie2009

Quote:


> Originally Posted by *gertruude*
> 
> doom seems to crash when using f12 for steam screenshots on vulkan
> 
> its ok on opengl


Reinstall. Works fine for me.


----------



## gertruude

Quote:


> Originally Posted by *Newbie2009*
> 
> Reinstall. Works fine for me.


I only re-downloaded it yesterday.









It seems a lot of other users are having the same problems.

Even the Vulkan support FAQ says:

While running the Vulkan API, DOOM crashes when taking a screenshot using the Steam overlay on AMD GPUs.
This is a known issue with the Vulkan API and AMD GPUs.


----------



## Newbie2009

Quote:


> Originally Posted by *gertruude*
> 
> only re downloaded it again yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> seems alot of other users having same problems
> 
> even on the vulkan support faq it says:
> 
> While running the Vulkan API, DOOM crashes when taking a screenshot using the Steam overlay on AMD GPUs.
> This is a known issue with Vulkan API and AMD GPUs


Oh, I guess I am a lucky one then.


----------



## gertruude

Quote:


> Originally Posted by *Newbie2009*
> 
> Oh, I guess I am a lucky one then.


it seems so









I'll have a play around tomorrow and see if I can get it to work.


----------



## Mand12

Quote:


> Originally Posted by *Rabit*
> 
> DX12 support
> But only one game support it Ashes of the singularity
> https://www.youtube.com/watch?v=XrpTwUJTVCQ


I mean Doom, specifically. And not DX12's mixed-GPU support; I mean SLI/Crossfire. Doom doesn't support it, to my knowledge, and I was hoping there might be an update I missed.


----------



## NikolayNeykov

Is it worth running Vulkan with my 980 Ti G1 and 6700K? Does it give better visual quality than OpenGL?


----------



## mouacyk

Quote:


> Originally Posted by *NikolayNeykov*
> 
> Is it needed to run Vulkan with my 980ti G1 and 6700k? Do it gives better visual quality then OpenGL?


It should give you more consistent frame times, if anything.


----------



## ronnin426850

Quote:


> Originally Posted by *NikolayNeykov*
> 
> Is it needed to run Vulkan with my 980ti G1 and 6700k? Do it gives better visual quality then OpenGL?


Visual quality should theoretically be identical.


----------



## Zam15

Does Vulkan tank FPS for anyone else? I'm running SLI 980s and it seems to drop my FPS to 40-50 at 4K, while OpenGL keeps it above 60. Identical settings.


----------



## mouacyk

Quote:


> Originally Posted by *Zam15*
> 
> 
> Does Vulkan tank their FPS for anyone else? Running SLI 980s and its seems to drop my FPS to 40/50 at 4K while OGL keeps it above 60.. Identical settings.


You should try running Doom with Vulkan in a window and keep GPU-Z open to see whether both GPUs are being used. Based on your results, it seems like SLI is either broken or simply not working in Vulkan yet.


----------



## Diablosbud

Quote:


> Originally Posted by *Zam15*
> 
> 
> Does Vulkan tank their FPS for anyone else? Running SLI 980s and its seems to drop my FPS to 40/50 at 4K while OGL keeps it above 60.. Identical settings.


My friend and I tried Vulkan in DOOM on his 770 SLI setup and the FPS was tanking. We turned on the visual SLI indicator in the NVIDIA Control Panel and it didn't come up with Vulkan; when we switched back to OpenGL it did show up. So I don't think DOOM's Vulkan path has SLI support yet (at least for older GPUs, it seems).


----------



## Newbie2009

it doesn't


----------



## Zam15

Quote:


> Originally Posted by *mouacyk*
> 
> You should try running Doom with Vulkan in a window and have GPUz open to see if both GPUs are used. Based on your results, it seems like SLI is either broken or not working in Vulkan yet.


Quote:


> Originally Posted by *Diablosbud*
> 
> My friend and I tried Vulkan on DOOM on his 770 SLI and the FPS was tanking. We turned on the visual SLI indicator in the NVIDIA Control Panel and it didn't come up with Vulkan. Then when we switched it back to OpenGL it did show up. So I don't think DOOM Vulkan has SLI support yet (at least for older GPUs it seems).


Quote:


> Originally Posted by *Newbie2009*
> 
> it doesn't


Thanks all. I had a feeling that was the case. Looks like I'll be sticking to OGL for a bit.


----------



## Dimaggio1103

Vulkan is a new API; no, it does not affect visual fidelity compared to OGL. Tons of things don't work, like FPS counters and in-game monitoring software, because they have to adapt to the new API. That shouldn't be too hard; it just needs time. FPS will typically shoot up on non-SLI setups, but other than that, not much benefit.


----------



## ronnin426850

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Vulkan is a new API, no it does not affect visual fidelity from OGL to Vulkan. Tons of things don't work like FPS counters and monitoring software in game as they have to adapt to the new API code. Should not be to hard just needs time. FPS will shoot up typically on non SLI setups, but other than that, not much benefit.


Good thing DOOM has its own performance metrics, fps counter included, that work with every API.


----------



## Sub-Zero378

When using Vulkan with an Nvidia GPU, does anybody else get lines through distant terrain?


----------



## Prophet4NO1

Vulkan is a no-go on my rig in triple screen. For whatever reason the left screen stays black, but the other two work and the image is laid out like it should be. Go back to OpenGL and it's fine.


----------



## TheReciever

I'm really surprised by Vulkan. It allowed me to play at 1920x1080 on medium settings.

DX12, on the other hand, crashes BF1, though I'm not sure my card officially supports DX12 (feature level 11_1).


----------



## Dimaggio1103

Quote:


> Originally Posted by *ronnin426850*
> 
> Good thing DOOM has its own performance metrics, fps counter included, that work with every API


For sure. I'm loving AMD lately, but it's a huge oversight that ReLive does not work with Vulkan.


----------



## ronnin426850

Have you guys found this?


----------



## Prophet4NO1

found that on my first play through. lol


----------



## ToTheSun!

Quote:


> Originally Posted by *ronnin426850*
> 
> Have you guys found this?


Everyone doing achievements would have to have found that. And that's a lot of people.

edit: actually, it's not part of any achievement, but it's still fairly easy to find if you play any game like an RPG


----------



## ronnin426850

Quote:


> Originally Posted by *ToTheSun!*
> 
> Everyone doing achievements would have to have found that. And that's a lot of people.
> 
> edit: actually, it's not part of any achievement, but it's still fairly easy to find if you play any game like an RPG


I didn't find it on my first playthrough, since it's not marked on the map and is hidden right next to another secret.


----------



## ToTheSun!

Quote:


> Originally Posted by *ronnin426850*
> 
> I didn't find it on my first playthrough, since it's not marked on the map, and is hidden right next to another secret


That's why I play any game like an RPG! I assume every single nook and cranny has a secret.


----------



## JackCY

Quote:


> Originally Posted by *ToTheSun!*
> 
> That's why i play any game like an RPG! I assume every single nook and cranny has a secret.


Too bad enemies didn't use boobytraps.


----------



## ronnin426850

The Arcade mode is insanely fun! Just tried it for the first time.


----------



## paralemptor

I just reinstalled the game to experiment with Vulkan and probably play it from start to finish, but...

...the mouse latency with vsync enabled is making it unplayable. When forcing 60 or 59 fps with RivaTuner's frame limiter and disabling vsync it's almost alright, but there is significant screen tearing in the middle of the screen.

Any ideas? I'm using a Radeon R9 290 4GB.


----------



## Boinz

Quote:


> Originally Posted by *paralemptor*
> 
> I just reinstalled the game to experiment with Vulkan and probably play the game from start to finish, but...
> 
> ...the mouse latency with vsync enabled is making it unplayable. When forcing 60 or 59 fps with RivaTuner's frame limiter and disabling vsync it's almost alright, but there is significant screen tearing in the middle of the screen.
> 
> Any ideas? I'm using Radeon R9 290 4 GB.


adaptive vsync?


----------



## ronnin426850

Quote:


> Originally Posted by *paralemptor*
> 
> I just reinstalled the game to experiment with Vulkan and probably play the game from start to finish, but...
> 
> ...the mouse latency with vsync enabled is making it unplayable. When forcing 60 or 59 fps with RivaTuner's frame limiter and disabling vsync it's almost alright, but there is significant screen tearing in the middle of the screen.
> 
> Any ideas? I'm using Radeon R9 290 4 GB.


Vsync always results in input latency, unfortunately. The only thing you can do is make sure you're in fullscreen mode, not borderless, but I don't know how much of an effect that will have.

What I do is keep it uncapped and play on a 75Hz monitor. There's still some tearing, but the frames are so close together that it makes almost no difference.


----------



## Mand12

Quote:


> Originally Posted by *paralemptor*
> 
> I just reinstalled the game to experiment with Vulkan and probably play the game from start to finish, but...
> 
> ...the mouse latency with v-sync enabled is making it unplayable. When forcing 60 or 59 fps with Riva Tuner's frame limiter and disabling vsync it's almost alright, but there is significant screen tearing in the middle of the screen.
> 
> Any ideas? I'm using Radeon R9 290 4 GB.


Variable refresh displays. Your GPU is FreeSync compatible, so if you get a FreeSync display you can get rid of tearing while avoiding vsync's added input latency.

Otherwise, the choice is simple: latency or tearing. Pick one.


----------



## philhalo66

i must be unlucky, i don't see any FPS gain with vulkan at all. Even with OpenGL i get around 230 average, so it's not a big deal, but all the hype surrounding vulkan had me expecting a big jump.


----------



## budgetgamer120

Quote:


> Originally Posted by *philhalo66*
> 
> i must be unlucky, i don't see any FPS gain with vulkan at all. Even with OpenGL i get around 230 average, so it's not a big deal, but all the hype surrounding vulkan had me expecting a big jump.


Can your CPU handle more than 230fps?

Also Nvidia GPUs dont get as big a jump as AMD.


----------



## philhalo66

Quote:


> Originally Posted by *budgetgamer120*
> 
> Can your CPU handle more than 230fps?
> 
> Also Nvidia GPUs dont get as big a jump as AMD.


yeah, cpu load is around 60-70% on all 4 cores and gpu load is pegged at 99-100%. I guess that explains why i don't see any gains. pretty disappointing.


----------



## Mand12

Quote:


> Originally Posted by *philhalo66*
> 
> yeah, cpu load is around 60-70% on all 4 cores and gpu load is pegged at 99-100%. I guess that explains why i don't see any gains. pretty disappointing.


But that's to be expected, from what I understand. The whole point of the "low-level API" push was to improve performance in CPU-bound scenarios, typically when the CPU is underpowered for the workload. If you were GPU-limited, the results were nearly identical.
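That reasoning can be boiled down to a rule of thumb: pegged GPU means a lower-overhead API has nothing to give you; a pegged core with GPU headroom means the submission path is the limit. A toy Python function encoding it (thresholds are arbitrary illustrations, not from any real profiler):

```python
def likely_bottleneck(cpu_core_loads, gpu_load):
    """Rough heuristic for reading utilization numbers.
    If the GPU is already pegged, a lower-overhead API like Vulkan
    can't add frames; if a CPU core is pegged while the GPU has
    headroom, the driver/submission thread is the likely limit."""
    if gpu_load >= 95:
        return "GPU-bound"
    if max(cpu_core_loads) >= 90:
        return "CPU-bound"
    return "unclear"

# philhalo66's numbers above: ~60-70% on all four cores, GPU at 99-100%
print(likely_bottleneck([60, 65, 70, 65], 99))  # GPU-bound: no Vulkan gain expected
```

That's exactly the situation described above: a fully loaded GPU leaves no room for API overhead savings to show up as extra frames.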


----------



## philhalo66

Quote:


> Originally Posted by *Mand12*
> 
> But that's to be expected, from what I understand. The whole point of the "low-level API" push was to improve performance in CPU-bound scenarios, typically when the CPU is underpowered for the workload. If you were GPU-limited, the results were nearly identical.


It's kinda strange. My friend has a 3570K at 4.5 GHz (the most he can get) and he saw nearly double the fps in Vulkan at 1440p, going from 60 fps to about 100. He has a 1070 as well.


----------



## paralemptor

Quote:


> Originally Posted by *Mand12*
> 
> Variable refresh displays. Your GPU is FreeSync compatible, so if you get a FreeSync display you can get rid of tearing while avoiding vsync's added input latency.
> 
> Otherwise, the choice is simple: latency, tearing. Pick one.


Yeah, I know, but it just feels like double buffering; I haven't seen such huge latency in years. Acquiring a FreeSync display is not planned at the moment.


----------



## XHellAngelX

GameGPU rebench with new Driver


----------



## Hequaqua

Here is my 1060 in both APIs, same settings, 1080p:

Oh...wanted to add....GTX1060 stock settings +116Power limit/[email protected]

5 Minute capture:
NVIDIA GeForce GTX 1060 6GB, DOOMx64vk
Minimum: 43.57
Maximum: 293.77
Average: 120.8

NVIDIA GeForce GTX 1060 6GB, DOOMx64
Minimum: 37.45
Maximum: 375.66
Average: 121.11



1 minute capture:
I didn't save the min/avg/max results, just the graph at 1440p:



Still learning the software that does the capturing.


----------



## Mand12

So is there _any_ API that will do SLI for Doom? At all?


----------



## jmcosta

i don't think this game has proper support for multi-GPU.
you can try the SLI compatibility bits; they increase the frame rate by at least 30%, but that might add some stutter..


----------



## ronnin426850

Quote:


> Originally Posted by *Mand12*
> 
> So is there _any_ API that will do SLI for Doom? At all?


Quote:


> Originally Posted by *Diablosbud*
> 
> My friend and I tried Vulkan on DOOM on his 770 SLI and the FPS was tanking. We turned on the visual SLI indicator in the NVIDIA Control Panel and it didn't come up with Vulkan. Then when we switched it back to OpenGL it did show up. So I don't think DOOM Vulkan has SLI support yet (at least for older GPUs it seems).


Quote:


> Originally Posted by *Zam15*
> 
> Thanks all. I had a feeling that was the case.. Looks like I'll be sticking to OGL for a bit.


Looks like OpenGL does SLI.


----------



## Arthedes

Quote:


> Originally Posted by *Newbie2009*
> 
> Comparison: I picked this part as ultra shadows kills performance so this is a minimum fps example. 290X single, stock. Oh, these are 1600p, not 1080p. Crossfire doesn't work. (but don't need now)


What monitoring program is that?


----------



## budgetgamer120

Quote:


> Originally Posted by *Arthedes*
> 
> What monitoring program is that?


That monitoring overlay is built into Doom.


----------



## mouacyk

Just completed the campaign on Ultra-Violence, except for the final boss, which I did on Hurt Me Plenty. Doom plays excellently at 120fps with Ultra Low Motion Blur (ULMB). Highly recommended to play this way.


----------



## ronnin426850

Quote:


> Originally Posted by *mouacyk*
> 
> Just completed the campaign on Ultra-Violence, except for the final boss, which I did on Hurt Me Plenty. Doom plays excellently at 120fps with Ultra Low Motion Blur (ULMB). Highly recommended to play this way.


If you managed to complete the campaign on UV, I'm pretty sure you can take the final boss as well. I finished it all on UV recently, and the final boss was among the easiest parts. As long as you save up your BFG shots, hit him with the Gauss, and avoid his attacks, it's pretty easy. Way easier than the 3 barons and 4 revenants you face at the same time in the previous level.


----------



## philhalo66

Quote:


> Originally Posted by *ronnin426850*
> 
> If you managed to complete the campaign on UV, I'm pretty sure you can take the final boss as well. I finished it all on UV recently, and the final boss was among the easiest parts. As long as you save up your BFG shots, hit him with the Gauss, and avoid his attacks, it's pretty easy. Way easier than the 3 barons and 4 revenants you face at the same time in the previous level.


the hell guards are the hardest part on UV i think


----------



## ronnin426850

Quote:


> Originally Posted by *philhalo66*
> 
> the hell guards are the hardest part on UV i think


I think they are mostly a problem on the first playthrough, when you don't know they're coming, and you waste your ammo on the guy before.


----------



## Dhoulmagus

This game runs so unbelievably well on aging hardware. Locked in at 60FPS on my big 1080p screen @ ultra settings with a 280X on the Vulkan API (so far at least, I just got the game). Wow. It would be a great service to planet earth if they released the source code to this game/id Tech 6, but I guess we'll have to wait a decade, or maybe never without Carmack.

Hopefully Creative doesn't have a patent on any of the code this time


----------



## ToTheSun!

Quote:


> Originally Posted by *ronnin426850*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mouacyk*
> 
> Just completed the campaign on Ultra-Violence, except for the final boss, which I did on Hurt Me Plenty. Doom plays excellently at 120fps with Ultra Low Motion Blur (ULMB). Highly recommended to play this way.
> 
> 
> 
> If you managed to complete the campaign on UV, I'm pretty sure you can take the final boss as well. I finished it all on UV recently, and the final boss was among the easiest parts. As long as you save up your BFG shots, hit him with the Gauss, and avoid his attacks, it's pretty easy. Way easier than the 3 barons and 4 revenants you face at the same time in the previous level.

I think the game stops being hard after you get Rich Get Richer. The hardest part, then, becomes finding a good vantage point from which to spam the Gauss Cannon.

Either way, after spending most of the game conserving ammo, it's deeply satisfying to just pulverize everything carelessly. I think id struck a good balance between how easy and satisfying it is with how late in the game you get it.


----------



## ronnin426850

Quote:


> Originally Posted by *Serious_Don*
> 
> This game runs so unbelievably well on aging hardware. Locked in at 60FPS on my big 1080p screen @ ultra settings with a 280X on the Vulkan API (so far at least, I just got the game). Wow. It would be a great service to planet earth if they released the source code to this game/id Tech 6, but I guess we'll have to wait a decade, or maybe never without Carmack.
> 
> Hopefully Creative doesn't have a patent on any of the code this time


Find OzTalksHw's video about "AMD FineWine technology" on YouTube, and you'll know why. The 280X is maybe the best-aging GPU in history.


----------



## philhalo66

Quote:


> Originally Posted by *ronnin426850*
> 
> Find OzTalksHw's video about "AMD FineWine technology" on YouTube, and you'll know why. The 280X is maybe the best-aging GPU in history.


I don't know about that; the 8800GT and 8800GTX give it a run for its money. They lasted for a very long time.


----------



## Hueristic

Quote:


> Originally Posted by *philhalo66*
> 
> I don't know about that; the 8800GT and 8800GTX give it a run for its money. They lasted for a very long time.


I am STILL using an EVGA 8800GS for my third monitor!


----------



## ronnin426850

Quote:


> Originally Posted by *philhalo66*
> 
> I don't know about that; the 8800GT and 8800GTX give it a run for its money. They lasted for a very long time.


Come on, I own an 8800 GTX, it sits right here on my desk, and it does NOT "give it a run for its money"







Not even close! Especially in newer games.


----------



## philhalo66

Quote:


> Originally Posted by *ronnin426850*
> 
> Come on, I own an 8800 GTX, it sits right here on my desk, and it does NOT "give it a run for its money"
> 
> 
> 
> 
> 
> 
> 
> Not even close! Especially in newer games.


did you even read what i said? i never once said it was comparable in performance. i said it lasted a very long time. that's pretty much what you were saying as far as aging well goes, right?


----------



## ronnin426850

Quote:


> Originally Posted by *philhalo66*
> 
> did you even read what i said? i never once said it was comparable in performance. i said it lasted a very long time. that's pretty much what you were saying as far as aging well goes, right?


Ah, ok then. I did read what you said, but i misunderstood what you meant, because "gives it a run for its money" can be interpreted in at least 2 different ways.


----------



## philhalo66

Quote:


> Originally Posted by *ronnin426850*
> 
> Ah, ok then. I did read what you said, but i misunderstood what you meant, because "gives it a run for its money" can be interpreted in at least 2 different ways.


eh, fair enough, it was worded poorly. but you have to admit the geforce 88xx series is legendary for being the first unified-shader-architecture graphics card ever. In a lot of ways it changed everything we knew about GPUs, and graphics for that matter. It wasn't until like 2012-2013 that it couldn't play most games anymore. that's why i say the 8800 GTX gives it a run for its money in terms of aging well. Even then i'd say the GTX 580 is a close second, considering it just about matches the performance of a 280X and it's been 7 years since November.


----------



## ronnin426850

Quote:


> Originally Posted by *philhalo66*
> 
> eh, fair enough, it was worded poorly. but you have to admit the geforce 88xx series is legendary for being the first unified-shader-architecture graphics card ever. In a lot of ways it changed everything we knew about GPUs, and graphics for that matter. It wasn't until like 2012-2013 that it couldn't play most games anymore. that's why i say the 8800 GTX gives it a run for its money in terms of aging well. Even then i'd say the GTX 580 is a close second, considering it just about matches the performance of a 280X and it's been 7 years since November.


I've owned 5 different 8800 variants, loved them all


----------



## Artikbot

I loved my 8800GTs, even though SLI was a pain in the posterior orifice.


----------



## Mahigan

There is no AMD "FineWine" technology. That's just a simplistic way of describing circumstances and architectural choices and how the two intersect.

1. Games: Developer relations with AMD have gotten better... in fact they're surpassing nVIDIA's developer relations, thanks in large part to the Sony and Microsoft consoles. The Nintendo consoles have had no impact and will continue to have zero impact, so nVIDIA winning that new Nintendo console contract will not change this. Games are, for the most part, being tailored towards GCN. nVIDIA have taken notice of this and have begun, since the GTX 900 series, to move their architecture in a direction more similar to AMD's (SIMD-organization wise).

2. CPU overhead: AMD's drivers are not multi-threaded for the most part. AMD have, as of late, begun optimizing their drivers for multi-core on a title-per-title basis (hence the recent gains), but this has still not alleviated the CPU overhead. AMD's older GCN cards sit there idling, far too often, waiting for the CPU to feed them work. As CPUs become more powerful (single-threaded wise), AMD's GPUs have seen an increase while nVIDIA GPUs have not (for the most part). Therefore over time, as single-threaded performance has risen a tad, AMD GCN GPUs have gained performance relative to their nVIDIA counterparts.

3. More parallel architecture: As compute jobs take more and more pipeline time, as games evolve, AMD GPUs tend to be better equipped (architecturally) to tackle all of this extra compute work. Asynchronous Compute + Graphics is also helping in that dept.

Therefore the AMD "FineWine" technology, as it is called, is nothing more than a set of circumstances which have, since the end of 2015, begun to favor AMD tech. Of course AMD haven't released newer high-end cards (aside from Fiji) to truly take advantage of the situation. However, we're now seeing so many AMD-optimized titles (and so few nVIDIA-optimized titles) that Vega will likely beat the GTX 1080 Ti and maybe even the Titan X (Pascal) in certain situations (games optimized for AMD). In other titles, AMD's Vega will likely be a match for a GTX 1080. What you have to understand is that AMD's Vega parts were downclocked during their recent showings (like their RyZen CPUs). AMD won't reveal the true performance until launch. Why? Because they've had bad experiences in the past where they reveal the performance of a card and a competitor comes out with an "Ultra" or "Ti" edition in order to retake the crown and spoil AMD's launch (yes, nVIDIA love to spoil AMD's launches). So we got a 3.15 (3.4) GHz RyZen and an ES Vega on ES drivers. What we saw performance-wise... will only be higher at launch.

Either way... things are about to get more competitive on both the GPU and CPU front thanks to AMD


----------



## ronnin426850

Quote:


> Originally Posted by *Mahigan*
> 
> There is no AMD "FineWine" technology. That's just a simplistic way of describing circumstances and architectural choices and how the two intersect.
> 
> 1. Games: Developer relations with AMD have gotten better... in fact they're surpassing nVIDIA's developer relations thanks in large part to the Sony and Microsoft consoles. The Nintendo consoles have had no impact and will continue to have 0 impact therefore nVIDIA gaining that new Nintendo console win will not change this. Games are, for the most part, being tailored towards GCN. nVIDIA have taken notice of this and have begun, since the GTX 900 series, to move their architecture into a more similar direction to AMDs (SIMD organization wise).
> 
> 2. CPU overhead: AMDs drivers are not multi-threaded for the most part. AMD have, as of late, begun optimizing their drivers for multi-core on a title per title basis (hence recent gains) but this has still not alleviated the CPU overhead. AMDs older GCN cards sit there idling, far too often, waiting for the CPU to feed them info. As CPUs become more powerful (single threaded wise) AMDs GPUs have seen an increase while nVIDIA GPUs have not (for the most part). Therefore in time, as single threaded performance has risen a tad, AMD GCN GPUs have gained performance relative to their nVIDIA counterparts.
> 
> 3. More parallel architecture: As compute jobs take more and more pipeline time, as games evolve, AMD GPUs tend to be better equipped (architecturally) to tackle all of this extra compute work. Asynchronous Compute + Graphics is also helping in that dept.
> 
> Therefore the AMD "FineWine" technology, as it is called, is nothing more than a matter of circumstances which have, since end of 2015, begun to favor AMD tech. Of course AMD haven't released newer high end cards (aside from Fiji) to truly take advantage of the situation. However, now we're seeing so many AMD optimized titles (and so few nVIDIA optimized titles) that Vega will likely beat the GTX 1080 Ti and maybe even the Titan X (Pascal) under certain situations (games optimized for AMD). In other titles, AMDs Vega will likely be a match for a GTX 1080. What you have to understand is that AMDs Vega parts were downclocked during their recent showings (like their RyZen CPUs). AMD won't reveal the true performance until launch. Why? Because they've had some bad experiences in the past where they reveal the performance of a card and a competitor comes with an "Ultra" or "Ti" edition in order to retake the crown and spoil AMDs launch (yes nVIDIA love to spoil AMDs launches). So we got a 3.15 (3.4) GHz RyZen and an ES Vega on ES drivers. What we saw performance wise... will only be higher at launch.
> 
> Either way... things are about to get more competitive on both the GPU and CPU front thanks to AMD


----------



## budgetgamer120

Quote:


> Originally Posted by *philhalo66*
> 
> eh fair enough it was worded poorly. but you have to admit the geforce 88xx series is legendary for being The first unified shader architecture graphics card ever, In alot of ways it changed everything we knew about GPU's and graphics for that matter. It wasn't until like 2012-2013 when it wasn't able to play most games anymore. thats why i say the 8800 GTX gives it a run for it's money in terms of aging well. Even then id say the GTX 580 is a close second considering it just about matches the performance of a 280x and its 7 years old since november.


The 8800 didn't last as long as the 7970; it's been over 5 years since the 7970 came out. The 8800 was forgotten when the ATI 5800 series arrived: https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html

So I'm a little confused about this "legendary" talk


----------



## ronnin426850

Quote:


> Originally Posted by *budgetgamer120*
> 
> The 8800 didn't last as long as the 7970; it's been over 5 years since the 7970 came out. The 8800 was forgotten when the ATI 5800 series arrived: https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html
> 
> So I'm a little confused about this "legendary" talk


I agree, but it's generally better not to argue with people who bring up their all time favorite card


----------



## mtcn77

Quote:


> Originally Posted by *ronnin426850*
> 
> I agree, but it's generally better not to argue with people who bring up their all time favorite card


"Nothing" means nothing, yeah! You know, @ronnin426850, my card is the pure athlete! And your card, and mine, have a 'date with destiny' right now, yeah!
...and I will be a participant and I'm going to be watching, too, _through the videoscope!_


----------



## philhalo66

Quote:


> Originally Posted by *budgetgamer120*
> 
> The 8800 didn't last as long as the 7970; it's been over 5 years since the 7970 came out. The 8800 was forgotten when the ATI 5800 series arrived: https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html
> 
> So I'm a little confused about this "legendary" talk


yeah, but the 5800 series didn't bring anything to the table except DX11, and less than a year later nvidia blew them out of the water with the GTX 400 series, which is still usable today. i say the 8800 series was legendary because it was a game changer: the first card to use unified shaders, when everything else had dedicated shaders for each task. it was also the first card able to do things other than just games, and the first GPU with a physics engine behind it. they were the first DX10 cards, and unmatched at the time in terms of power consumption and performance.


----------



## budgetgamer120

Quote:


> Originally Posted by *philhalo66*
> 
> yeah, but the 5800 series didn't bring anything to the table except DX11, and less than a year later nvidia blew them out of the water with the GTX 400 series, which is still usable today. i say the 8800 series was legendary because it was a game changer: the first card to use unified shaders, when everything else had dedicated shaders for each task. it was also the first card able to do things other than just games, and the first GPU with a physics engine behind it. they were the first DX10 cards, and unmatched at the time in terms of power consumption and performance.


What? A 5870 performed like two 4870s. You call that nothing? Ok.

Edit: Oh, so unified shaders are a big deal but the first DX11 GPU isn't? Bias is biased...


----------



## philhalo66

Quote:


> Originally Posted by *budgetgamer120*
> 
> What? A 5870 performed 2x 4870. You call that nothing? Ok.


i wasn't talking about performance. name one thing aside from DX11 that was revolutionary about the 5800 series.


----------



## budgetgamer120

Quote:


> Originally Posted by *philhalo66*
> 
> i wasn't talking about performance. name one thing aside from DX11 that was revolutionary about the 5800 series.


Sorry, but performance is the number one priority in the real world.

Oh, so unified shaders are a big deal but the first DX11 GPU isn't? Bias is biased...


----------



## philhalo66

Quote:


> Originally Posted by *budgetgamer120*
> 
> Sorry but performance is number one priority in the real world.
> 
> Oh so unified shaders is a big deal but the first dx11 gpu isn't? Bias is biased...


are you a troll or just an amd fanboy? because the fact that you even question the unified shader arch is a red flag. if i need to explain why that was a big deal, then i'm done talking. Have a good day. If performance is the measure, then my old 580 Hydro Copper beats the 5800 series and comes damn close to the 7970, so yeah.


----------



## Kuivamaa

Unified shaders debuted with Xbox 360 (ATi Xenos GPU).


----------



## budgetgamer120

Quote:


> Originally Posted by *philhalo66*
> 
> are you a troll or just an amd fanboy? because the fact that you even question the unified shader arch is a red flag. if i need to explain why that was a big deal, then i'm done talking. Have a good day. If performance is the measure, then my old 580 Hydro Copper beats the 5800 series and comes damn close to the 7970, so yeah.


Are you the troll? I didn't downplay unified shaders. You are the one downplaying DX11. DX11 was one of the biggest steps in the API's history, bringing a whole host of features that led to the graphics we have today.

So quick to call someone a fanboy when they don't agree with you.

As the post above stated: Xbox 360... Lol


----------



## JackCY

If those old GPUs were so great, you would still be using them. Fortunately GPU development didn't stall as much as peripherals did, so in general older GPUs get replaced for better performance.
Plus, don't forget AMD isn't releasing a GPU in every tier every year like NV, so people upgrade less often if they want to stay with an AMD card of the same tier. 7970/280x/380x... many years passed before it was replaced with a 480. The 7970/280x lasted, and so did their weaker siblings, because they kind of had to: there was no replacement for ages, thanks to AMD's management fiasco. They are still recovering from those bad decisions.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Hueristic*
> 
> I am STILL using an EVGA 8800GS for my third monitor!


Quote:


> Originally Posted by *philhalo66*
> 
> I don't know about that; the 8800GT and 8800GTX give it a run for its money. They lasted for a very long time.


Quote:


> Originally Posted by *philhalo66*
> 
> eh, fair enough, it was worded poorly. but you have to admit the geforce 88xx series is legendary for being the first unified-shader-architecture graphics card ever. In a lot of ways it changed everything we knew about GPUs, and graphics for that matter. It wasn't until like 2012-2013 that it couldn't play most games anymore. that's why i say the 8800 GTX gives it a run for its money in terms of aging well. Even then i'd say the GTX 580 is a close second, considering it just about matches the performance of a 280X and it's been 7 years since November.


I have a bit more nostalgia for the 8800 than I do for the 7970. They both had long shelf lives, but in different ways. The 8800 was a monster from the day it hit the shelves: the 8800GTX blew every other card on the market out of the water from day one. It was the last card that had me trying out all my games on ultra just to stare at how amazing they looked, and it literally performed as a mainstream card for almost 5 years. The 7970 hit the market with comparable performance, but drivers kept it competitive for years; a sleeper card, and everybody who grabbed one in 2012 and held on to it had a very happy half-decade run.


----------



## ronnin426850

Quote:


> Originally Posted by *JackCY*
> 
> 7970/280x lasted and so did their weaker siblings because they kind of had to


Actually no, they lasted because their architecture is such that every software optimization (driver, API, and engine) for new GPUs also applies to them.


----------



## mtcn77

Quote:


> Originally Posted by *JackCY*
> 
> If those old GPUs were so great, you would still be using them. Fortunately GPU development didn't stall as much as peripherals did, so in general older GPUs get replaced for better performance.
> Plus, don't forget AMD isn't releasing a GPU in every tier every year like NV, so people upgrade less often if they want to stay with an AMD card of the same tier. 7970/280x/380x... many years passed before it was replaced with a 480. The 7970/280x lasted, and so did their weaker siblings, because they kind of had to: there was no replacement for ages, thanks to AMD's management fiasco. They are still recovering from those bad decisions.


Upset and all alone... What it is, is what it is! _-But the beat goes on,_ oh yeah!


----------



## spinFX

Quote:


> Originally Posted by *MuscleBound*
> 
> Does this Vulkan work with Nvidia cards??


a friend has a 980Ti and a 5930K (or 5960X, can't remember), both under water, a decent board, and 16gb of high-frequency ram. From memory he said he lost about 10fps on min and average fps. He was running fairly high settings without V-sync on a 1440p 60hz monitor.

not sure if the fps variance may have come from differences in the scene being rendered though...


----------



## ronnin426850

Quote:


> Originally Posted by *spinFX*
> 
> a friend has a 980Ti and a 5930K (or 5960X, can't remember), both under water, a decent board, and 16gb of high-frequency ram. From memory he said he lost about 10fps on min and average fps. He was running fairly high settings without V-sync on a 1440p 60hz monitor.
> 
> not sure if the fps variance may have come from differences in the scene being rendered though...


A friend of mine with a GTX 970 loses about 8FPS average with Vulkan. A GTX 1060 gains, however.


----------

