# Fury X is now just as fast as GTX 980Ti in 1080p/1440p and faster in 4K



## iLeakStuff

I have to give it to AMD:
What an amazing comeback, and for those that don't need the extra overclock headroom the GTX 980Ti currently has, the Fury X is a better buy with its AIO cooler.

In addition, we know that the Omega drivers from AMD are right around the corner, and who knows if they further boost the Fury X's performance, plus maybe bring new overclocking capabilities to the card...
Well done AMD


----------



## huzzug

But you must know, the overclocker's dream is still an overclocker's dream


----------



## Remij

Stupid Nvidia... purposefully holding back the vanilla 980ti to make people want to buy the 980ti Lightning edition!!!









They're taking this gimping thing to a whole new level now


----------



## PriestOfSin

Excellent. It's a shame that when it came time for me to buy my GPU, the Fury X was:

- out of stock
- plagued by pump issues out the wazoo
- slower than the 980Ti
- more expensive.

This kind of drives the point home that AMD makes good hardware, but they really.... really..... REALLY... REALLY need to get their drivers in check out of the gate. Had the Fury X been equal to / faster than the 980Ti at the start, perhaps there would be fewer 980Tis out in the wild, and more Fury Xs.


----------



## iLeakStuff

I hope the message gets out and the thread doesn't die out since the moderators moved it to the graphics card section. Shame that no other site has picked up on it.
Quote:


> Originally Posted by *Remij*
> 
> Stupid Nvidia... purposefully holding back the vanilla 980ti to make people want to buy the 980ti Lightning edition!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They're taking this gimping thing to a whole new level now


You are aware that the GTX 980Ti Lightning is an overclocked GTX 980Ti that is expensive as hell, right? That's why it is faster.








Quote:


> Originally Posted by *PriestOfSin*
> 
> Excellent. It's a shame that when it came time for me to buy my GPU, the Fury X was:
> 
> out of stock
> 
> had pump issues out the wazoo
> 
> was slower than the 980Ti
> 
> was more expensive.
> 
> This kind of drives the point home that AMD makes good hardware, but they really.... really..... REALLY... REALLY need to get their drivers in check out of the gate. Had the Fury X been equal / faster than the 980Ti at the start, perhaps there would be fewer 980Tis out in the wild, and more Fury X's.


Yes, the Fury X launch was not ideal, that's for sure. But man, has AMD recovered. Availability on the market and yields are now ideal. Drivers are better than Nvidia's. There are rumors about Omega drivers lurking around the corner, bringing who knows what to the Fury X. I hope they unlock voltage and improve overclocking. Any additional performance gains will for sure make the Fury X a no-brainer vs the GTX 980Ti.
To me, the Fury X is already the better buy right now though.


----------



## huzzug

Also, I doubt TPU themselves are aware of it, since there is no mention of the card's improvements since the original testing.


----------



## TheHorse

And look at all the other cards! All the AMD cards that used to be considered slightly below their NV counterparts are now a bit above them!


----------



## PriestOfSin

Quote:


> Originally Posted by *iLeakStuff*
> 
> I hope the message gets out and the thread doesn't die out since the moderators moved it to the graphics card section. Shame that no other site has picked up on it.
> You are aware that the GTX 980Ti Lightning is an overclocked GTX 980Ti that is expensive as hell, right? That's why it is faster.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, the Fury X launch was not ideal, that's for sure. But man, has AMD recovered. Availability on the market and yields are now ideal. Drivers are better than Nvidia's. There are rumors about Omega drivers lurking around the corner, bringing who knows what to the Fury X. I hope they unlock voltage and improve overclocking. Any additional performance gains will for sure make the Fury X a no-brainer vs the GTX 980Ti.
> To me, the Fury X is already the better buy right now though.


Both companies' drivers are total garbage right now. Nvidia dishes out garbage, AMD dishes out hot garbage. It's all garbage.


----------



## white owl

No frame timings? Without them, these are just numbers.
Numbers like these:
25
95
37
19
24
65
Quote:


> Originally Posted by *PriestOfSin*
> 
> Both companies drivers are total garbage right now. Nvidia dishes out garbage, AMD dishes out hot garbage. It's all garbage.


The latest Nvidia driver broke a clean install of Windows 7 for me (something crashes on startup).
The old ones work.


----------



## Pantsu

I'm sure we'll get more updated numbers after the Omega driver release. I think TPU just changed some games/settings and removed stuff like Wolfenstein which was terrible with AMD.


----------



## sage101

Maybe the R9 Fury X is a better buy than the reference 980Ti, but it can't touch the non-reference models. Didn't know my R9 270X had so much potential; it's even on par with a 960.


----------



## Remij

Quote:


> Originally Posted by *iLeakStuff*
> 
> You are aware that GTX 980Ti Lightning is an overclocked GTX 980Ti that is expensive as hell right? Thats why it is faster


Obviously I am aware of that. It was a joke.


----------



## caenlen

The issue is the scaling of OCs: Furies barely OC, and the OC doesn't scale well, while most 980 Tis hit 1500 core and scale better than any card I have ever seen. The Lightning 980 Ti is 8% faster in the benches you showed, and that isn't even at 1500 core. ;D

I bought a non-G1 Windforce 980 Ti and it hit 1506 core without any voltage change.


----------



## Desolutional

The 980 Ti can easily be OCed to a 1450MHz core, which blows the Fury X OC (lol, what OC, AMD already gave it its greatest OC at the factory) out of the water.


----------



## BinaryDemon

Looks like AMD has done a good job tweaking their drivers.

That was always the issue with the initial round of Fury X reviews anyway - No one was on the same page as to which driver version they should be using. Some reviewers used the latest public betas, some used the drivers that shipped with the cards, and some used drivers that weren't publicly available but supplied by AMD.


----------



## white owl

Quote:


> Originally Posted by *BinaryDemon*
> 
> *Looks like AMD has done a good job tweaking their drivers.*
> 
> That was always the issue with the initial round of Fury X reviews anyway - No one was on the same page as to which driver version they should be using. Some reviewers used the latest public betas, some used the drivers that shipped with the cards, and some used drivers that weren't publicly available but supplied by AMD.


I think so too.
I really wish these benchmarks would show the frame timings so we could have the whole picture.
If AMD has made a driver that prevents stutter at 4K with maxed detail settings despite the lack of VRAM, I'd be sold.









I do wish, more than anything else, that there were non-reference designs. It's been mentioned that the Fury X doesn't scale well with additional clock speed. Maybe this is the reason?


----------



## iLeakStuff

Not sure what TPU has done, but it seems they have changed the settings in some of the games to something more demanding?

Witcher 3 1440p:
980 Ti: 55.7FPS > 41.3FPS
Fury X: 51.7FPS > 39.6FPS

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/24.html
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/18.html
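As a quick sanity check on those Witcher 3 numbers, the 980 Ti's relative lead actually shrank slightly under the newer, more demanding settings (FPS values taken from the two linked reviews):

```python
# Relative 980 Ti lead over the Fury X in Witcher 3 at 1440p,
# before and after TPU's settings change (FPS from the linked reviews).
def lead_percent(a_fps: float, b_fps: float) -> float:
    """Percent by which card A outperforms card B."""
    return (a_fps / b_fps - 1) * 100

before = lead_percent(55.7, 51.7)  # old test suite: 980 Ti vs Fury X
after = lead_percent(41.3, 39.6)   # newer, more demanding settings

print(f"980 Ti lead before: {before:.1f}%")  # ~7.7%
print(f"980 Ti lead after:  {after:.1f}%")   # ~4.3%
```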


----------



## semitope

960 as fast as a 270x....
Quote:


> Originally Posted by *caenlen*
> 
> The issue is the scaling of OC's, Furies barely OC and the OC doesn't scale well, and most 980 ti's hit 1500 core, and scale better than any card I have ever seen, the Lightning 980 ti is 8% faster in the benches you showed, and that isn't even 1500 core. ;D
> 
> I bought a non g1 windforce 980 ti and it hit 1506 core without any voltage change.


People keep saying 1500MHz, but in reality most cards don't seem to reach that. A few sites have clocked several, and most do not get to 1500.

In the 980 Ti Lightning review, 4 barely pass 1500.

http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/26.html

That Lightning was boosting to 1392 MHz out of the box.


----------



## iLeakStuff

Performance per dollar: Fury cards and Maxwell are now equal. But notice the big change between the GTX 980 and the R9 Fury.









Before


After


----------



## Szaby59

These results are from 2 different test suites (no PCars and DA:I, but they added Metal Gear V and Mad Max in the newer one). The real comparison would be to run the benchmarks with the same games.
Also, this Ti boosts to the ~1400 MHz region and is around 19% faster even at 4K. I think it's safe to say that any 980 Ti at a similar price to the Fury X can accomplish that with a manual overclock, and some of them will be faster out of the box even without it...


----------



## iLeakStuff

Quote:


> Originally Posted by *Szaby59*
> 
> These results are from 2 different test suites (no PCars, but they added Metal Gear V in the newer one). The real comparison would be to run the benchmarks with the same games.
> Also, this Ti boosts to the ~1400 MHz region and is around 17% faster even at 4K. I think it's safe to say that any 980 Ti in the Fury's price range can accomplish that with a manual overclock, and some of them will be faster out of the box even without it...


Still highly relevant, since these are the newest games and TPU obviously can't test 30+ games to cover everything.

But I agree, not a perfect comparison since all the games are not the same. The 980Ti overclocks better for sure, but not everyone overclocks, plus some prefer the lower noise you get from the water-cooled Fury X.

WoD was one of the games that skewed the previous game suites in Nvidia's favour. But the newest driver from AMD has reduced the advantage from +35% to +16%.
Project Cars was also one where Nvidia had a huge 50% advantage, but then again it only accounted for a total of 2% of the average. But in the OP you can see the total percentage in the summary without that game included anyway.


----------



## alawadhi3000

*Games removed:*
- Alien Isolation
- Batman Origins
- Bioshock Infinite
- Dead Rising 3
- Dragon Age
- Metro LL
- Project Cars
- Tomb Raider
- Wolfenstein: The New Order

*Games added:*
- Mad Max
- MGS V


----------



## Szaby59

@iLeakStuff
Quote:


> But I agree, not a perfect comparison since all the games are not the same. The 980Ti overclocks better for sure, but not everyone overclocks, plus some prefer the lower noise you get from the water-cooled Fury X.


Nobody will buy a ref. 980 Ti anymore (except for an AIO water cooling config, of course) now that customs are much better and available. No need to overclock; a Palit 980 Ti, for example, is 15% faster out of the box according to the TPU 4K summary.
As for the noise: pump and coil noise can still occur on these cards, and again I'm referring to the Palit card, which has 1 dB lower load noise and 0 dB idle noise thanks to the fan switching off at idle.
The Fury X is a good card, but at this price the 980 Ti variants are just better.


----------



## hanzy

Glad to see.
I am sure as time goes on AMD cards will pull just a bit ahead of Nvidia cards, just like the 290/290X vs the 780/780Ti.
When I first got my 780s they were king. By the time I sold them they were getting hammered by AMD cards that used to be behind.

I went with Nvidia again with these 980Tis because when the Fury/X was released it was kind of underwhelming to ME.
I think Arctic Islands is going to be really good.









I do second the call for frametimes though.


----------



## iLeakStuff

Quote:


> Originally Posted by *Szaby59*
> 
> @iLeakStuff
> Nobody will buy a ref. 980 Ti anymore (except for an AIO water cooling config, of course) now that customs are much better and available. No need to overclock; a Palit 980 Ti, for example, is 15% faster out of the box according to the TPU 4K summary.
> As for the noise: pump and coil noise can still occur on these GPUs, and again I'm referring to the Palit card, which has 1 dB lower load noise and 0 dB idle noise thanks to the disabled fan.
> The Fury X is a good card, but at this price the 980 Ti variants are just better.


Let's not pretend that the Fury X doesn't overclock either. It's not an overclocker's dream like AMD said, but it can still gain around 10% extra performance.

Which means an OC'd Fury X is 10% below a 1400MHz GTX 980Ti at 1440p, or 8% below at 4K. Earlier it was 20% below at 1440p and 15% below at 4K, so they have gained ground.
I'm not sure about prices today, but the AIB 980Tis with a proper fan that don't sound like a jet engine at 1400MHz cost $660 and upwards. Which means you get a Fury X for about 10% less, around the same as the performance difference.

I'd say they are very close to the 980Ti, depending on which metric you look at.
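A rough sketch of that price/performance claim, using the thread's own figures (the ~$599 and ~$660 prices and the ~10% gap are the numbers quoted in this thread, not fresh data):

```python
# Rough perf-per-dollar comparison using the figures quoted in this thread
# (all assumptions taken from the posts, not new measurements).
cards = {
    "Fury X": {"price": 599, "relative_perf": 1.00},  # OC'd Fury X as baseline
    "980 Ti": {"price": 660, "relative_perf": 1.10},  # ~10% faster at 1440p
}

perf_per_dollar = {
    name: c["relative_perf"] / c["price"] for name, c in cards.items()
}

for name, ppd in perf_per_dollar.items():
    print(f"{name}: {ppd * 1000:.2f} relative perf per $1000")
# The two ratios come out within ~0.2% of each other, i.e. the price gap
# roughly cancels the performance gap under these assumptions.
```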


----------



## Cyro999

You forgot to link this one










By their numbers in that review, an OC'd GM200 (Lightning, or even a G1) is ~44% faster than a stock Fury X @1080p*. I just don't see 44% overclocks on Furys.

*OC = 1518/8360, 117% of baseline (the out-of-the-box Lightning). Close OCs are fairly easily achievable - 1480/7600 with a decent card at a fraction of the price, and they list OCs in that same review - so it's not about paying for it. It's not difficult to get ~35-40% more performance than what's listed there for the reference Fury X @1080p when using a 980ti - but can you do it with a Fury X? Can you get even close?

I bring this up because the thread title is very aggressive:
Quote:


> Fury X is now just as fast as GTX 980Ti in 1080p


and I think it's wrong.

Nor would I even consider relying on 4GB of VRAM for 4K; next year we'll have 6 and 8GB on the more midrange options, and 16GB flagships.

The Fury X is also reference-cooler only, and that cooler has significant issues and the worst whining of any GPU I've seen in a long time - I don't know who the target audience of this card is. I have strong negative feelings about pretty much every high-end GPU that's reference PCB+cooler only; it's usually a significant annoyance and/or handicap. In Nvidia's case, it often makes the cut-down-but-unlocked version actually better for 24/7 usage. However, AMD has cut the Fury down more than Nvidia cut the 980ti, and Fiji seems a bit weaker, especially at low resolutions. The non-X Fury doesn't fight well with the 980ti at 1080p, yet that's the cost of a non-reference design.


----------



## p4inkill3r

The rumors of the Fury X/non-X's demise were highly exaggerated to begin with.


----------



## geoxile

Quote:


> Originally Posted by *Cyro999*
> 
> You forgot to link this one
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By their numbers in that review, OC'd gm200 (lightning, even a g1) is ~44% faster than a stock Fury X @1080p*. I just don't see 44% overclocks on Fury's.
> 
> *OC = 1518/8360, 117% of baseline (which is out-of-the-box lightning). Close OC's are fairly easily achievable, 1480/7600 with a decent card at a fraction of the price and they list OC's on that same review, it's not about paying for it. It's not difficult to get ~35-40% more performance than listed for reference Fury X there @1080p when using a 980ti - but can you do it with a Fury X? Can you get even close?
> 
> I bring this up because the thread title is very aggressive
> and i think wrong.
> 
> Nor would i even consider using 4GB of VRAM for 4k; next year we'll have 6 and 8GB on the more midrange options and 16GB flagships.
> 
> Fury X is also only reference cooler which has significant issues and the worst whining of any GPU i've seen in a long time - i don't know who the target audience of this card is. I have strong negative feelings about pretty much every high end GPU that's reference PCB+Cooler only, it's usually a significant annoyance and/or handicap. In Nvidia's case, it often makes the cut-down-but-unlocked version actually better for 24/7 usage, however AMD has cut the Fury more than Nvidia cut the 980ti and Fiji seems a bit weaker especially at low resolutions; The non-x Fury doesn't fight well with the 980ti at 1080p, yet that's the cost of a non-reference design.


Am I missing something here? Your chart says the Lightning 980Ti is 23% faster than the Fury X: Lightning performance normalized to the Fury X, 100/0.81 = 123.45. Where is 44% coming from?


----------



## iLeakStuff

Quote:


> Originally Posted by *Cyro999*
> 
> You forgot to link this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By their numbers in that review, OC'd gm200 (lightning, even a g1) is ~44% faster than a stock Fury X @1080p*. I just don't see 44% overclocks on Fury's.
> 
> *OC = 1518/8360, 117% of baseline (which is out-of-the-box lightning). Close OC's are fairly easily achievable, 1480/7600 with a decent card at a fraction of the price and they list OC's on that same review, it's not about paying for it. It's not difficult to get ~35-40% more performance than listed for reference Fury X there @1080p when using a 980ti - but can you do it with a Fury X? Can you get even close?
> 
> I bring this up because the thread title is very aggressive
> and i think wrong.
> 
> Nor would i even consider using 4GB of VRAM for 4k; next year we'll have 6 and 8GB on the more midrange options and 16GB flagships.
> 
> Fury X is also only reference cooler which has significant issues and the worst whining of any GPU i've seen in a long time - i don't know who the target audience of this card is. I have strong negative feelings about pretty much every high end GPU that's reference PCB+Cooler only, it's usually a significant annoyance and/or handicap. In Nvidia's case, it often makes the cut-down-but-unlocked version actually better for 24/7 usage, however AMD has cut the Fury more than Nvidia cut the 980ti and Fiji seems a bit weaker especially at low resolutions; The non-x Fury doesn't fight well with the 980ti at 1080p, yet that's the cost of a non-reference design.


The GTX 980Ti at 1400MHz (Lightning) is 23% faster than a stock Fury X at 1080p. Not 44%.

And who uses a Fury X or 980Ti at 1080p?

The Fury X HAD issues with the pump resulting in whine, but that was fixed and resolved a looong time ago. It's a non-existent issue now. The card is very quiet.


----------



## rdr09

Quote:


> Originally Posted by *iLeakStuff*
> 
> The GTX 980Ti at 1400MHz (Lightning) is 23% faster than a stock Fury X at 1080p. Not 44%.
> 
> And who uses a Fury X or 980Ti at 1080p?
> 
> The Fury X HAD issues with the pump resulting in whine, but that was fixed and resolved a looong time ago. It's a non-existent issue now. The card is very quiet.


There are whining 980 Tis too. Probably more, 'cause there are more of them on the market.


----------



## BarneyRubble

Gosh, let me sell these two 980 Ti cards I've had for months (and had a blast overclocking) and go buy 2 AMD cards!!!


----------



## TK421

People must consider that while MSI is handing out 980Ti Lightning samples with Samsung memory, the retail editions mainly come with Hynix garbage.

Dirty move if you ask me.


----------



## white owl

Quote:


> Originally Posted by *TK421*
> 
> People must consider that while msi is handing out 980ti lightning samples with samsung memory, the retail editions mainly come with hynix garbage.
> 
> Dirty move if you ask me.


This.
The people doing reviews don't even use LN2.
Way to stick it to the people that actually do.


----------



## denman

Please release Fury X2 to replace my GTX 690


----------



## lrch

Quote:


> Originally Posted by *white owl*
> 
> No frametimings? With out them, these are just numbers. [...]


/thread

Good job AMD though, keep squeezing out performance.


----------



## Maintenance Bot

Quote:


> Originally Posted by *iLeakStuff*
> 
> In addition, you can get the cheapest GTX 980Ti today for $619, while the Fury X price has been reduced to $629, but with rebates it comes out as the cheapest at only $599!
> *
> PNY GTX 980Ti $619 @ Newegg*
> 
> Link: http://www.newegg.com/Product/Product.aspx?Item=N82E16814133611&cm_re=gtx_980ti-_-14-133-611-_-Product
> 
> *XFX Fury X $629/$599 with rebates @ Newegg*
> 
> Link: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150742&cm_re=fury_x-_-14-150-742-_-Product


You can also get a Galax 980 ti for $599. http://galaxstore.net/GALAX-NVIDIA-GEFORCE-GTX-980-TI-OC-6GB_p_86.html


----------



## buildzoid

Quote:


> Originally Posted by *Maintenance Bot*
> 
> You can also get a Galax 980 ti for $599. http://galaxstore.net/GALAX-NVIDIA-GEFORCE-GTX-980-TI-OC-6GB_p_86.html


The reference 980Tis are loud and hot (42 dB, 84C). That Galax card is a good deal though.


----------



## TK421

Oh btw, what's the flaw with the Fury X cooling system?

Some of the MOSFETs are not covered? And is there a workaround to improve this flaw without buying a new cooling system?


----------



## white owl

Quote:


> Originally Posted by *buildzoid*
> 
> The reference 980Tis are loud and hot(42dB 84C). That Galax card is a good deal though.


It's also not got an effing water cooler...


----------



## Klocek001

Come on, this is just a stock Lightning (around 1400MHz, I suppose). A 1500MHz 980Ti still pulverizes the Fury X. BTW, techpowerup has 8GB 970s now.


----------



## white owl

Quote:


> Originally Posted by *Klocek001*
> 
> Come on, this is just stock lighting (around 1400MHz I suppose). 1500MHz 980Ti still pulverizes the Fury X. BTW techpowerup has 8GB 970's now.


I'm trying to make a joke out of this.

How much VRAM does it take to keep a 970 from stuttering?


----------



## gamervivek

Considering the lead that the Lightning has over the reference 980Ti, the reference card is probably just throttling more than usual.

The Fury X was faster than the Titan X in Tom's Hardware's review at 4K, so it was dependent on game choices even at the start.


----------



## buildzoid

Quote:


> Originally Posted by *TK421*
> 
> Oh btw, what's the flaw with the fury x cooling system?
> 
> Some of the mosfets are not covered? And is there a workaround to improve this flaw, without buying a new cooling system?


All the MOSFETs are covered.

However, if you Vmod the card, I recommend you take the cover off the card and put a fan blowing into it.


----------



## TK421

Quote:


> Originally Posted by *buildzoid*
> 
> All the MOSFETs are covered
> 
> 
> However if you Vmod the card I recommend you take the cover of the card and put a fan to blow into it.


According to a review, the VRM gets to 91C even with the water cooling pipe. Not sure if the components on the back are covered either.

How about the small black squares near the copper pipe?

http://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/1.html


----------



## carlhil2

Once the end user gets his 980Ti home and OCs it, all those benches are null. Its performance would be more like the Lightning's in those benches... just saying.


----------



## Recidivism

This is mostly because of Windows 10 WDDM 2.0


----------



## buildzoid

Quote:


> Originally Posted by *TK421*
> 
> According to a review, the vrm gets to 91c even with water cooling pipe. Not sure if the components on the back is covered either.
> 
> How about the small squares black squares near the copper pipe?
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/1.html


The black squares are chokes/coils/inductors. The back of the card only has tantalum capacitors and driver FETs, nothing actually important. Mind you, the MOSFETs on the Fury X are rated to do 70A at 125C; 91C is basically normal operating conditions for them. Also, I suspect that the 91C reading on the back of the Fury X PCB is the drivers and not the power MOSFETs. I managed to keep those same power MOSFETs on my R9 290X at 85C even with a janky little hack-job heatsink that I zip-tied onto them when running 1.35V to the core.


----------



## iLeakStuff

Quote:


> Originally Posted by *rdr09*
> 
> there are whining 980 Tis too. prolly more 'cause there are more of them in the market.


Yeah, unless you pay a premium for special components, you can get unlucky and get a whining card.
Quote:


> Originally Posted by *Maintenance Bot*
> 
> You can also get a Galax 980 ti for $599. http://galaxstore.net/GALAX-NVIDIA-GEFORCE-GTX-980-TI-OC-6GB_p_86.html


Good to see the 980Ti come down that much in price too.








I used Newegg since it's a big retailer, plus I could find both cards there. There's always someone having a sale somewhere.


----------



## TK421

Quote:


> Originally Posted by *buildzoid*
> 
> The black squares are chokes/coils/inductors. The back of the card only has tantalum capacitors and driver FETs nothing actually important. Mind you the MOSFETs on the Fury X are rated to do 70A at 125C. 91C is basically normal operating conditions for them. Also I suspect that the 91C reading on the back of the Fury X PCB is the drivers and not the power MOSFETs. I managed to keep those same power MOSFETs on my R9 290X at 85C even with a janky little hack job heatsink that I ziptied onto them when running 1.35V to the core


Are the MOSFETs higher quality compared to the ref 980ti/Titan X?

Why do the squares not need to be cooled?


----------



## buildzoid

Quote:


> Originally Posted by *TK421*
> 
> Is the mostfet a higher quality compared to ref 980ti/tx?
> 
> Why does the squares not need to be cooled?


The black squares are literally a piece of wire going through or around a piece of iron. They really aren't sensitive to temperatures.

The MOSFETs on the R9 Fury and Fury X are really, really high-end IR 6894 MOSFETs. Those are the same MOSFETs used in the EVGA E-power, the GTX 980 Ti Classified, GTX 980 Classified, GTX 780Ti Classified and K|NGP|N, GTX 780 Classified, GTX 680 Classified, R9 290X Lightning, R9 290X reference, and R9 290 reference. All of those use the 6894 MOSFETs. This is their datasheet.


----------



## UtopiA

TPU changed from Windows 7 to Windows 10 with the new benchmarks; they also changed up their gaming suite by adding Mad Max & MGSV.

Perhaps the new driver model in Win10 accounts for a lot of that performance boost. We already know AMD has issues with DX11 on 7/8.


----------



## Serandur

"Fury X as fast or faster."

[shows charts with aftermarket 980 Ti clearly 19-22% faster]

"But that's a Lightning."

[Guru3D's Lightning 980 Ti review shows other, cheaper aftermarket 980 Tis almost every bit as fast at stock; additionally, TPU's own reviews show said aftermarket models able to overclock just as high as (or very close to) the Lightning, which itself still has a fair amount of headroom left:]


Spoiler: Warning: Spoiler!







"But the Fury X still made gains"

[but if you look closely, you'll see the reference 980 Ti lost some ground versus the Titan X (potential variance in reference-card throttling/boost speed), and notice that TPU removed Project Cars from their testing suite, which contributes to the Fury X's relative positioning.]

Conclusion: The OP's title and assertion are wrong. Reference 980 Tis at their low stock and throttling speeds are still pretty much matching the Fury X, but any aftermarket or overclocked model blows an OC'd Fury X out of the water for a marginal price increase or the manual tweaking of, like, two sliders.

For instance:

http://www.bhphotovideo.com/bnh/controller/home?O=&sku=1160257&gclid=COW7zu6R8MgCFQoSHwod-pIFeQ&is=REG&m=Y&Q=&A=details

The G1 980 Ti is almost as aggressively clocked as the Lightning and is $669 there, which is about ~10% more than the cheapest Fury X, yet it easily beats the Fury X by 15-20% at stock or when both are overclocked, has 50% more VRAM, has far superior DX11 CPU overhead management, and has the typical Nvidia-exclusive features. It's $20 more on Newegg, but there it comes with a free game and a $20 rebate offer. Bang-for-buck wise, aftermarket 980 Tis are still the way to go in this price range. The MSI Gaming 980 Ti is similar in performance and also comes with a game, but is only $659 on Newegg:

http://m.newegg.com/Product/index?itemnumber=N82E16814127889

AMD hasn't recovered anything; the Fury X still doesn't properly compete with Nvidia's cut-down GM200 (which is cheaper to manufacture): not in performance, price, model variety, overclocking, VRAM, driver efficiency, manufacturing cost, or vendor-exclusive features. Regardless of whether a person overclocks or not, aftermarket 980 Tis are the superior choice by every metric.

If a CLC is necessary for some reason, the Hybrid 980 Ti is now $725 on Newegg, is still superior to the Fury X in bang for buck (a bit less than 20% more money for ~20% more performance, plus the other 980 Ti benefits), and still comes with a free game.


----------



## TK421

Quote:


> Originally Posted by *buildzoid*
> 
> The black squares are literally a piece of going through or around a piece of iron. They really aren't sensitive to temperatures.
> 
> The MOSFETs on the R9 Fury and Fury X are really really high end IR 6894 MOSFETs. Those are the same MOSFETs that are used in the EVGA E-power, the GTX 980 Ti Classified, GTX 980 Classified, GTX 780Ti Classified and K|NGP|N, GTX 780 Classified, GTX 680 Classified, R9 290X Lightning, R9 290X reference, R9 290 reference. All of those use the 6894 MOSFETs. This is their datasheet.


Damn, didn't expect the MOSFETs to be this high up in the quality department.

Are there any benefits to cooling the squares though?


----------



## buildzoid

Quote:


> Originally Posted by *TK421*
> 
> Damn, didn't expect the mosfets to be this high up in the quality department.
> 
> Is there any benefits to cooling the squares though?


I guess you might drop the overall VRM temps a little, but the only card I've seen that cools those is the R9 290(X) Windforce, and that card has crap VRM temps because the heatsink is anemic.


----------



## st0necold

Good for red team!


----------



## ooxxy

Just as fast as reference. Non-reference 980 Tis are quite a bit ahead, even out of the box with no manual OC.


----------



## TK421

Quote:


> Originally Posted by *buildzoid*
> 
> I guess you might drop the overall VRM temps a little but the only card I've seen that cools those is the R9 290(X) Windforce and that card has crap VRM temps because the heatsink is anemic.


Can you give me a picture of the heatsink?

I can't seem to find it anywhere.


----------



## dmasteR

Quote:


> Originally Posted by *iLeakStuff*
> 
> The GTX 980Ti at 1400MHz (Lightning) is 23% faster than a stock Fury X at 1080p. Not 44%.
> 
> And who uses a Fury X or 980Ti at 1080p?
> 
> The Fury X HAD issues with the pump resulting in whine, but that was fixed and resolved a looong time ago. It's a non-existent issue now. The card is very quiet.


Plenty of people do, to drive 120+ FPS.


----------



## xzamples

I'm mad that Nvidia stopped showing Kepler love so fast.

It really sucks; I was hoping to SLI it in the future when I first got it, because I didn't have enough money for SLI straight off the bat.

But now people are saying don't even bother with SLI.

I honestly truly believe they deliberately gimp cards via drivers so the masses purchase new ones, and their GameWorks tactics are anti-consumer... I'm glad more and more people are coming to the realization that Nvidia is anti-consumer.


----------



## white owl

Quote:


> Originally Posted by *xzamples*
> 
> i'm mad that nvidia stopped showing kepler love so fast
> 
> it really sucks, i was hoping to sli it in the future when i first got it because i didn't have enough money straight off the bat for sli
> 
> but now people are saying don't even bother sli
> 
> i honestly truly believe they deliberately gimp cards via drivers so the masses purchase new ones, and their gameworks tactics are anti-consumer... i'm glad more and more people are coming to the realization that nvidia is anti-consumer


Why would they invest money in good silicon just to gimp it?
They never did a rebadge of Kepler, did they? The baby Titan (780Ti) _was_ the gimped Titan, much like it is now.
Did they run better at launch than they do now?
I wish I had kept mine. I bought a 980 FTW and a 780 SC for $550 several months ago but never used the 780. =(


----------



## The Robot

Quote:


> Originally Posted by *Serandur*
> 
> "Fury X as fast or faster."
> 
> [shows charts with aftermarket 980 Ti clearly 19-22% faster]
> 
> "But that's a Lightning."


There's even a Lightning LE for $679. Insane deal.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127910


----------



## TK421

Quote:


> Originally Posted by *The Robot*
> 
> There's even a Lightning LE for $679. Insane deal.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127910


The LE is the antithesis of the Lightning design (for the 980Ti).

Since the only factor determining the max overclock on a 980Ti is binning quality (assuming equal cooling performance), and the LE isn't binned that high, the overclocking headroom is lost.


----------



## Cyro999

For those asking where the 44% came from: That's the gap between an OC'd lightning and a stock fury x according to that article, when they ran it at manually OC'd clocks. I was asking if a Fury X could OC enough to get ~44% more performance than reference.

Can it even do 30%? It seems to me like Fury/FuryX would fall far short of 980ti/TitanX at 1080p.

And a ton of people use those cards at that res. Have you tried maxing games at 1440p, 150fps with 2x MSAA, on a single GPU? No? Then try it at 1080p, which is far easier; you still often fall short even with these cards.
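To put rough numbers on the question above, here is a quick sketch. The 1050MHz figure is the reference Fury X core clock, and the linear fps-per-MHz scaling is an optimistic assumption (real scaling is sub-linear), so treat these as best-case targets:

```python
# Back-of-envelope: what core clock would a stock Fury X (1050 MHz) need
# to close a given performance gap, assuming fps scales linearly with
# core clock? (Optimistic: real-world scaling is sub-linear.)

FURY_X_STOCK_MHZ = 1050

def clock_to_close_gap(gap_pct, stock_mhz=FURY_X_STOCK_MHZ):
    """Core clock needed to erase a gap_pct percent deficit."""
    return stock_mhz * (1 + gap_pct / 100)

print(round(clock_to_close_gap(44)))  # 1512 -- far beyond any Fury X OC
print(round(clock_to_close_gap(30)))  # 1365 -- still unrealistic for Fiji
```

Even under this generous assumption, a 30% catch-up would need roughly 1365MHz on the core, which no Fury X reaches.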


----------



## rogergamer

lol, I bought a 980 Ti and this comes up. Not that I wasn't rooting for AMD; I was considering getting a Fury, but the 980 Ti beats it in every way.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Cyro999*
> 
> For those asking where the 44% came from: That's the gap between an OC'd lightning and a stock fury x according to that article, when they ran it at manually OC'd clocks. I was asking if a Fury X could OC enough to get ~44% more performance than reference.
> 
> Can it even do 30%? It seems to me like Fury/FuryX would fall far short of 980ti/TitanX at 1080p.
> 
> And a ton of people use those cards at that res. Have you tried maxing stuff at 1440p, 150fps with 2x MSAA with a single GPU? No? Well do it on 1080p, far easier. You still fall short even with those cards often.


Are we looking at the same thing? Where are you seeing 44%?

If your numbers are from the graphs in this thread, your math is way off.


----------



## michaelius

Quote:


> Originally Posted by *rogergamer*
> 
> lol I buy a 980 ti and this comes up, not that I wasn't rooting for AMD though, was considering getting a fury, but the 980 beats it in every way


It still beats it in every single way, just by a bit less.








And with Nvidia you are not forced to bend over to Microsoft and install Windows 10.


----------



## Lass3

Try overclocking a Fury X and a good custom 980 Ti to their maximum stable clocks, then compare them.


----------



## UtopiA

Quote:


> Originally Posted by *Cyro999*
> 
> Can it even do 30%? It seems to me like Fury/FuryX would fall far short of 980ti/TitanX at 1080p.


Fury X does maybe 5-10%.


----------



## Ajvar

AMD has always had weak drivers at launch, but over three years those drivers have increased GPU performance by up to 30%. Nvidia, meanwhile, only cares about its GPUs for 1-2 years at best these days.


----------



## xzamples

Quote:


> Originally Posted by *Ajvar*
> 
> AMD always had weak drivers and in 3 years increased performance of GPU up to 30% thanks to those. While Nvidia cares about its GPUs only for 1-2 years now at best.


pretty much this, got a 760 2gb in 2013... 2 years later it's a POS compared to others and kepler got gimped, i definitely learned my lesson


----------



## Proxish

I'd love to know how an overclocked R9 290 & R9 290X holds up against an overclocked GTX 970 both with the latest drivers.


----------



## Klocek001

Quote:


> Originally Posted by *xzamples*
> 
> pretty much this, got a 760 2gb in 2013... 2 years later it's a POS compared to others and kepler got gimped, i definitely learned my lesson


yeah, keeping a mid-range card for two years or more is not a fruitful endeavor


----------



## Just a nickname

After seeing the performance-per-dollar graph, I can totally confirm this. I bought a used R9 290 for $120 two weeks ago. I've been buying and selling a couple of R9 290s, and I have to say this is the best card I've had so far. Once you put a custom cooler on it, it runs as cool as my 5870 and 7970 did, though VRM temps are a bit on the high side.
The custom cooler is a modded 5870 Accelero Xtreme.


----------



## Lass3

Quote:


> Originally Posted by *xzamples*
> 
> pretty much this, got a 760 2gb in 2013... 2 years later it's a POS compared to others and kepler got gimped, i definitely learned my lesson


You buy a mid-range card and expect it to last 4-5 years..?


----------



## p4inkill3r

Quote:


> Originally Posted by *Lass3*
> 
> You buy a mid-end card and expect it to last 4-5 years..?


2016 - 2013 = 3 years


----------



## HeadlessKnight

Quote:


> Originally Posted by *xzamples*
> 
> pretty much this, got a 760 2gb in 2013... 2 years later it's a POS compared to others and kepler got gimped, i definitely learned my lesson


Nvidia has improved Kepler performance with the last few drivers...

The GTX 780 Ti is pretty close to the GTX 980 in those benchmarks; only in Witcher 3 does it perform worse than the 970, and that could be attributed to the Maxwell architecture's better bandwidth utilization and tessellation performance.
With the newest drivers the 760 performs like it should.


----------



## Klocek001

Quote:


> Originally Posted by *p4inkill3r*
> 
> 2016 - 2013=3 years


but he expected more.
I remember buying a 7870 GHz in late 2013. I was satisfied at the time, but I wonder whether it would run 2015 titles like TW3 or GTA5 as well as it ran games back then. Mid-range cards are just too weak to last you three years or more if you're playing the latest titles. Two years is the right time to upgrade, like 760 -> 960/280X, or invest in a high-end GPU, get twice the performance now, and worry less about the future.


----------



## p4inkill3r

Quote:


> Originally Posted by *Klocek001*
> 
> but he expected more.
> I remember buying 7870GHz in late 2013, I was satisfied at that time but I wonder whether it would run 2015 titles like TW3 or GTA5 as well as it ran games then. Mid range cards are just too weak to last you 3 years or more if you're playing latest titles.


A mid-range card lasting 2-3 years is something I'm OK with, but he misrepresented it by stretching that to 4-5 years.


----------



## Assirra

Quote:


> Originally Posted by *xzamples*
> 
> pretty much this, got a 760 2gb in 2013... 2 years later it's a POS compared to others and kepler got gimped, i definitely learned my lesson


People need to learn the definition of "gimped" around here.

If it doesn't run worse than it did, it is not gimped.


----------



## Klocek001

Don't taunt people like this. I saw this title and immediately thought "maybe sell the 980 Ti and get a CLC'd Fury X"...
then I saw this.


----------



## Bahska

I wish they hadn't used 358.50 WHQL and had used the hotfix instead (not saying it would have made a difference, but 358.50 broke a lot and is very unstable compared to the hotfix). On 358.50 I couldn't OC at all without constant crashing; on the hotfix I'm back up to 1509MHz with my Hybrid 980 Ti.
I'm OK with them using reference cards to test, though I'm pretty sure not many people would buy a reference 980 Ti at this point; a Windforce is the same price.

That said, I'm glad to see AMD's software has gotten better. A little competition is great for all of us.


----------



## criminal

Fury has made some strides, but the 980Ti is still the better buy. Especially if you want to overclock.


----------



## Klocek001

And btw, IDK how they're getting 52 fps average on that Lightning in TW3. When I run with HW and all settings on ultra I drop to 52 fps at worst, but the average would be 60 for sure, maybe even a bit higher. I run my 980 Ti @1500MHz and this is maybe around 1400MHz, but 100MHz doesn't make that much difference. Unless the whole bench sequence was fighting 10 enemies in the swamp or forest.


----------



## darealist

You can't even tell the difference between high and absolute max settings with everything on in Witcher 3 or any other game. I tried doing it with screenshots and it's still impossible. It's like max settings run hidden scripts in the background just to encourage you to upgrade.


----------



## Lass3

Quote:


> Originally Posted by *darealist*
> 
> You can't even tell a difference between high and absolute max settings with everything on in Witcher 3 or any others. I tried doing that with screenshots and it's still impossible. It's like max setting runs hidden scripts on the background just to encourage you to upgrade.


I can tell the difference: slightly better/sharper textures, *more foliage/grass* and slightly better shadows/lighting. But it's not a major difference; the foliage/grass is the most noticeable.


----------



## Vlada011

The Fury X is good, but it's not faster than the GTX 980 Ti. And I don't know how AMD expects to recover if they sell weaker cards without PhysX for the same money as Nvidia.
The GTX 980 Ti even has the advantage of better overclocking.
The Fury X should cost $500, especially since every normal person will remove the stock pump and install a waterblock, and that's another $100-110.
A Fury X with a waterblock is worth about as much as a reference GTX 980 Ti.

Yesterday I watched video clips of GTA5 gameplay on an ASUS R9 390X 8GB, and the picture is better than on GeForce cards.
It looks almost like reality.


----------



## criminal

Quote:


> Originally Posted by *Vlada011*
> 
> Fury X is good but it's not faster than GTX980Ti. And I don't know how AMD think to recover if they sell weaker cards without PhysX for same money as NVIDIA.
> And even GeForce GTX980Ti have advantage because better overclocking.
> Fury X should cost 500$. Special *because every normal person will remove stock pump and install waterblock and that's +100/110$ more.*
> Fury X with waterblock worth as reference GTX980Ti.
> 
> I looked yesterday video clips ASUS R9-390X 8GB GTA5 gameplay and picture is better than on GeForce cards.
> Look almost as reality.


LOL... not every normal person will do that. Custom water cooling is not the norm. The Fury X should be $599, the Fury $499, and the Nano should be $499.


----------



## Vlada011

You are right, a watercooling loop is still for the 10% of the population who treat PC gaming hardware as a hobby.
But I only want to say that a Fury X plus a waterblock is, in my eyes, worth as much as a reference GTX 980 Ti.
OK, I would like to try a Radeon again after many years, but the Fury X is too expensive for the performance and fps it offers.
It's much easier for someone to pay the same money for a 980 Ti, OC it 10-15% or 20%, and play with PhysX.
It's not easy when one piece of hardware constantly lacks support for something yet costs the same.
The ASUS R9 390X 8GB Strix is a good card, but there's no waterblock available, and such a card quickly raises the temperature inside the case, making every other component 7-10C hotter.
It costs about 420 euros in my area. Soon, for 600 euros, people will build R9 390X 8GB CF.


----------



## tajoh111

Something seems off with the numbers. At least the original GTX 980 Ti numbers.

I just have a hard time believing a factory overclocked model is 20-25% faster than the original. That's a huge gap, and the real-world clock difference should be about 15%. Basically 1150 vs 1330 or so.

I suspect wizard has multiple machines to bench with and didn't update his GTX 980 Ti drivers.
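The clock sanity-check above can be made explicit. A minimal sketch using the figures from the post (1150 vs 1330MHz), again assuming fps scales at best 1:1 with core clock:

```python
# If fps scaled perfectly with core clock, a 1150 -> 1330 MHz bump
# should buy at most ~15.7% -- noticeably short of the 20-25% gap in
# the charts, which is why the baseline numbers look suspect.

def pct_gain(base_mhz, oc_mhz):
    """Best-case percent fps gain from a core clock increase."""
    return (oc_mhz / base_mhz - 1) * 100

print(f"{pct_gain(1150, 1330):.1f}%")  # 15.7%
```

Anything beyond that ~15.7% has to come from somewhere other than the clocks: drivers, memory OC, or a mismatched test setup.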


----------



## MerkageTurk

I experience a lot of stuttering with my 980 Ti, stock and overclocked, old drivers and new.

With my 290X, GTA V felt so smooth.

I am thinking of going back to AMD.


----------



## white owl

Quote:


> Originally Posted by *MerkageTurk*
> 
> I experience a lot of stuttering with my 980 ti, stock overclock, drivers old vs new
> 
> with my 290x gta v felt so smooth
> 
> I am thinking of amd


Is that with a clean install of Windows? With my 980 I don't get a hint of stutter.
Vsync?
Use RTSS and AB to show your frametimes. If they're going crazy on a clean install, it may be a bad card.


----------



## MerkageTurk

Clean install twice; tried everything from stock, no MSI Afterburner or any other programs.

Even the CPU at stock, to no avail.

The 290X was super smooth; I was amazed, to be honest.


----------



## Vlada011

I tell you, I saw GTA5 on an ASUS R9 390X 8GB Strix and it's like reality.
AMD had better prepare a lot of R9 390Xs, because people will choose those cards, especially in 12 months when they become cheaper.
Especially for CF. With 8GB of video memory they don't need to worry about running out.
Now, at $450 per card, the price of a CF setup is close to a Titan X. And a Titan X is almost as fast as R9 390X CF while being a single card with 12GB of video memory and PhysX support. Without even lower prices, AMD will continue to sink into deeper problems. They need to start offering cards at shockingly low prices to win back old customers. 30% of Nvidia's customers now are ex-Radeon fans who started with Radeon and bought only Radeon for years before Nvidia became so popular and the better option.
How is Radeon supposed to sell a card with pump whine, without PhysX, weaker than the GTX 980 Ti, with less video memory, for the same price?
If they want to sell a bigger share, close to Nvidia's, AMD needs a stronger mid and high segment, ahead by at least 7-8% where no one can spin it as an Nvidia win, and at a cheaper price. Sooner or later every Radeon owner will go to Nvidia's site to download PhysX just to start a game, and sooner or later will figure out that for some bugs only Nvidia customers have a solution via the Control Panel or Nvidia Inspector. Some old games without AA can borrow a profile from another game with AA, all of it only on GeForce, and that sways a percentage of people to buy GeForce next time. Soon AMD's customers will be only people who lost confidence in GeForce or had problems with it, and people who can't afford GeForce. With that customer profile, no one can stay in the market for long.
I watched some games on recent Radeon cards and it's far better than the HD 6970 or HD 7970: much less stuttering.
GeForce image quality is inferior to Radeon in sharpness and long-distance detail, but superior in effects, tessellation, PhysX, 3D Vision, and TXAA.

I swear Nvidia probably optimized their GPUs via drivers to leave out 10% of the detail and cover it with blur, because many customers can't notice such a small difference, and with 10% lower detail and resolution they get smoother gameplay. Maybe I didn't explain it in the best way, but they did something like that, and because of it even fonts and text are sharper on Radeon cards before you even start a game.


----------



## semitope

why do you care about physx so much?


----------



## Vlada011

Quote:


> Originally Posted by *semitope*
> 
> why do you care about physx so much?


I can't explain it... to me it's similar to, for example, a Radeon not being able to show one color.
Simply crippled in one important respect. PhysX is not just an option to make hair or water look better in a game...
They could improve it to the point that games become a joke on Radeon compared to GeForce, if they wanted to.
Imagine you have a Radeon and, next to your driver, you need to download PhysX from Nvidia's site just so the game will start.
For gamers who pay a lot of attention to detail and play one game for a long time, that's very important, and it looks very bad for Radeon owners; the number of Radeon owners is proof of that.

On the other side, Nvidia did some tricky things in their drivers or GPU design, and on their chips resolution and detail look 10% lower than on Radeon.
There's no other explanation for the number of people who notice it, especially when a Radeon owner decides to invest more money in a premium Nvidia card and suddenly sees lower image quality than on a Radeon that's $300-400 cheaper. Right after Windows boots, people notice the difference: fonts are not sharp and clear, there's more blur...


----------



## rickcooperjr

Quote:


> Originally Posted by *Vlada011*
> 
> Fury X is good but it's not faster than GTX980Ti. And I don't know how AMD think to recover if they sell weaker cards without PhysX for same money as NVIDIA.
> And even GeForce GTX980Ti have advantage because better overclocking.
> Fury X should cost 500$. Special because every normal person will remove stock pump and install waterblock and that's +100/110$ more.
> Fury X with waterblock worth as reference GTX980Ti.
> 
> I looked yesterday video clips ASUS R9-390X 8GB GTA5 gameplay and picture is better than on GeForce cards.
> Look almost as reality.


I have noticed that in a lot of games you can tell AMD has the better image quality. Many others have noticed this too, and usually it's in Nvidia GameWorks titles, which throws up a red flag as to why. In Star Wars Battlefront there were a few Fury / Fury X / 390X vs GTX 980 Ti comparisons, and the image quality was better on the AMD cards. I've noticed this a lot; maybe Nvidia is turning down image quality to get better performance. I don't know, but it seems that way.

AMD cards have had better image quality than Nvidia for the past few generations. I don't know why, but even at identical settings the AMD setups are crisper and more vibrant, and seem to have all-around better color representation.

Here's a good thread on exactly this: http://www.overclock.net/t/1462291/amd-vs-nvidia-image-quality


----------



## DrFPS

Quote:


> Originally Posted by *MerkageTurk*
> 
> I experience a lot of stuttering with my 980 ti, stock overclock, drivers old vs new
> 
> with my 290x gta v felt so smooth
> 
> I am thinking of amd


You must have some kind of problem. I'm playing GTA V at 2560x1440 @65 to 75 FPS.

Quote:


> Originally Posted by *rickcooperjr*
> 
> I have noticed in alot of games you can tell AMD has the better image quality many others have noticed this also and usually it is on Nvidia gameworks titles this kind of throws a redflag as of why this is star wars battlefront there was a few of the Fury / fury X / 390x vs GTX 980 TI and well the image quality was better on the AMD cards I have noticed this alot maybe Nvidia are turning down image quality to get better performance I don't know but it seems this way.
> 
> I have noticed AMD cards to have a better image quality for past few gens than Nivida I do not know why even if run identical settings the AMD setups are crisper and more vibrant along with seem to have all around better color representation.
> 
> here is a good thread on just this http://www.overclock.net/t/1462291/amd-vs-nvidia-image-quality


Actually it's Nvidia that looks much better than AMD, by a lot. This is a 290X at 2560x1440 and it looks bad, really bad. Look at the background pop-in, like an Xbox. Left is AMD, right is Nvidia at the same [email protected] IMHO Nvidia is 10x better.


----------



## phenom01

Quote:


> Originally Posted by *MerkageTurk*
> 
> I experience a lot of stuttering with my 980 ti, stock overclock, drivers old vs new
> 
> with my 290x gta v felt so smooth
> 
> I am thinking of amd


No.


----------



## semitope

Quote:


> Originally Posted by *DrFPS*
> 
> You must have some kind of a problem. I'm playing gtav 2560x1440 @65 to 75 FPS
> Actually its Nvidia looks much better than amd, a lot. This is 290x at 2560x1440 looks bad, looks real bad. Look at the background pop up, like an xbox. Left is AMD. Right is Nvidia same [email protected] IMHO Nvidia 10x better.
> 
> 
> Spoiler: Warning: Spoiler!


That's not really how you compare image quality.


----------



## Vlada011

Quote:


> Originally Posted by *rickcooperjr*
> 
> I have noticed in alot of games you can tell AMD has the better image quality many others have noticed this also and usually it is on Nvidia gameworks titles this kind of throws a redflag as of why this is star wars battlefront there was a few of the Fury / fury X / 390x vs GTX 980 TI and well the image quality was better on the AMD cards I have noticed this alot maybe Nvidia are turning down image quality to get better performance I don't know but it seems this way.
> 
> I have noticed AMD cards to have a better image quality for past few gens than Nivida I do not know why even if run identical settings the AMD setups are crisper and more vibrant along with seem to have all around better color representation.
> 
> here is a good thread on just this http://www.overclock.net/t/1462291/amd-vs-nvidia-image-quality


Exactly! Many people tell me I have excellent observation skills; I can always spot counterfeits from originals, in everything, while other people can only tell themselves the picture is the same.
No, the picture is better on Radeon, and ten years ago the difference was even bigger. Many people suspect that's how Nvidia got more fps long ago. If you play at 1680x1050, of course your fps will be better than at 1920x1080; on high settings fps will be better than on ultra. That's how Nvidia got its reputation for smoother, better cards, because most people look at fps; you'd need to use a Radeon for a while and then switch to Nvidia to notice the worse picture, or go from Nvidia to a premium Radeon, and there are very few such people.
I didn't notice better color representation; I actually think the colors are better on Nvidia. But on AMD you can definitely notice 10% more detail at around 10% longer distances, and the same resolution looks 10% higher. If someone wants to tell themselves that's not true, OK, I have nothing against them; it's their money and they can buy what they want. I only say that 20-30% of people, if they removed a GTX 980 Ti from their computer and installed an R9 290X, would notice a better picture even in Windows, maybe even before driver installation. I remember when Nvidia's default resolution without drivers was 1280x1024 while Radeon had 1920x1080, before you installed drivers. If someone doesn't want to pay for a Radeon, they could borrow one and try it; if they think it might look a little better, it's not "maybe", they really do notice it. There are plenty of Radeon users who were used to Radeon, switched to Nvidia, thought something was wrong, installed drivers 50 times, reinstalled the OS and games several times, and still thought the picture wasn't good. They don't need to wonder... it's true.

But as you see, all the brands make their better models GeForce... If I want the best Radeon now, I could buy a reference Fury X or an ASUS R9 390X Strix. Of course I would wait for an ARES 4, but I could only afford half of that card.
On the GeForce side I could choose a Matrix 980 Ti, a 980 Ti Poseidon, a 980 Ti Strix, or even an ASUS Titan X.
The Radeon selection is poor compared to GeForce. The ASUS GTX 980 Ti Matrix is probably about 20% or more stronger than the Fury X; I mean, it's 20% stronger than a reference GTX 980 Ti, so imagine how much better it is than a Fury X.


----------



## mcg75

Do any of you really think AMD and Nvidia aren't cross checking each other constantly in the image department?

We've seen accusations and proof in the past from legit sources which caused it to be fixed.

When AMD themselves come out and provide proof of this then it's time to believe. This would be a far more damaging weapon than slamming Gameworks yet we've heard nothing.

The last set of videos that were "proof" were retracted by the user because he did find his settings were in error. Despite this and his admission of such, the videos were still being provided by others as proof.


----------



## criminal

Quote:


> Originally Posted by *mcg75*
> 
> Do any of you really think AMD and Nvidia aren't cross checking each other constantly in the image department?
> 
> We've seen accusations and proof in the past from legit sources which caused it to be fixed.
> 
> When AMD themselves come out and provide proof of this then it's time to believe. This would be a far more damaging weapon than slamming Gameworks yet we've heard nothing.
> 
> The last set of videos that were "proof" were retracted by the user because he did find his settings were in error. Despite this and his admission of such, the videos were still being provided by others as proof.


Fact.


----------



## vallonen

Quote:


> Originally Posted by *rickcooperjr*
> 
> I have noticed AMD cards to have a better image quality for past few gens than Nivida I do not know why even if run identical settings the AMD setups are crisper and more vibrant along with seem to have all around better color representation.


ATI always had better image quality.


----------



## Vlada011

Quote:


> Originally Posted by *vallonen*
> 
> ATI always had better image quality.


Don't tell them; they are obviously blind.
There are people who can't be sold forgeries because they naturally recognize the original after seeing it once, and other people who will never notice the difference until someone shows them.
In the same way, only a special kind of person is capable of working on aroma selection for perfume, while other people can never recognize any difference.
The image on Radeon is sharper, and that's visible not only in games but in game menus, Windows, even the BIOS POST screen.
It's enough for only 100 people to notice something like that, and immediately every smart person will ask: how do so many people notice similar things if it's not true?
Because it's true.

Look at an ASUS R9 390X and compare it with your Nvidia card; you will see how much sharper and cleaner the picture is. Then search for the same games on a GeForce Titan X or GTX 980 Ti...

https://youtu.be/wrMHX1kqwY8

Some people would rather convince themselves that if Nvidia is expensive, the picture can't be worse, and they believe it.
Nvidia always tries to hide and delete information where people explain that the picture is better on ATI and AMD.
The answer is probably deep in the GPU design; they render graphics in a completely different way, and we know what Nvidia is willing to do just to make benchmarks show better fps. But now that a lot of ex-Radeon customers, used to a clean Radeon picture for years, have started buying GeForce cards with inferior pictures, they can't hide it anymore. And to be fair, Nvidia improved image quality, especially after the GTX 580; before that the difference was much bigger.

AMD has proof of this, and Nvidia is always the loudest and tries to delete posts from people who notice it.
But it's not possible for everyone to notice the difference between, say, high and ultra-high quality.
The difference is 5-10%, but enough for Nvidia to offer smoother fps because there's less detail on screen.
And Nvidia has other advantages, like more realistic effects, while AMD is not great at advertising or at presenting its competitor's weak sides.
I can bet that 10-20% of Nvidia customers notice the same thing and can't say anything; they pretend it's not true, or they see something but aren't sure.

----------



## white owl

Claims ATI produces better image quality...

...backs it up with compressed youtube video.


----------



## hout17

I've tried to hang with AMD, as I do like the cards (when they work), but driver support, at least in my experience, has been subpar to the point that I don't know if I'll give them another shot. It's sad, really. Thank God for AMD's sake that Omega drivers come out every once in a while.


----------



## criminal

Quote:


> Originally Posted by *Vlada011*
> 
> Don't tell them, they are blind obviously.
> There are people who can't buy forged things because naturally have sense to recognize original only when they see once and other people who will never notice difference until someone show them.
> Same as special kind of people is only capable to work on aroma selections for perfume and for other people never can't recognize some difference.
> Image on Radeon is sharper and that is not visible only in games, that's visible in games menu, Windows even BIOS Post Screen.
> Enough is only 100 people to maybe notice something like that and immediately every smart person will ask How so many people notice similar things if that is not true.
> Because It's true.
> 
> Look ASUS R9-390X and compare with your NVIDIA you will see how much is picture sharper and cleaner and search later for same games on GeForce TITAN X and GTX980Ti....
> 
> https://youtu.be/wrMHX1kqwY8
> 
> Some people will rather convince yourself if NVIDIA is expensive than picture can't be worse and they believe.
> NVIDIA always try to hide and delete information where people explain that picture is better on ATI and AMD.
> Answer is probably and deep in graphic processor design, they make graphics on completely different way and we know what is NVIDIA ready to do only to benchmark show better fps. But now when lot of ex Radeon customers who used years on clean Radeon picture start to buy GeForce last years with inferior pictures they can't hide any more like that. And we must be fair NVIDIA improve image quality special after GTX580. Before difference was much bigger.
> 
> AMD have prove for that and NVIDIA is always loudest and try to delete posts of people who notice that.
> But it's not possible to everyone notice difference between high and ultra high quality example.
> Difference is 5-10% but enough to NVIDIA offer smoother fps because less details on screen.
> And NVIDIA have other advantages as more realistic effects and AMD is not great in advertising and presentation of weak side of their competitor.
> I can bet that 10-20% people notice same among NVIDIA customers and can't say nothing they pretend like that's not true or they see something but they are not sure.


LOL. Do you have real proof of what you speak besides a YouTube video?


----------



## Clovertail100

I'd say the 980 Ti is still superior with overclocking.

However, I know it's by a much narrower margin than people would like to believe. NV's "boost" overclocks are very superficial: they'll get you higher frames when frames are already high, but won't do much for you otherwise. I'm sure Maxwell is no exception; we all saw the reviews.

If you're going to look at overclocked results, look at minimums. Ideally, look at a graph of fps over a timeline to see if you're consistently getting higher frames. Boost gets you the most extra frames when you really don't need them, and I'm sorry, but that makes me see it as little more than a benchmark bolsterer. I won't take NV's "performance gains" from overclocking at face value anymore; not since boost showed up.


----------



## GorillaSceptre

Yup, I'd choose the Ti if I were buying right now.

But down the road it may be the same situation as the 290X vs 780 Ti. Unfortunately it's always too little, too late with AMD. Personally I prefer AMD, as they've always delivered what I want, but it doesn't take a rocket scientist to work out why Nvidia wrecks them in sales.

The new Radeon Software shows that they may have finally got a clue.


----------



## Robilar

I went with the 980 Ti because I wanted the fastest card capable of G-Sync...

In my case, even if the AMD card were slightly faster, I still would have chosen Nvidia.

It's nice that they trade blows, and competition drives down prices, but G-Sync is superior to FreeSync. For me that was the deciding factor.


----------



## Klocek001

No, it's not as fast as a 980 Ti; it's as fast as the reference model, which runs at 1220MHz. My 6G Gaming runs 1380MHz out of the box, and it runs TW3 stable at 1510MHz with stock vcore, which is +300MHz over the reference with no vcore bump. This card just takes whatever you throw at it.
I've had dozens of GPUs, the first being a Voodoo AGP, and I can't say any of them even came close to the 980 Ti in OC capability: a 13% OC out of the box and another ~9% with no extra voltage. Gotta say the Fury X has it rough; if the 980 Ti had little to no OC headroom, it would surely be the #1 pick right now.
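For reference, those clock steps (1220 reference -> 1380 out of the box -> 1510MHz manual) work out as follows. A quick sketch; the percentages are core clock only and say nothing about actual fps scaling:

```python
# Clock steps from the post: 1220 MHz (reference) -> 1380 MHz (out of
# the box) -> 1510 MHz (manual OC, stock vcore).

def step_pct(lo_mhz, hi_mhz):
    """Percent clock increase from lo_mhz to hi_mhz."""
    return (hi_mhz / lo_mhz - 1) * 100

print(f"{step_pct(1220, 1380):.1f}%")  # 13.1% factory OC
print(f"{step_pct(1380, 1510):.1f}%")  # 9.4% further, no extra voltage
print(f"{step_pct(1220, 1510):.1f}%")  # 23.8% total over reference
```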


----------



## funfordcobra

This whole argument is moot. There's a reason why there are ZERO Fury Xs in the 3DMark top 100 while it's full of Titan X and 980 Ti scores.

http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/1+gpu

Fanboy AMD all you want; here are the facts.


----------



## Noufel

Why so much hate toward AMD when they finally see a glimpse of light? I have 2x 980 Ti G1 in my main rig and a 390X Tri-X in the second one, and it performs like a charm (and don't bring me the power consumption or temps arguments).
I'd love to see Arctic Islands trading blows with Pascal; it will (I hope) bring prices down.


----------



## Klocek001

AMD rule #1:
*it takes one game benchmark against the reference 980 Ti for the Fury X to be considered as fast as a 980 Ti, but it takes every game benchmark showing at least a 10% difference in Nvidia's favor for the 980 Ti to be considered better than the Fury X. Of course aftermarket cards don't count. Only the 200MHz-slower reference model.

why else would we see a thread like this one?


----------



## rcfc89

Quote:


> Originally Posted by *Desolutional*
> 
> 980 Ti can easily be OCed to 1450MHz core, which blows the Fury X OC (lol, what OC, AMD already gave it the greatest OC in the factory) out of the water.


This is key. Sure, if you're going to keep them at stock clocks the Fury is comparable in some benchmarks. But the reality is that the 980 Ti is capable of giving you a considerable boost in fps with a very stable OC, while the Fury is pretty much tapped out. The performance difference then is pretty substantial. As far as price goes, I picked up my Lightning on Newegg for $750 after rebate and it came with AC Syndicate. For a single card on air it demolishes a Fury X with that annoying AIO cooler.


----------



## ebduncan

Quote:


> Originally Posted by *Robilar*
> 
> I went with the 980ti because i wanted the fastest card capable of g-sync...
> 
> In my case even if the amd card was slightly faster i still would have chosen nvidia.
> 
> Its nice that they trade blows, competition drives down prices but g-sync is superior to freesync. For me that was the deciding factor.


They both have their pros and cons. I wouldn't go and say G-sync is hands down better.
Quote:


> Originally Posted by *funfordcobra*
> 
> This whole argument is moot.. There's a reason why there are ZERO fury X in 3dmark top 100 and its full of titanX and 980Ti scores.
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/1+gpu
> 
> fanboy amd all you want, here are facts.


Actually AMD QuadFire scales better than Nvidia Quad Sli, thanks to XDMA Crossfire. Most benchmarks reflect this given all 4 cards are actually used.


----------



## funfordcobra

Wrong again.

http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/4+gpu

Out of the top 100 4x CrossFire/SLI results there is ONE AMD entry, at #99, and it's quadfire 290Xs lol. Sooo yea, all the rest are 780 Ti/980/980 Ti/Titan X.


----------



## looniam

Quote:


> Originally Posted by *rdr09*
> 
> it might reoccur. lol


naw, big difference in the reference vrms - or do you mean how it was blown out of proportion for the sake of clickbait...?

still, it's _surprising_ you'd admit looking at a 670 since you were busy posting that BF3 lsd screenshot everywhere.


----------



## ebduncan

Quote:


> Originally Posted by *funfordcobra*
> 
> Wrong again.
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/4+gpu
> 
> Out of top 100 x4 crossfire / SLI there is ONE AMD entry at #99 and its quadfire 290x lol. Sooo yea all the rest are 780ti/980/980ti/TitanX.


http://www.techpowerup.com/forums/threads/an-epic-fury-x-review-quad-fury-x-vs-quad-titan-x.214231/

So yea, I fail to see your point. You are referring to one canned synthetic benchmark. XDMA CrossFire > SLI when it comes to scaling. This was a mix of like 13 different games.


----------



## rdr09

Quote:


> Originally Posted by *looniam*
> 
> naw, big difference in the reference vrms - or you mean how it was blown out of proportion for the sake of clickbait . .? . .?
> 
> still, it's _surprising_ you'd admit looking at a 670 since you were busy posting that BF3 lsd screen shot everywhere.


iirc, 320.18 was not the first time.









oh yah, that lsd. that happened in bf4 as well.


----------



## funfordcobra

Quote:


> Originally Posted by *ebduncan*
> 
> http://www.techpowerup.com/forums/threads/an-epic-fury-x-review-quad-fury-x-vs-quad-titan-x.214231/
> 
> so yea i fail to see your point. You are referring to one canned synthetic benchmark. XDMA Crossfire > SLI when it comes to scaling. This was a mix of like 13 different games.


So would you prefer an ORGANIC benchmark over a SYNTHETIC one? lmao..

3DMark 11 is pretty much the standard that EVERYONE goes for. Sure, you can pick and choose your benches and find ONE that you get a good score in, but you will never consistently beat a Titan X or a 980 Ti with a Fury X.

Anyone can make a graph, but 3DMark compares hundreds of thousands of users online. I can get my 780 Tis to outperform my 980 Tis in Valley; do they actually perform better? Not even close.

When I consistently see the Fury X breaking even with the higher-end Nvidia models I'll give some credit, but it's just not there right now, outside of propaganda from the AMD camp.


----------



## looniam

Quote:


> Originally Posted by *rdr09*
> 
> iirc, 320.18 was not the first time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh yah, that lsd. that happened in bf4 as well.


nope it wasn't; 267.52 toasted a 590 on sweclockers.





after the vrms in 500 series were taking a beating, at least according to the press, folks got mad about the green light program. :\


----------



## ebduncan

Quote:


> Originally Posted by *funfordcobra*
> 
> So would you prefer and ORGANIC benchmark over SYNTHETIC? lmao..
> 
> 3Dmark11 is pretty much the standard that EVERYONE goes for. Sure you can pick and choose yo0ur benches and find ONE that you get a good score at but you will never be able to constantly beat out FX vs TX it 980TI.
> 
> Anyone can make a graph, but 3dmark compares hundreds of thousands of users online. I can get my 780TIs to outperform my 980Tis in valley, do they perform better, not even by 1/2.
> 
> When I constantly see fury X breaking even with the higher end NVidia models ill give some credit, but its just not there now except propaganda from AMD people.


3DMark 11 is a standard benchmark, but it is just that: a benchmark. People play games, not benchmarks. The Fury X in quadfire scales better in most games, thus giving it higher numbers vs quad 980 Ti/Titan X. I just gave you benchmarks of 12 different games, including at different resolutions.

I am not saying the Fury X is faster than the 980 Ti. I am just saying that in quadfire the story changes to favor the Fury cards thanks to better scaling. Single and dual card solutions still favor the 980 Ti/Titan X. And let's face it, an overclocked 980 Ti torches a Fury X in single card configurations.


----------



## Klocek001

PCLab tested TW3 on the newest patch in the new expansion, AMD runs 15.10 here
http://pclab.pl/art66374-10.html
I guess for the AMD side a lot depends on the location they tested, look at this


versus


----------



## funfordcobra

The percentage of quadfire users is so small that it doesn't even matter. You are talking about less than 1% of the PC gaming population. Even trifire is niche at best. The place to compare is 1 or 2 cards.


----------



## guambra

Witcher 3 is an Nvidia biased game.


----------



## Klocek001

Quote:


> Originally Posted by *guambra*
> 
> Witcher 3 is an Nvidia biased game.


if by "runs better on GeForce" you mean it's Nvidia biased, then yes.


----------



## funfordcobra

I have an AMD rig too and always use Nvidia for gaming. The AMD rig got turned into an HTPC.


----------



## st0necold

I love both companies. I say if you disregard clocks and just play the damn games, the Fury X and 980 Ti should be indistinguishable without a monitoring program.


----------



## rickcooperjr

Quote:


> Originally Posted by *st0necold*
> 
> I love both company's. I say if you disregard clocks, and just play the damn games then the Fury X/ 980ti should be indistinguishable without a monitoring program.


This all used to be common knowledge. The problem is that now, in most cases, you've got GameWorks and Nvidia-sponsored titles/engines with proprietary graphics settings. As far as I know, all of AMD's graphics features are open for Nvidia to use, but Nvidia doesn't reciprocate. This biases things against AMD across the board, and with AMD's ~25% market share there isn't much fight AMD can put up against it. But if you run unbiased games/engines, you will notice AMD's image quality is superior to Nvidia's; from my understanding, a lot of that has to do with Nvidia's color compression and texture remapping features.


----------



## Robilar

Quote:


> Originally Posted by *ebduncan*
> 
> They both have their pros and cons. I wouldn't go and say G-sync is hands down better.
> Actually AMD QuadFire scales better than Nvidia Quad Sli, thanks to XDMA Crossfire. Most benchmarks reflect this given all 4 cards are actually used.


I didn't say it was "hands down better"; I indicated it was superior, and it is. I've tried both and found the G-Sync array to be more consistent in performance. Mind you, G-Sync is more expensive overall, so that is to be expected.


----------



## semitope

Quote:


> Originally Posted by *Klocek001*
> 
> PCLab tested TW3 on the newest patch in the new expansion, AMD runs 15.10 here
> http://pclab.pl/art66374-10.html
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I guess for the AMD side a lot depends on the location they tested, look at this
> 
> 
> versus


HBAO+ should be off; it's still Nvidia tech that runs better on Nvidia hardware. Hard to tell whether SSAO or HBAO+ looks better.
Quote:


> Originally Posted by *Klocek001*
> 
> if by runs better on geforce you mane it's nvidia biased then yes.


means it has nvidia code in there
Quote:


> Originally Posted by *Robilar*
> 
> I didn't say it was "hands down better", I indicated it was superior and it is. I've tried both and found the g-sync array to be more consistent in performance. Mind you g-sync is more expensive overall so that is to be expected.


G-Sync is not better. FreeSync is arguably better because it lacks the latency from the module, but a comparison between identical monitors using either tech should show very similar performance when FreeSync is on.


----------



## Robilar

Ah so you also have both setups and have tested them with matching games? Would love to see rig pics of both.


----------



## semitope

Quote:


> Originally Posted by *Robilar*
> 
> Ah so you also have both setups and have tested them with matching games? Would love to see rig pics of both.


based on the tomshardware comparison, from what I remember. It's only "inferior" when it's not actually running, i.e. it depends on the monitor and its range.


----------



## Roboyto

Quote:


> Originally Posted by *Proxish*
> 
> I'd love to know how an overclocked R9 290 & R9 290X holds up against an overclocked GTX 970 both with the latest drivers.


It's not the latest drivers, but I did this back in January: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320


----------



## silent-circuit

Quote:


> Originally Posted by *iLeakStuff*
> 
> You are aware that GTX 980Ti Lightning is an overclocked GTX 980Ti that is expensive as hell right? Thats why it is faster
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, the Fury X launch was not ideal, thats for sure. But man have AMD recovered. Availability on the market and yields are now ideal. Drivers are better than Nvidia. Rumors about Omega drivers lurking around in the corner bringing who knows what to the Fury X card. I hope they unlocked voltage and improved overclocking. Any additional performance gains will for sure make the Fury X card a no brainer vs GTX 980Ti.
> To me Fury X is already a better buy right now though.


That's one heck of a jump. No brainer? You can get a 980 Ti for $620 from Gigabyte with their WindForce cooler and a small factory OC (1102MHz). This can likely be pushed further with a few minutes of tweaking, even without touching voltage. Maybe not quite to the Lightning's ~1200MHz clocks, but close. In many cases people get higher than 1200MHz -- I've seen 1400MHz on air, sometimes higher. Cheaper, no rebate to deal with, no pump issues, no having to mount a radiator... I fail to see how that makes any sense.


----------



## Klocek001

Quote:


> Originally Posted by *semitope*
> 
> HBAO+ should be off. still nvidia tech that runs better on nvidia hardware. Hard to tell if ssao or hbao+ looks better
> means it has nvidia code in there
> g-sync is not better. freesync is arguably better because it lacks the latency from the module, but the comparison between identical monitors using either tech should show very similar performance when freesync is on


I don't think people should turn off HBAO+ on Nvidia cards just to make it even for AMD. You don't see people claiming AotS should be run without async; quite the contrary, they demand more. If you're comparing $650 cards, I think HBAO+ should be on.


----------



## ku4eto

Quote:


> Originally Posted by *Klocek001*
> 
> I don't think people should turn off hbao+ on nvidia cards just to make it even for amd. you don't see people claiming aots should be done without async, quite the contrary, they demand more. if you compare $650 cards I think hbao+ should be on.


Actually, AotS runs without async compute on Nvidia cards... just because it completely sucks with it. The same should go for TW3 and HBAO+.


----------



## Klocek001

Quote:


> Originally Posted by *ku4eto*
> 
> Actually, AOTS runs without Async Compute on nVidia cards...just because it completly sucks with it. Same should be with TW3 and HBAO.


Yes, but that doesn't reduce visual quality. Nvidia cards still do the post-processing features in AotS, just without the help of parallel compute. Turning off GW does reduce image quality. I'm all for having GW turned off on mid-range cards to see which one, e.g. a 390 or a 970, will let you run 60fps without GW. But on $650 cards most of GW should probably be enabled; if you spend that much on a GPU, you can expect it to run with most of them on. People who claim GW is a way of gaining an edge over AMD may be right, but why do they fail to see that most GW features have a very nice impact on visual quality? (E.g. check out FC4 with HBAO+ and Enhanced Godrays vs SSAO/SSBC and no godrays, then tell me there's no difference.)


----------



## Szaby59

Exactly... And nobody will care "how" it is achieved if the results/performance are the same or even better...

Recent AOTS perf. test


----------



## white owl

Quote:


> Originally Posted by *Klocek001*
> 
> yes but it doesn't reduce visual quality. Nvidia cards still do the post processing features in aots, just without the help of parelell compute. Turning off gw will reduce image quality. I'm all for having GW turned off on mid range cards to see which one, e.g. 390 or 970 will let you run 60fps without GW. But on $650 cards - most of GW should probably be enabled, I mean if you spend so much on a GPU you can expect it to run with most of them enabled. People who claim GW is a way of gaining the edge over AMD may be right, but why do they fail to see that most GW features have a very nice impact on visual quality (e.g. check out FC4 with HBAO+ and Enhanced godrays vs SSAO/SSBC and no Godrays, now tell me there's no difference)


Agreed.
HBAO+ does a lot to make games pop; they look pretty flat without it.
The fur looks fine without HairWorks, but it's one of my favorites.


----------



## Klocek001

The thing is, I treat GW as an extra feature, which is always nice to have. I'd really like games like FC3 to get a few of them, even in post-release patches, so I could replay at better quality than on my first playthrough in 2013.
I really like most of them, though I agree that running them all might be excessive in most cases. But buying a top-tier card, I expect, even demand, that the GPU manufacturer let me use most of these extra features.
When I had a 290 at 1100/1600 with a 4.9GHz i5 2500K @1080p I couldn't even use Enhanced Godrays or HBAO+ alone, let alone both. On a 980 I could use both even though I was already running @1440p (still talking FC4 here). That's the degree to which Nvidia crushes AMD with GW features.


----------



## semitope

Quote:


> Originally Posted by *white owl*
> 
> Agreed.
> Games HBAO+ does alot to make the game pop. It looks pretty flat without it.
> The fur looks fine without hairworks but it one of my favorites.


The game looks flat without ambient occlusion, yes. But the difference between SSAO and HBAO+ is hard to tell in motion; sometimes SSAO looks better to me and sometimes HBAO+ does.
Quote:


> Originally Posted by *Klocek001*
> 
> yes but it doesn't reduce visual quality. Nvidia cards still do the post processing features in aots, just without the help of parelell compute. Turning off gw will reduce image quality. I'm all for having GW turned off on mid range cards to see which one, e.g. 390 or 970 will let you run 60fps without GW. But on $650 cards - most of GW should probably be enabled, I mean if you spend so much on a GPU you can expect it to run with most of them enabled. People who claim GW is a way of gaining the edge over AMD may be right, but why do they fail to see that most GW features have a very nice impact on visual quality (e.g. check out FC4 with HBAO+ and Enhanced godrays vs SSAO/SSBC and no Godrays, now tell me there's no difference)


Using SSAO instead of HBAO+ does not necessarily decrease visual quality.

SSAO vs HBAO+ in witcher 3

http://www.geforce.com/whats-new/guides/the-witcher-3-wild-hunt-graphics-performance-and-tweaking-guide#the-witcher-3-wild-hunt-nvidia-hbao-plus

Sometimes one looks better, sometimes the other. But there is a definite performance penalty.


----------



## Ashura

WCCFTECH


----------



## ku4eto

Quote:


> Originally Posted by *semitope*
> 
> The game looks flat without ambient occlusion, yes. But the difference between ssao and hbao+ is hard to tell in motion and sometimes ssao looks better to me while sometimes HBAO+ looks better..
> Using SSAO instead of HBAO+ is not necessarily decreasing visual quality.
> 
> SSAO vs HBAO+ in witcher 3
> 
> http://www.geforce.com/whats-new/guides/the-witcher-3-wild-hunt-graphics-performance-and-tweaking-guide#the-witcher-3-wild-hunt-nvidia-hbao-plus
> 
> sometimes one looks better sometimes the other looks better. But there is a definite performance penalty.


The difference in visual quality between HBAO+ and SSAO is... truly small. Yet performance tanks hard on both vendors (but mainly AMD, because of overused tessellation).

One good thing I saw with the new 15.11 beta drivers is that tessellation is set to AMD Optimized in the 3D application settings by default.


----------



## mcg75

Quote:


> Originally Posted by *ku4eto*
> 
> The difference between visual quality on HBAO and SSAO is.... truly low. Yet the performance tanks hard on both vendors (but mainly AMD, cuz of overused Tesselation).
> 
> One good thing with the new 15.11 Beta drivers that i saw is that Tesselation is AMD Optimized in the 3D Application settings by default.


http://www.hardocp.com/article/2015/01/12/far_cry_4_graphics_features_performance_review/4#.Vj_BGvmrTIE

There are plenty of examples out there, such as this one, showing that HBAO+ does not penalize AMD cards.

IMHO, it's the only truly useful feature of Gameworks.


----------



## ku4eto

Quote:


> Originally Posted by *mcg75*
> 
> http://www.hardocp.com/article/2015/01/12/far_cry_4_graphics_features_performance_review/4#.Vj_BGvmrTIE
> 
> There are plenty of examples out there that HBAO+ does not penalize AMD cards such as this one.
> 
> IMHO, it's the only truly useful feature of Gameworks.


When you compare HBAO+ vs SSAO vs off, you can clearly notice the difference versus OFF, but you have to look specifically at the details to know whether it's HBAO+ or SSAO. The difference can hardly be seen unless you know what you're looking for; the general look is the same, since most of the time you're focused on one specific point instead of taking in the scenic view. The performance/quality balance is still up to the developers, although AMD CCC now has an AMD Optimized tessellation option in the 3D application settings (by default), and you can still reduce the AA used manually.


----------



## Klocek001

Quote:


> Originally Posted by *mcg75*
> 
> http://www.hardocp.com/article/2015/01/12/far_cry_4_graphics_features_performance_review/4#.Vj_BGvmrTIE
> 
> There are plenty of examples out there that HBAO+ does not penalize AMD cards such as this one.
> 
> IMHO, it's the only truly useful feature of Gameworks.


This, godrays and depth of field. HW is great in some scenes, not so much in others.
I have no idea what soft shadows do.


----------



## Proxish

Quote:


> Originally Posted by *Roboyto*
> 
> It's not the latest drivers, but I did this back in January: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320


Thanks for the link, I appreciate it.


----------



## neurotix

1. I'm sick of this "people play games not benchmarks" crap. What is the basis for this? If a card gets a certain score in a benchmark, you can expect a certain level of performance from it in gaming. 3dmark even has various numbers from their own internal tests giving scores for you to compare to, such as "Low End Laptop", "4K Gaming PC" and so on. I'm an enthusiast benchmarker and pretty highly ranked on HWBOT. You give me a Fire Strike score, I can probably tell you what games it will run and what resolution is appropriate. The game performance generally reflects the benchmark performance and vice versa.









2. As much as I love AMD and want them to succeed, unfortunately the Fury X (and maybe the Fury) are failures. Improved benchmark scores matter little when everyone looking to buy a card in that tier already likely has a 980 or 980ti. I would never recommend the Fury X over the 980ti if someone has the money for it because of one simple fact: 64 ROPs vs 96 ROPs. This is why the 980ti can blow away the Fury X at lower resolutions, it has a much greater pixel fill rate. The Fury X should have had 128 ROPs, or perhaps 96 to at least be competitive with the 980ti. On top of that, the Fury X is an abysmal overclocker. This means you can't really squeeze any extra performance out of it. Stock to stock, the 980ti is already faster than the Fury X due to ROPs as stated. But it also overclocks +300-400mhz and once you do that, it surpasses the Fury X, which can't be overclocked to keep up. On top of that, it seems like nearly every other game has *Nvidia GameWorks™* which gimps pretty much every major release on AMD cards. I don't really remember the last time a game was optimized for AMD, but I don't buy and play every game that comes out. It seems like maybe Tomb Raider (2013) and Sleeping Dogs (2012?) are the last games I remember that actually ran better on AMD.
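The ROP argument above boils down to a fill-rate calculation. A rough sketch (ROP counts are from the post; the clock figures are approximate reference/typical values, not measurements):

```python
# Back-of-the-envelope pixel fill rate: ROPs * core clock, in Gpixels/s.
def fillrate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0

fury_x     = fillrate_gpix(64, 1050)   # Fury X: 64 ROPs, ~1050 MHz
ti_stock   = fillrate_gpix(96, 1000)   # 980 Ti: 96 ROPs, ~1000 MHz boost
ti_overclk = fillrate_gpix(96, 1400)   # 980 Ti with a strong manual OC

print(f"Fury X {fury_x:.1f}, 980 Ti {ti_stock:.1f}, 980 Ti OC {ti_overclk:.1f} Gpix/s")
```

On these figures the 980 Ti has over 40% more raw pixel throughput even at stock, and overclocking widens the gap; the Fury X can't claw that back because its own headroom is so small.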

If I'm going to spend my money on two $600 cards this year (or possibly next), I want a card that isn't gimped in almost every game and that actually overclocks well. I've used AMD exclusively since the 4670, and then a 6870 and I'm tired of this crap. (Btw, I was lucky to overclock the core on the 6870 by a measly 100mhz- situation hasn't changed!) I'm about the biggest AMD fanboy and diehard that you'll meet but even I'm convinced the ship is sinking fast, and turds like the Fury X do nothing to help the cause. Strongly considering spending my money on Team Green this time. I already ditched my AMD processor for Intel after being sick of getting trash benchmark scores and unexplainable huge FPS dips in nearly every game. (How's Nvidia Surround nowadays?)


----------



## Farih

Quote:


> Originally Posted by *neurotix*
> 
> 1. I'm sick of this "people play games not benchmarks" crap. What is the basis for this? If a card gets a certain score in a benchmark, you can expect a certain level of performance from it in gaming. 3dmark even has various numbers from their own internal tests giving scores for you to compare to, such as "Low End Laptop", "4K Gaming PC" and so on. I'm an enthusiast benchmarker and pretty highly ranked on HWBOT. You give me a Fire Strike score, I can probably tell you what games it will run and what resolution is appropriate. The game performance generally reflects the benchmark performance and vice versa.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2. As much as I love AMD and want them to succeed, unfortunately the Fury X (and maybe the Fury) are failures. Improved benchmark scores matter little when everyone looking to buy a card in that tier already likely has a 980 or 980ti. I would never recommend the Fury X over the 980ti if someone has the money for it because of one simple fact: 64 ROPs vs 96 ROPs. This is why the 980ti can blow away the Fury X at lower resolutions, it has a much greater pixel fill rate. The Fury X should have had 128 ROPs, or perhaps 96 to at least be competitive with the 980ti. On top of that, the Fury X is an abysmal overclocker. This means you can't really squeeze any extra performance out of it. Stock to stock, the 980ti is already faster than the Fury X due to ROPs as stated. But it also overclocks +300-400mhz and once you do that, it surpasses the Fury X, which can't be overclocked to keep up. On top of that, it seems like nearly every other game has *Nvidia GameWorks™* which gimps pretty much every major release on AMD cards. I don't really remember the last time a game was optimized for AMD, but I don't buy and play every game that comes out. It seems like maybe Tomb Raider (2013) and Sleeping Dogs (2012?) are the last games I remember that actually ran better on AMD.
> 
> If I'm going to spend my money on two $600 cards this year (or possibly next), I want a card that isn't gimped in almost every game and that actually overclocks well. I've used AMD exclusively since the 4670, and then a 6870 and I'm tired of this crap. (Btw, I was lucky to overclock the core on the 6870 by a measly 100mhz- situation hasn't changed!) I'm about the biggest AMD fanboy and diehard that you'll meet but even I'm convinced the ship is sinking fast, and turds like the Fury X do nothing to help the cause. Strongly considering spending my money on Team Green this time. I already ditched my AMD processor for Intel after being sick of getting trash benchmark scores and unexplainable huge FPS dips in nearly every game. (How's Nvidia Surround nowadays?)


Very much agree with your first point, but in your second point, saying AMD cards have never been good at overclocking is wrong tbh.

My 4890 went from 850mhz to 1010mhz
My 5870 went from 850mhz to 1050mhz
My 260x went from 1050mhz to 1335mhz (and 2075mhz on memory !)
My 270x went from 1000mhz to 1362mhz
My 7950 went from 860mhz to 1125mhz
My 290x went from 1000mhz to 1220mhz

What's so bad about this? These seem like pretty good overclocks to me (the 260X and 270X more than good).


----------



## HZCH

Well, I might agree with @neurotix. I've seen those tests, and I was thinking it's great for AMD to show a Fury X able to beat a 980 Ti, but...

All of this is useless: the enthusiast crowd is unrealistically drooling over the next GPU generation, and has already bought a 980 (I did) or a 980 Ti, because it was better back when it mattered, i.e. when you wanted to upgrade your PC.

You can't say "Hey, hopefully our team showed it might be best after all" almost a year after failing to win the competition. As with any company that has repeatedly failed to deliver on its promises, I don't think AMD will win back the trust of the people who unfortunately count, I mean the investors. From a financial point of view (a.k.a. as someone too poor to invest), I couldn't care less about that news.

At least Fury X owners are happy now?


----------



## neurotix

Quote:


> Originally Posted by *Farih*
> 
> Very much agree with your first point but in your 2nd point mentioning AMD cards never been good at overclocking is wrong tbh.
> 
> My 4890 went from 850mhz to 1010mhz
> My 5870 went from 850mhz to 1050mhz
> My 260x went from 1050mhz to 1335mhz (and 2075mhz on memory !)
> My 270x went from 1000mhz to 1362mhz
> My 7950 went from 860mhz to 1125mhz
> My 290x went from 1000mhz to 1220mhz
> 
> Whats so bad about this, seem like pretty good overclocks to me (260x and 270x even more then good)


You have better clocking cards than I've had.

The best I've gotten is my "golden" 270X that did 1270mhz (84% ASIC). All others did around 1200mhz tops, and even then, a lot weren't stable at all. (E.g. my 290s- top card would do 1200, bottom had to be at 1150.)

This isn't the first time I've seen you post in response to me claiming quite high clock speeds on AMD cards. That's fine that you do, but please realize it isn't the general rule. Most people are lucky to even get 1200mhz on most AMD cards even with the voltage slider maxed without seeing artifacts or crashing. I've even seen some unfortunate people who got 290s that couldn't even do 1150mhz in the 290 thread and various other places on the forum.

GCN overclocks much better in general than older cards (e.g. my 6870 and the 4870x2 I had). However, nobody can deny that the Fury X (and the Fury??) overclock like crap. Worse, they were advertised as being "excellent overclockers" in a post-Maxwell world.

I was basically comparing the overclockability of newer GCN cards (esp. Fury X) to that of Maxwell, which overclocks extraordinarily well, as we all likely know.

I have fun overclocking my cards, I have bought and resold AMD cards just to bench them, and I love running 3D benches on AMD cards, but truth be told, AMD cards don't overclock worth a damn. It's kinda always been this way. I recall people easily hitting 1300MHz+ on 680s, 770s and 780s, even on air. If I'm forking out $500+ for a card, I want it to overclock. Not just 50, 100 or 150MHz. That's what we all want, right? AMD failed to deliver this with the Fury X. Worse, the 980 Ti overclocks so well that it surpasses the Fury X. Just look at the graphs at the beginning of this thread and compare the 980 Ti Lightning to the Fury X. (According to Newegg the Lightning boosts to 1304MHz, so I assume that's the speed it was at in those graphs, where it surpasses the Fury X by nearly 20%. What happens if it's at 1500MHz while the Fury X is still stuck at 1050MHz or so because you can't overclock it?)


----------



## Farih

Quote:


> Originally Posted by *neurotix*
> 
> You have better clocking cards than I've had.
> 
> The best I've gotten is my "golden" 270X that did 1270mhz (84% ASIC). All others did around 1200mhz tops, and even then, a lot weren't stable at all. (E.g. my 290s- top card would do 1200, bottom had to be at 1150.)
> 
> This isn't the first time I've seen you post in response to me claiming quite high clock speeds on AMD cards. That's fine that you do, but please realize it isn't the general rule. Most people are lucky to even get 1200mhz on most AMD cards even with the voltage slider maxed without seeing artifacts or crashing. I've even seen some unfortunate people who got 290s that couldn't even do 1150mhz in the 290 thread and various other places on the forum.
> 
> GCN overclocks much better in general than older cards (e.g. my 6870 and the 4870x2 I had). However, nobody can deny that the Fury X (and the Fury??) overclock like crap. Worse, they were advertised as being "excellent overclockers" in a post-Maxwell world.
> 
> I was basically comparing the overclockability of newer GCN cards (esp. Fury X) to that of Maxwell, which overclocks extraordinarily well, as we all likely know.
> 
> I have fun overclocking my cards, I have bought and resold AMD cards just to bench them, and I love running 3D benches on AMD cards, but truth be told, AMD cards don't overclock worth a damn. It's kinda always been this way. I recall people hitting 1300mhz+ easily on 680s, 770s and 780s, even on air. If I'm forking out < $500 for a card, I want it to overclock. Not just 50, 100 or 150mhz. That's what we all want, right? AMD failed to deliver this with the Fury X. Worse, the 980ti overclocks so damn well that it surpasses the Fury X. Just look at the graphs at the beginning of this thread and look at the 980ti Lightning as compared to the Fury X (According to Newegg the Lightning boosts to 1304mhz, so I assume that's the speed it was at in the graphs at the beginning, where it surpasses the Fury X by nearly 20%. What happens if it's at 1500mhz but the Fury X is still at 1050mhz or whatever because you can't overclock it?)


Tbh, I think most people don't know how to get the max out of their cards.
I almost always get numbers like that (sometimes you buy a lemon).

Since this topic is now AMD, I'm quoting my high AMD overclocks, but with Nvidia cards it's no different for me really; I've managed high overclocks on them too.

I think the difference is that most people just move sliders and don't get into adjusting voltage and/or BIOS modding, which is key to getting high overclocks IMO.

With the Fury cards you seem to be right though, they don't seem to overclock nicely at all. But that's just one line of cards in their history for me, going all the way back to the 3xxx series; I wasn't much of an overclocker before that.









Another point about how people compare OCs on GPUs nowadays:
they say a 290X clocks poorly compared to, say, a 970, but people then look at base clocks while one of the cards (the 970) has a boost.
When you look at the stock boost speed first and then at your overclocked boost, those overclocks are not that high at all.

A 970 boosts well over 1300MHz stock, some even close to 1400MHz.
Let's take the middle (1350MHz): going from 1350MHz to an overclock of 1550MHz isn't all that big then.
A 390X at 1050MHz to 1200MHz is just as big an overclock as a 970 at 1350MHz to 1550MHz.

But yeah, the Fury line seems to be a poor overclocker, I can agree on that.
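The base-clock-vs-boost-clock point above can be made concrete with a few lines of arithmetic; the clock figures are the ones used in this post, not measurements:

```python
# Compare overclocks as percentage gains over the clocks the cards actually
# run at (boost clock for the 970, base clock for the 390X).

def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage gain over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# GTX 970: ~1350MHz typical stock boost, 1550MHz overclocked boost
print(f"970:  {oc_percent(1350, 1550):.1f}%")   # ~14.8%
# R9 390X: 1050MHz stock, 1200MHz overclocked
print(f"390X: {oc_percent(1050, 1200):.1f}%")   # ~14.3%
```

Measured this way, the two overclocks are within half a percentage point of each other.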









My poorest clocker atm is my 960 (I don't have a Fury yet).
Stock boost 1342MHz, max overclock boost 1512MHz.


----------



## neurotix

Were your cards on water?

I do all my benching using air cooling on my cards, and realistically, there are a TON of gamers out there that just want to play games and don't water cool their cards.

This could explain the difference.

Also, I looked you up on HWBOT and you have 0 points. If you're getting such high OCs why don't you make submissions there?

I mean, considering I have 30 gold cups on HWBOT, I kinda think I know what I'm doing, having been overclocking AMD cards since 2008 and submitting benches to HWBOT since 2011.

Further, of course I mess with the BIOS and adjust voltage.







What are you trying to imply? Not everyone is a novice or an idiot. My HWBOT profile speaks for itself. And again, I'll state that AMD cards (on air) overclock like crap, and Nvidia has always had great overclockers.

All the rest, I agree with. Unfortunately, the Fury X is a bad overclocker. This is why the 980ti really pulls ahead of it.


----------



## Farih

Quote:


> Originally Posted by *neurotix*
> 
> Were your cards on water?
> 
> I do all my benching using air cooling on my cards, and realistically, there are a TON of gamers out there that just want to play games and don't water cool their cards.
> 
> This could explain the difference.
> 
> Also, I looked you up on HWBOT and you have 0 points. If you're getting such high OCs why don't you make submissions there?
> 
> I mean, considering I have 30 gold cups on HWBOT I kinda think I know what I'm doing, also having been overclocking AMD cards since 2008 and submitting benches to HWBOT since 2011.
> 
> Further, of course I mess with BIOS and adjust voltage
> 
> 
> 
> 
> 
> 
> 
> 
> What are you trying to imply? Not everyone is a novice or an idiot. My HWBOT profile speaks for itself.
> 
> All the rest, I agree with. Unfortunately, the Fury X is a bad overclocker. This is why the 980ti really pulls ahead of it.


It's all hobby, no HWBOT needed really, but... maybe I should someday though...
And all these cards are air cooled.

I always try to adjust the BIOS on GPUs, which helps a lot in overclocking, although sometimes hardware on the card can be limiting too, like with my 960; I just can't seem to properly overvolt it.

I have results stored on 3DMark but many benches show 0MHz core and 0MHz memory.








I do have a few though.
This is the 270X


This is the 260X


The rest of the OCs I mentioned are pretty common IMO.

Also, I didn't mean with "most people just moving sliders" that you have no knowledge; it was aimed at people in general, not you personally.


----------



## neurotix

Here's my 270X: http://hwbot.org/submission/2599416_neurotix_3dmark11___performance_radeon_r9_270x_10048_marks

1270MHz, only 400 points lower than yours.

That's off topic though, we should wrap it up.

I'm just gonna state again that I really think your AMD overclocks are extremely lucky and uncommon.

I've only had 1 GCN card out of the probably 10 I've tested and benched that could pass benches above 1250MHz. And that was often after messing with the BIOS and maxing the voltage.









Again, I see a ton of people in the 280X/270X club, Tonga club, 290X club, and 7970 club and most people on air coolers are lucky to exceed 1200mhz regardless of the card. I've followed some of those threads for years now and it's extremely uncommon to see AMD cards at 1300mhz and above. I can basically count the number of times I've seen benches passed at those speeds on air on one hand among all the clubs. And if you don't believe me, go ask the guys in the 280X/270X owners club about common overclocks on air. They know me. I've posted in the thread since it was made.

Anyway, with all due respect, we should probably get back on track discussing the Fury X and how it performs (or fails to perform) against the 980Ti.









EDIT: Hell, you can even look at the owners list in the original post for some of these cards and tell me how many people stated their overclocks were over 1200MHz, let alone 1300MHz. I certainly see a lot of 1120s, 1150s, etc.


----------



## Farih

Quote:


> Originally Posted by *neurotix*
> 
> Here's my 270X: http://hwbot.org/submission/2599416_neurotix_3dmark11___performance_radeon_r9_270x_10048_marks
> 
> 1270mhz, only 400 points lower than yours.
> 
> That's off topic though, we should wrap it up.
> 
> I'm just gonna state again that I really think your overclocks for AMD are basically extremely lucky and uncommon.
> 
> I've only had 1 GCN card out of probably 10 I've tested and benched that could pass benches above 1250mhz. And this was often after messing with BIOS and changing the voltage to the max.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Again, I see a ton of people in the 280X/270X club, Tonga club, 290X club, and 7970 club and most people on air coolers are lucky to exceed 1200mhz regardless of the card. I've followed some of those threads for years now and it's extremely uncommon to see AMD cards at 1300mhz and above. I can basically count the number of times I've seen benches passed at those speeds on air on one hand among all the clubs. And if you don't believe me, go ask the guys in the 280X/270X owners club about common overclocks on air. They know me. I've posted in the thread since it was made.
> 
> Anyway, regards to you, we should probably get back on track discussing the Fury X and how it performs (or fails to perform) against the 980ti.


You shouldn't be hammering on frequency.
What does it matter?
Nothing; it's the performance that counts.
What if one card beats the other at a lower frequency?

Like I said, a 390X at 1050MHz overclocked to 1200MHz is just as big an overclock as a 970 from 1350MHz to 1550MHz.
On average the 390X is about 10% faster than a 970, so what does frequency tell you then?

A stock 7970 is 1000MHz; to 1200MHz is a 20% OC.
A 20% OC on a 970 would mean a boost of 1620MHz, which is more uncommon than a 7970 at 1200MHz.
It's not about frequency but about the overclock as a percentage, and even more important, the percentage of performance gained.
If I had a new card here that performed better than any card in the world today, who would care whether it ran at 800MHz or 2000MHz?
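As a quick sanity check on the percentage argument above (clock figures again taken from the post, not measured):

```python
# The same percentage overclock maps to very different absolute frequencies
# depending on the stock clock, which is why comparing raw MHz is misleading.

def clock_at_oc(stock_mhz: float, oc_fraction: float) -> float:
    """Frequency after applying a fractional overclock to the stock clock."""
    return stock_mhz * (1 + oc_fraction)

print(clock_at_oc(1000, 0.20))  # 7970: 1200.0 MHz
print(clock_at_oc(1350, 0.20))  # 970:  1620.0 MHz, a much rarer result
```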

Hope I explained it right...

And back on topic: yes, I'd still get a 980Ti over a Fury X any day.
Even if I got one for free, I'd probably sell it and try to get a 980Ti.


----------



## neurotix

I know what you mean. This is why my 290s at 1150MHz were as fast as 970s at 1500MHz+. (I can back this up if needed.) Unfortunately they're both gone now and I miss them.









But again, we should probably stop and get back on topic.


----------



## Farih

Quote:


> Originally Posted by *neurotix*
> 
> I know what you mean. This is why my 290s at 1150mhz were as fast as 970s at 1500mhz+. (I can back this up if needed.) Unfortunately they're both gone now and I miss them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But again, we should probably stop and get back on topic.


I think we pretty much agree on everything though.

Nice to be able to have a talk about GPUs without fighting and calling each other names; I thank you for that.


----------



## neurotix

It's all good.


----------



## huzzug

Quote:


> Originally Posted by *Farih*
> 
> Quote:
> 
> 
> 
> Originally Posted by *neurotix*
> 
> I know what you mean. This is why my 290s at 1150mhz were as fast as 970s at 1500mhz+. (I can back this up if needed.) Unfortunately they're both gone now and I miss them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But again, we should probably stop and get back on topic.
> 
> 
> 
> I think we pretty much agree on everything though.
> 
> Nice to be able to have a talk about GPUs without fighting and calling each other names; I thank you for that.

What do you mean, done with the discussion? What do I do with all this popcorn?

OT: I agree with what you said about how it's the performance that matters and not the frequency. And seeing how tech is progressing, we may see a lot of cases where frequencies stay lower but performance increases thanks to advances in uarch and node shrinks.

----------



## yesitsmario

Fury X is now as fast as Titan X @ 4K. AMD cards always seem to catch up eventually.

https://www.techpowerup.com/reviews/ASUS/GTX_980_Ti_Matrix/23.html


----------



## Charcharo

Quote:


> Originally Posted by *Klocek001*
> 
> yeah buying a mid-range card for two years or more is not a fruitful endeavor


The old ATI 5770 handled games from 2015 OK. I played all of GTA V and The Witcher 3 on it. That's a mid-range card that kept working for 6 years. It still works; I just upgraded.
The higher-tier GTX 760 won't do the same, it seems. Hence why people are upset.

Quote:


> Originally Posted by *funfordcobra*
> 
> The percentage of quadfire is soo small that it doesn't even matter. You are talking about less that 1% of pc gaming population. Even trifire is niche at best. The place to compare is 1 or 2 cards.


To be fair, the R9 Fury, 980, 980 Ti, Fury X and Titan X are all cards that less than 1% of the PC gaming population would buy. Technically (and this will rustle some feathers), in the grand scheme of things... they are not relevant.

Not saying comparing them is wrong. It isn't. But let's be realistic here.


----------

