# [Various] AMD Radeon R9 Fury-X Reviews



## TheMentalist

*Articles*

TechPowerUp
PC Perspective
Tom's hardware
Hardware Canucks
OC3D
HardOCP
Guru3D
Bit-Tech
Hexus
VMod
TechFrag
Hardware FR
Techreport
PCWorld
PC Gamer
Forbes
Hispazone
JagatReview
Hardware.info
Sweclockers
Tweaktown
SemiAccurate
AnandTech
AnandTech (Benchmarks only)

*Videos*

PC Perspective
OC3D
LinusTechTips
Hardware Canucks
JayzTwoCents

*Quick Overview*


Spoiler: Specifications

Spoiler: Product Images

Spoiler: Various 4K Benchmark Images

Spoiler: Various 1440p Benchmark Images

_One thread for various reviews of this Glorious card; post new reviews in the comments and I'll add them here._
_Let's keep it friendly, folks._


----------



## Kane2207

I don't get the OC3D review. I've looked through most of the benches and it looks like it's being kicked around by a 980 in some scenarios, yet they sing its praises at the end and give it a gold award?

Maybe I just expected more, I dunno...


----------



## Mozz13

LTT youtube review is up. https://www.youtube.com/watch?v=-CDFNOTZy8o


----------



## BackwoodsNC

pcper is up


----------



## Alatar

TPU: http://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html


----------



## Serandur

The OC3D review pretty much says the 980 Ti is superior. I wonder if pcper say differently.


----------



## CasualCat

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested


----------



## DMHernandez

http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed


----------



## BigMack70

This card is nothing more than AMD playing _"me too!!!!!"_ but with less VRAM, no HDMI 2.0, and a few months late to the party ....









Anyways on a more positive note... hope you guys who buy the card enjoy it!


----------



## Serandur

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested


Only 1155 MHz overclock, pcper seem disappointed in its OC capabilities.
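For context, a rough sketch of the headroom being discussed, assuming the Fury X's 1050 MHz reference clock (1155 MHz is PCPer's maximum stable result):

```python
# Back-of-the-envelope overclocking headroom.
# Assumes the Fury X's 1050 MHz reference clock; 1155 MHz is PCPer's result.
stock_mhz = 1050
oc_mhz = 1155

headroom_pct = (oc_mhz - stock_mhz) / stock_mhz * 100
print(f"{headroom_pct:.1f}% overclock headroom")  # prints "10.0% overclock headroom"
```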


----------



## XxOsurfer3xX

Mmmm, disappointing, but the OC3D review is not the best, no OC and not all games tested @ 4K which is where this card should pull ahead...


----------



## BigMack70

Quote:


> Originally Posted by *Serandur*
> 
> Only 1155 MHz overclock, pcper seem disappointed in its OC capabilities.


Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.


----------



## criznit

Well looks like I will be returning my monitor and getting a 1440p gsync ips monitor instead


----------



## Shion314

Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> Mmmm, disappointing, but the OC3D review is not the best, no OC and not all games tested @ 4K which is where this card should pull ahead...


Seems it is still behind even at 4K. Disappointing to me so far. Hopefully drivers or something can bring it up closer to the 980 Ti.


----------



## Serandur

Quote:


> Originally Posted by *BigMack70*
> 
> Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.


They also mentioned pump noise being annoying, even on idle:

"One thing that is not shown in this graph though is the high pitched whine that is present 100% of the time with our review sample powered up. The sound is clearly part of the pump mechanism and I know from discussions with other reviewers that this is a common problem in the launch samples. AMD addressed this to me in an email, stating that the issue was limited to the initial batch of engineering samples and that the issues had "been resolved and a fix added" for all production parts going on sale to the public. Obviously we'll have to wait for reports from the field to verify that.

I will say that installing the R9 Fury X into a chassis, or even just putting a piece of cardboard between the open test bench and my head resulted in the sound nearly being imperceptible. Still, this is a concern worth keeping an ear on."

Not sure what to make of this.


----------



## Alatar

My personal opinion of this card after looking at the performance numbers and considering the memory amount is that it should be priced at $599 and then it'd be at the perfect spot for the average high end buyer.

Only real exception being SFF where even higher pricing would be a non issue.


----------



## LancerVI

980 ti time.

Glad I waited.


----------



## CasualCat

Quote:


> Originally Posted by *Serandur*
> 
> Only 1155 MHz overclock, pcper seem disappointed in its OC capabilities.


I was disappointed PCPer didn't have OC charts given how thorough everything else was.


----------



## Im Batman

Eh points for trying, at least we have options now within the 980ti / Fury price bracket.


----------



## Kane2207

On the plus side - I only have one thread to subscribe to now


----------



## joeh4384

Seems they need to unlock the voltage. 100MHz on stock voltage is really good for an AMD card.


----------



## rt123

Quote:


> Originally Posted by *BigMack70*
> 
> Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.


Stock Voltage.


----------



## maarten12100

Quote:


> Originally Posted by *rt123*
> 
> Stock Voltage.


I hope so; otherwise it's time to pick up the pen and write some angry but polite emails to AMD (which they will once again not respond to) about how they painted an unreal picture of "an overclocker's dream".


----------



## Serandur

Quote:


> Originally Posted by *rt123*
> 
> Stock Voltage.


Stock voltage seems to be the only option. I'm confused, did AMD lock down the core voltage?


----------



## TAr

This card has no hdmi 2.0?


----------



## Serandur

Quote:


> Originally Posted by *TAr*
> 
> This card has no hdmi 2.0?


Nope, confirmed not to by AMD directly.


----------



## SKYMTL

Quote:


> Originally Posted by *Shion314*
> 
> Seems it is still behind even at 4K. Disappointing to me so far. Hopefully drivers or something can bring it up closer to the 980 Ti.


In our testing it actually pulled pretty even with the GTX 980 Ti at 4K but lost at 1440P. Odd but it could be HBM's advanced algorithms kicking in....or just the fact that the GTX 980 Ti.


----------



## XxOsurfer3xX

Quote:


> Originally Posted by *maarten12100*
> 
> I hope so; otherwise it's time to pick up the pen and write some angry but polite emails to AMD (which they will once again not respond to) about how they painted an unreal picture of "an overclocker's dream".


Yup, they definitely dropped the ball on this one; they didn't learn anything from the Bulldozer fiasco. If you create a lot of hype, you've got to deliver, or people are going to be pissed.


----------



## BoredErica

Lol @ all the people trying to be first with their news stories.

It's FINALLY OUT.


----------



## Tivan

Quote:


> Originally Posted by *Kane2207*
> 
> I don't get the OC3D review. I've looked through most of the benches and it looks like it's being kicked around by a 980 in some scenarios, yet they sing its praises at the end and give it a gold award?
> 
> Maybe I just expected more, I dunno...


Not sure, but he mentions he's using catalyst 15.5 in the video, he either mixed something up there or is using the wrong driver. lel


----------



## Rickles

Oh man, this thing is below a 780 Ti in project cars...


----------



## DFroN

I am disappoint. Waiting to see what happens when the voltage is unlocked. What is the point of a dual bios when everything except core clock is locked?


----------



## SKYMTL

Quote:


> Originally Posted by *Serandur*
> 
> Nope, confirmed not to by AMD directly.


Quote:


> Originally Posted by *TAr*
> 
> This card has no hdmi 2.0?


It has been confirmed by AMD directly. No HDMI 2.0.
Quote:


> Originally Posted by *Serandur*
> 
> Stock voltage seems to be the only option. I'm confused, did AMD lock down the core voltage?


Not necessarily, since none of the tools properly support it right now. Only time will tell.


----------



## GMcDougal

My heart is hurting looking through these reviews. I really thought AMD was going to bring it to the house this time. I will say tho that I think the card itself is very powerful, if not more powerful than the 980ti but the drivers just aren't there.


----------



## rt123

Quote:


> Originally Posted by *maarten12100*
> 
> I hope so; otherwise it's time to pick up the pen and write some angry but polite emails to AMD (which they will once again not respond to) about how they painted an unreal picture of "an overclocker's dream".


I will just shake my head & walk away in disappointment.








Quote:


> Originally Posted by *Serandur*
> 
> Stock voltage seems to be the only option. I'm confused, did AMD lock down the core voltage?


Traditionally it's unlike them, but with HBM thrown into the mix, nothing is guaranteed.


----------



## CasualCat

Quote:


> Originally Posted by *SKYMTL*
> 
> In our testing it actually pulled pretty even with the GTX 980 Ti at 4K but lost at 1440P. Odd but it could be HBM's advanced algorithms kicking in....or just the fact that the GTX 980 Ti.


http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives.html

Thanks for actually including OC charts. Only thing I wish we'd see is OC of at least a couple of the comparison cards as well.


----------



## Valenz

Quote:


> Originally Posted by *LancerVI*
> 
> 980 ti time.
> 
> Glad I waited.


I was so so excited by all the hype AMD was putting out and yesterday I said screw it and went with 2 MSI 980ti gaming cards and I am glad I did because the results are disappointing.

I am sure over time, with better drivers the fury x will be a beast and the air cooled fury will be a great card at a great price , but AMD really needs to get it together.


----------



## Ashura

waiting for kitguru's review....


----------



## toncij

A great card IMHO. Months ago I posted engineering sample results, and it seems it hasn't moved that much ahead, but unlike the engineering sample, which clocked to a max of 1113MHz, the retail card seems to clock better. It is a nice card for the price.


----------



## toncij

Quote:


> Originally Posted by *Ashura*
> 
> waiting for kitguru's review....


You won't see that one soon. He got skipped by AMD.


----------



## Evil Penguin

Eh, I was expecting between 980 Ti and Titan X performance...


----------



## maarten12100

Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> Yup, they definitely dropped the ball on this one; they didn't learn anything from the Bulldozer fiasco. If you create a lot of hype, you've got to deliver, or people are going to be pissed.


Thing is, they didn't really hype it up, and certainly not like Bulldozer; this is GPUs, and those are almost different companies. I just don't understand how you can be proud of something with leading-edge technology losing to competition that is still on old tech.

Unless this thing OCs to the moon, I'm going to call it not as good as expected. It's not even price competitive when priced the same as the 980 Ti, which they planned on, so... the Fury Nano still has some potential, but the top end, unless OC proves me wrong, is a bust.


----------



## Noufel

guru3d








http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,1.html


----------



## Casey Ryback

Quote:


> Originally Posted by *Serandur*
> 
> The OC3D review pretty much says the 980 Ti is superior. I wonder if pcper say differently.


Techpowerup benchmarks show the fury to be highly competitive at 1440p+


----------



## Kane2207

Quote:


> Originally Posted by *toncij*
> 
> A great card IMHO. Months ago I posted engineering sample results, and it seems it hasn't moved that much ahead, but unlike the engineering sample, which clocked to a max of 1113MHz, the retail card seems to clock better. It is a nice card for the price.


37MHz better, hardly 'knocking it out of the park', is it?


----------



## Serandur

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives.html


Oh sweet, they go into detail on Fiji's microarchitecture. It's GCN 1.2 for sure; curious if HBM and the AIO were that essential in keeping power consumption down.


----------



## VSG

I will wait to see if overvolting is allowed and how it scales before buying anything. At this point, the 980Ti looks more favorable for the money to me personally since I would have slapped on a waterblock anyway and non reference versions are incoming with voltage control options.


----------



## $ilent

OCUK has the Fury X up for £509 for preorder, to be delivered on 17th July. Worth noting that about 6 GTX 980 Tis have been dropped to exactly £509 too lol...

No surprises there from Nvidia.


----------



## Casey Ryback

Quote:


> Originally Posted by *maarten12100*
> 
> I just don't understand how you can be proud of something with leading-edge technology losing to competition that is still on old tech.


Because it's an innovation of memory, please stop confusing it with core performance.

HBM doesn't magically make your gpu cores the fastest in the world


----------



## Olivon

http://www.hardware.fr/articles/937-1/amd-radeon-r9-fury-x-gpu-fiji-memoire-hbm-test.html

Really disappointing results. Almost on par with a 980 G1 Gaming at 1440p is a complete wreck.

HBM + AIO + estimated TDP around 385W for this level of performance = Are you nuts AMD ?


----------



## KyadCK

Quote:


> Originally Posted by *BigMack70*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Serandur*
> 
> Only 1155 MHz overclock, pcper seem disappointed in its OC capabilities.
> 
> 
> 
> Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.

Oooor...

And hear me out here...

It's because they have absolutely no voltage control yet. That's stock volts with a power target change on all of them.
Quote:


> Originally Posted by *Serandur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> Stock Voltage.
> 
> 
> 
> Stock voltage seems to be the only option. I'm confused, did AMD lock down the core voltage?

Afterburner and so on simply don't know how to control voltage on them yet. CCC itself does not do voltage. They'll get to it soon.


----------



## Ha-Nocri

This is just what I expected: ~5% slower than the 980 Ti. The GTA5 results seem odd; it's barely faster than a 290X.


----------



## BigMack70

Quote:


> Originally Posted by *Casey Ryback*
> 
> Techpowerup benchmarks show the fury to be highly competitive at 1440p+


Until you factor in overclocking...


----------



## Kane2207

Quote:


> Originally Posted by *Casey Ryback*
> 
> Because it's an innovation of memory, please stop confusing it with core performance.
> 
> HBM doesn't magically make your gpu cores the fastest in the world


Granted, but AMD on the whole has given the impression that elements of the card, including HBM, would give us unrivaled performance - which doesn't appear to be the case.


----------



## DFroN

Quote:


> Originally Posted by *$ilent*
> 
> OCUK has the Fury X up for £509 for preorder, to be delivered on 17th July. Worth noting that about 6 GTX 980 Tis have been dropped to exactly £509 too lol...
> 
> No surprises there from Nvidia.


OCUK had the same Sapphire models for £650 buy-it-now, or £510 if you wait till 17/7. At least they're open about how much you're paying to get it day one.

Scan had them for £520, obviously sold out now.

Good news on the Ti prices.


----------



## maarten12100

Best choice of words in a review of this card yet. It is like the NK-33: innovative, but without the strong production and simulation capabilities the US had, you won't have it land on the moon before the Germans land theirs.


----------



## keikei

Fury X has comparable frames to the comp, but the factors that impressed me would be the load temps and the fact that it is still early tech. The drivers will give the card better performance. I had similar circumstances with the 7970, it only got better with time. Also more options now for gamers. Will there be non-reference Fury X's?


----------



## Casey Ryback

Quote:


> Originally Posted by *Olivon*
> 
> Really disappointing results. Almost on par with a 980 G1 Gaming at 1440p is a complete wreck.
> 
> HBM + AIO + estimated TDP around 385W for this level of performance = Are you nuts AMD ?


Please look through the 1440p benchmarks before trying to suggest AMD belong in a nut house









Also the estimated TDP is actually 275W. You're probably looking at system power consumption................sigh.

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/7.html

From what I can see it beats a 980 consistently, and competes with the 980ti/titan X.


----------



## SKYMTL

I think the question is really simple: why is it that with every AMD launch every one of us reviewers points out driver immaturity? It shouldn't be that way. IMO at least....


----------



## Alatar

The results are whatever, no one should really care from a performance standpoint whether it's 1% slower than 980Ti or 1% faster than Titan X. Everything in that range currently (fury X, 980Ti and titan x) are within 5% of each other (at stock at least).

Since the performance is so similar for reference cards what it comes down to is the extra pros and cons. If you like the cooling solution or the small form factor then Fury is probably pretty good and if you want a normal card with more memory then a 980Ti is solid.

The annoying news here is that there will be no non reference cards for Fury X. Just like with Titans AMD is locking down the design and it's reference only.


----------



## DMHernandez

Quote:


> Originally Posted by *keikei*
> 
> Fury X has comparable frames to the comp, but the factors that impressed me would be the load temps and the fact that it is still early tech. The drivers will give the card better performance. I had similar circumstances with the 7970, it only got better with time. Also more options now for gamers. Will there be non-reference Fury X's?


No, there won't be non-reference Fury Xs; there should be non-reference Furys, though. It remains to be seen whether the regular Fury has the same specs as the X, just with air cooling (I expect a cut-down version or, at least, an underclocked one).


----------



## Slink3Slyde

Disappointing. It's priced the same as a card that beats it and has more VRAM to boot. There's no way to spin that as a win.

Speculating, I know, but I can't see it overclocking as well as Maxwell cards either. The non-X on custom air might be the best value option.


----------



## toncij

Quote:


> Originally Posted by *Kane2207*
> 
> 37MHz better, hardly 'knocking it out of the park', is it?


Well, true. But keep in mind engineering drivers were much worse than current ones. Also, we had issues...


----------



## Casey Ryback

Quote:


> Originally Posted by *BigMack70*
> 
> Until you factor in overclocking...


Yep, that's a big question on everyone's minds. Will voltage control come and when?

It has so much temperature headroom to work with. They are so close to being at the top right now it's actually quite amazing given how much harder it would be with lower funds.


----------



## SKYMTL

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives.html
> 
> Thanks for actually including OC charts. Only thing I wish we'd see is OC of at least a couple of the comparison cards as well.


Sorry. This review was REALLY limited on time, hence why I am still working diligently on the HBM section.


----------



## Serandur

Quote:


> Originally Posted by *SKYMTL*
> 
> I think the question is really simple: why is it that with every AMD launch every one of us reviewers points out driver immaturity? It shouldn't be that way. IMO at least....


It definitely shouldn't be, if that is the case. Fiji's basically a scaled-up Tonga which itself isn't so heavily altered from the GCN AMD introduced in late 2011. There's no excuse for immature drivers and it hurts both AMD and the consumers by muddying up test results and launch performance.


----------



## Georgey123

I thought AMD would have released a card with more grunt after such a long time; a bit disappointed, really. I haven't found any overclocked results yet, though. If you were to put the Fury X and a 980 Ti / Titan X hybrid up against each other, I still think Nvidia would comfortably be ahead. Still looking for some more overclocked benchmarks.


----------



## mav451

Quote:


> Originally Posted by *Im Batman*
> 
> Eh points for trying, at least we have options now within the 980ti / Fury price bracket.


I'm with [H]ardOCP's Brent, in that the technology had to come, but at another price point, where it wasn't necessarily limiting the frame buffer.

Particularly, his conclusions regarding the upgrade in performance over the 290X:
Quote:


> On the other hand, the AMD Radeon R9 Fury X was a major upgrade from the AMD Radeon R9 290X. We saw at times very large performance increases over the R9 290X, depending on the game.
> 
> *The problem was though that the performance increases were very erratic*. There wasn't a predictable percentage of performance increase with the R9 Fury X over the R9 290X. This is because the R9 Fury X is faster in some cases where games use heavy tessellation, since tessellation got a big upgrade in R9 Fury X. However, if a game doesn't use some feature that was beefed up in the R9 Fury X the performance increase over R9 290X is much smaller.


For me, this is hard to accept for a flagship product, particularly when the GPU camp has been waiting for so long.

As for the pump noise? I'm a bit less surprised at that haha.
From PCPer's sound level page:
Quote:


> One thing that is not shown in this graph though is the high pitched whine that is present 100% of the time with our review sample powered up. The sound is clearly part of the pump mechanism and I know from discussions with other reviewers that this is a common problem in the launch samples. AMD addressed this to me in an email, stating that the issue was limited to the initial batch of engineering samples and that the issues had "been resolved and a fix added" for all production parts going on sale to the public. Obviously we'll have to wait for reports from the field to verify that.


Adding all these things up does not form a compelling flagship narrative. And that's an issue, when this is designed to capture mindshare that AMD so sorely needs.


----------



## BinaryDemon

Wow, it's amazing how accurate Nvidia was at judging Fury X performance a month before it was released. The Fury X and 980 Ti are basically a performance match. Do any of these reviews show frametimes? I'd like to see if HBM helps AMD perform more smoothly.

Edit: I see PCPer has frametime graphs, although it's tough to make anything other than a very general estimate from the overlapping lines. Seems comparable. I gotta wonder if Fury X would have done similarly when paired with GDDR5.


----------



## Kuivamaa

Ok, someone bench BF4 with Mantle now; it's still my main game, and the reviews tell me absolutely nothing vs my 290X. Same for DA:I.


----------



## Casey Ryback

Quote:


> Originally Posted by *Alatar*
> 
> The results are whatever, no one should really care from a performance standpoint whether it's 1% slower than 980Ti or 1% faster than Titan X. Everything in that range currently (fury X, 980Ti and titan x) are within 5% of each other (at stock at least).
> 
> Since the performance is so similar for reference cards what it comes down to is the extra pros and cons. If you like the cooling solution or the small form factor then Fury is probably pretty good and if you want a normal card with more memory then a 980Ti is solid.
> 
> The annoying news here is that there will be no non reference cards for Fury X. Just like with Titans AMD is locking down the design and it's reference only.


+1


----------



## BoredErica

Quote:


> Originally Posted by *Alatar*
> 
> The results are whatever, no one should really care from a performance standpoint whether it's 1% slower than 980Ti or 1% faster than Titan X. Everything in that range currently (fury X, 980Ti and titan x) are within 5% of each other (at stock at least).
> 
> Since the performance is so similar for reference cards what it comes down to is the extra pros and cons. If you like the cooling solution or the small form factor then Fury is probably pretty good and if you want a normal card with more memory then a 980Ti is solid.
> 
> The annoying news here is that there will be no non reference cards for Fury X. Just like with Titans AMD is locking down the design and it's reference only.


Hmmm... You mentioned at stock... I'm wondering about overclocked. Time to do more digging.









Must...get...all...the...frames...

Quote:


> Originally Posted by *BinaryDemon*
> 
> Wow, it's amazing how accurate Nvidia was at judging Fury X performance a month before it was released. The Fury X and 980 Ti are basically a performance match. Do any of these reviews show frametimes? I'd like to see if HBM helps AMD perform more smoothly.


Linus mentioned that with past cards. He said that Nvidia already knew the performance of the Fury X. I bet that was true when Nvidia was deciding on the timing and price of the 980 ti.


----------



## rt123

Quote:


> Originally Posted by *Alatar*
> 
> The results are whatever, no one should really care from a performance standpoint whether it's 1% slower than 980Ti or 1% faster than Titan X. Everything in that range currently (fury X, 980Ti and titan x) are within 5% of each other (at stock at least).
> 
> Since the performance is so similar for reference cards what it comes down to is the extra pros and cons. If you like the cooling solution or the small form factor then Fury is probably pretty good and if you want a normal card with more memory then a 980Ti is solid.
> 
> *The annoying news here is that there will be no non reference cards for Fury X. Just like with Titans AMD is locking down the design and it's reference only.*


There were rumors about this, but do you have a proper source about this.?


----------



## maarten12100

With DX12 this card might be a better long-term choice; a counter-argument would be the VRAM capacity.
AMD has more to gain from DX12 than Nvidia, that's for sure.


----------



## Alatar

Quote:


> Originally Posted by *Kuivamaa*
> 
> Ok, someone bench BF4 with Mantle now; it's still my main game, and the reviews tell me absolutely nothing vs my 290X. Same for DA:I.


It's weird how Mantle testing seems to be missing. [H] has their usual Siege of Shanghai tests, but no Mantle.


----------



## joesaiditstrue

Quote:


> Originally Posted by *rt123*
> 
> Stock Voltage.


compared to 980ti OC's on stock voltage.

not even close


----------



## maarten12100

Quote:


> Originally Posted by *rt123*
> 
> There were rumors about this, but do you have a proper source about this.?


Huddy said it in that "interview" where he also talked about the HDMI port not being 2.0.


----------



## FLCLimax

TPU benchmarks.


Spoiler: TPU Benchmarks


----------



## Serandur

Quote:


> Originally Posted by *rt123*
> 
> There were rumors about this, but do you have a proper source about this.?


----------



## iLeakStuff

Hardwarecanucks review. Add it to OP.

They found GTX 980Ti to be 8% faster.


----------



## $ilent

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yep, that's a big question on everyone's minds. Will voltage control come and when?
> 
> It has so much temperature headroom to work with. They are so close to being at the top right now it's actually quite amazing given how much harder it would be with lower funds.


Well, I hope I'm wrong, but I can't see a voltage unlock giving that much more overclocking headroom if these cards can't even manage 50MHz at stock. Heck, even my 970s do almost +150MHz on stock voltage.

Seems to be a common trend with new AMD GPUs that they have already squeezed almost everything out of the GPU before release.

Like I say, I hope I'm wrong.


----------



## Alatar

Quote:


> Originally Posted by *rt123*
> 
> There were rumors about this, but do you have a proper source about this.?


W1zzard mentions it in TPU's conclusion:
Quote:


> Just like Titan X, AMD does not allow any custom variants of the Fury X, but NVIDIA's GTX 980 Ti is being customized by all of their board partners, which means individual products can be targeted a more specific needs of a smaller customer segment.


http://www.techpowerup.com/reviews/AMD/R9_Fury_X/36.html


----------



## Olivon

Quote:


> Originally Posted by *Casey Ryback*
> 
> Please look through the 1440p benchmarks before trying to suggest AMD belong in a nut house
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also the estimated TDP is actually 275W. You're probably looking at system power consumption................sigh.
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X/7.html
> 
> From what I can see it beats a 980 consistently, and competes with the 980ti/titan X.


*TBP* is not equal to *TDP*.
AMD has communicated a 300W TDP for the GPU package only; the complete card limit is around 385W:
Quote:


> AMD does not communicate a TDP nor a PowerTune consumption limit, merely a magic number called "Typical Board Power", which lets them cover their tracks on the Radeon R9 Fury X's consumption level. Besides this 275W TBP, the card has a global consumption limit of +/- 385W.


Quote:


> For Fiji, the principle is similar to Hawaii, except that PowerTune now controls the whole GPU + HBM. Thus, AMD told us that the package-level limit was set at 300W for the R9 Fury X, which corresponds to a total card limit of about 385W. PowerTune still measures the VDDC channel; we do not know whether the rest, including anything regarding the HBM memory, is also measured, or whether it is an estimated parameter that feeds into the overall algorithm.
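A quick back-of-the-envelope check on the gap between those two figures (both numbers are taken from the hardware.fr quotes above):

```python
# Gap between AMD's stated "Typical Board Power" and the PowerTune
# card-level limit reported by hardware.fr.
tbp_w = 275          # Typical Board Power (AMD's figure)
card_limit_w = 385   # global consumption limit (hardware.fr's estimate)

margin = card_limit_w / tbp_w - 1
print(f"PowerTune limit sits {margin:.0%} above TBP")  # prints "PowerTune limit sits 40% above TBP"
```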


----------



## Smanci

Definitely not a bad release, just nothing extraordinary.


----------



## Vesimas

The next thing to do now is wait for the Fury X2 and see how much it costs against 980 Ti SLI.


----------



## CasualCat

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review#.VYqk2vlVhBc


----------



## Rickles

A definite win for SFF, but that power draw is crazy...


----------



## Casey Ryback

Well techpowerup state a peak power consumption of 280W.


----------



## keikei

Now, i want to see the Fury Pro....


----------



## Alatar

Quote:


> Originally Posted by *Vesimas*
> 
> The next thing to do now is to wait the Fury X2 and see how much it cost against a 980 Ti SLI


The age of well priced dual cards was over after the 6990 and 590.

These days if you have room you're better off with two individual cards since the dual versions are priced as niche luxury cards instead of value propositions like they were in the past.


----------



## gigafloppy

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review#.VYqk2vlVhBc


Wow. That conclusion! HardOCP really doesn't like the card: not enough VRAM for 4K, and slower than the 980 Ti.


----------



## Booty Warrior

For a product this late, and with this much hype, I can't find much positive to take away from this launch. Nvidia has held the performance crown virtually uninterrupted for over 2 years now. And while flagships aren't high volume products, they do help define the brand. Perception means a lot.

With water cooling out of the box, HBM and all the extra time they've had to prepare, I was hoping for a lot more. Other than super niche SFF builds, I'm not seeing a lot of incentive to take one of these over a 980 Ti.


----------



## p4inkill3r

I'll be ordering one today.









I feel that it will only get more competitive as driver maturation sets in and being within 8-10% of nvidia's hottest card is good enough for me.


----------



## rt123

Quote:


> Originally Posted by *joesaiditstrue*
> 
> compared to 980ti OC's on stock voltage.
> 
> not even close


Where do you see a 980 Ti OCed on stock voltage?
Quote:


> Originally Posted by *maarten12100*
> 
> Huddy said it in that "interview" where he also talked about HDMi not being 2.0


Quote:


> Originally Posted by *Serandur*


Quote:


> Originally Posted by *Alatar*
> 
> W1zzard mentions it in TPU's conclusion:
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X/36.html


Got it. Thanks guys.


----------



## Valenz

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review#.VYqk2vlVhBc


Wow, seems that the X has problems with GTA V and Battlefield in every review. Very sad news.


----------



## Serandur

Quote:


> Originally Posted by *CasualCat*
> 
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review#.VYqk2vlVhBc


TL;DR: The 980 Ti is superior; *4GB of VRAM is actually limiting Fiji* in some of those weird outlier results like GTA V and Dying Light.


----------



## Ashura

Techreport
http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed


----------



## Kuivamaa

Good card, just like the 980 Ti. From a gamer's point of view it's yet again the same story: check if your favorite titles run better on Green or Red and buy accordingly. The Win 10 standoff will be interesting, but I don't expect any surprises there.


----------



## $ilent

Quote:


> Originally Posted by *Valenz*
> 
> Wow, seems that the X has problems with GTA V and Battlefield in every review. Very sad news.


As with the 290X launch, this is probably down to the drivers. I cannot fathom why AMD does not think it's a good idea to have drivers ready for a big release.

It does seem strange that AMD went with 4GB of RAM when they are surely aiming this GPU at 4K, where 4GB is probably not enough.


----------



## SKYMTL

Quote:


> Originally Posted by *Valenz*
> 
> Wow, seems that the X has problems with GTA V and Battlefield in every review. Very sad news.


Makes you wonder what will happen with Battlefront...


----------



## KyadCK

Quote:


> Originally Posted by *$ilent*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Casey Ryback*
> 
> Yep, that's a big question on everyone's minds. Will voltage control come and when?
> 
> It has so much temperature headroom to work with. They are so close to being at the top right now it's actually quite amazing given how much harder it would be with lower funds.
> 
> 
> 
> Well, I hope I'm wrong, but I can't see a voltage unlock giving that much more overclocking headroom if these cards can't even manage 50MHz at stock. Heck, even my 970s do almost +150MHz on stock voltage.
> 
> Seems to be a common trend with new AMD GPUs that they have already squeezed almost everything out of the GPU before release.
> 
> Like I say I hope I'm wrong.
Click to expand...

They do +100MHz on stock...

1050MHz to 1150MHz in almost every review.


----------



## MegaSkot

R.I.P. Fury


----------



## SKYMTL

Quote:


> Originally Posted by *$ilent*
> 
> As with the 290X launch, this is probably down to the drivers. I cannot fathom why AMD does not think it's a good idea to have drivers ready for a big release.
> 
> It does seem strange that AMD went with 4GB of RAM when they are surely aiming this GPU at 4K, where 4GB is probably not enough.


4GB of HBM is more than enough. I have yet to encounter a situation using more than 4GB of memory that doesn't hit a GPU bottleneck well before memory becomes the limiting factor.


----------



## CasualCat

Quote:


> Originally Posted by *Ashura*
> 
> Techreport
> http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed


Love how they clearly show all their game settings for their benchmarks.


----------



## Vintage

The regular Fury may be a killer card if AIB's are not locked down to a reference design and the core is not much slower than the X.


----------



## Natskyge

Disappointed.


----------



## TheMentalist

Quote:


> Originally Posted by *SKYMTL*
> 
> Makes you wonder what will happen with Battlefront...


Oh boy...it will be fun..


----------



## Blameless

I'm not particularly disappointed. This is more or less what was expected and it's a competitive product.

OCing potential is still rather unknown at this point, and that could be a make or break factor for many of the people here.
Quote:


> Originally Posted by *BigMack70*
> 
> Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.


I would not have expected it to be a great OCer, at least not on stock voltage. If it was, they would have clocked it higher by default and sold it for more money.

Will have to see how it does with more voltage and how the cooling handles the added heat load.
Quote:


> Originally Posted by *SKYMTL*
> 
> In our testing it actually pulled pretty even with the GTX 980 Ti at 4K but lost at 1440P. Odd but it could be HBM's advanced algorithms kicking in....or just the fact that the GTX 980 Ti.


Fury's memory bandwidth advantage could well explain why the gap is smaller at 4k than 1440p.

I'd be surprised if most of these tests were using enough VRAM to make any of AMD's "advanced algorithms" matter.

I'd be even more surprised if NVIDIA didn't have similarly good memory management algorithms, given their greater experience with asymmetrically sized memory channels on parts going back quite a ways as well as with the GTX 970's memory topology.
Quote:


> Originally Posted by *Casey Ryback*
> 
> HBM doesn't magically make your gpu cores the fastest in the world


Indeed.
Quote:


> Originally Posted by *SKYMTL*
> 
> I think the question is really, simple: why is it that with every AMD launch every one of us reviewers points out driver immaturity? It shouldn't be that way. IMO at least....


I agree.
Quote:


> Originally Posted by *mav451*
> 
> Particularly, his conclusions regarding the upgrade in performance over the 290X:
> For me, this is hard to accept for a flagship product, particularly when the GPU camp has been waiting for so long.


Fury seems to be ROP and perhaps TMU limited.

Hawaii and Fiji both have 64 ROPs. In purely fill rate limited scenarios they should perform pretty similarly at similar clocks.

Everyone focuses on SPs, but fill rate still matters. There is a reason why GM200 has 96 ROPs.

If Fury had the 128 ROPs mentioned in early rumors, it would have a dominating lead. With 64 ROPs, that memory bandwidth from HBM is going to be mostly wasted.
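As a back-of-envelope sketch of that fill-rate argument (theoretical peaks only, assuming fill rate = ROPs × core clock; real throughput depends on blending, compression, and the workload):

```python
# Back-of-envelope pixel fill rate: ROPs * core clock.
# These are theoretical peaks; real-world throughput also depends on
# blending modes, color compression, and the rest of the pipeline.
def fill_rate_gpix(rops: int, clock_mhz: float) -> float:
    return rops * clock_mhz / 1000.0  # Gpixels/s

fiji = fill_rate_gpix(64, 1050)        # Fury X as shipped
gm200 = fill_rate_gpix(96, 1000)       # 980 Ti at ~1000 MHz boost (approx.)
fiji_128 = fill_rate_gpix(128, 1050)   # the rumored 128-ROP configuration

print(f"Fiji: {fiji:.1f} Gpix/s, GM200: {gm200:.1f}, 128-ROP Fiji: {fiji_128:.1f}")
```

On those assumptions Fiji's 512 GB/s of HBM is feeding a peak fill rate well below GM200's, which is the sense in which the extra bandwidth goes mostly unused.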


----------



## AmericanLoco

Quote:


> Originally Posted by *$ilent*
> 
> As with the 290X launch, this is probably down to the drivers. I cannot fathom why AMD does not think it's a good idea to have drivers ready for a big release.
> 
> It does seem strange that AMD went with 4GB of RAM when they are surely aiming this GPU at 4K, where 4GB is probably not enough.


The 4GB limit comes from their decision to use HBM. HBM1 is limited in the maximum capacity per stack. If rumors are to be believed, HBM delays are also the reason why this card is launching now instead of 3-4 months ago.


----------



## maarten12100

Do we have any submissions yet from those bench squads that go kingpin-style on their cards? Shattering records would do a great deal of image repair for this not-so-great launch. If it can't do well there, then how is this my dream card?


----------



## $ilent

Quote:


> Originally Posted by *SKYMTL*
> 
> 4GB of HBM is more than enough. I have yet to encounter a situation using more than 4GB of memory that doesn't hit a GPU bottleneck well before memory becomes the limiting factor.


Look at the reviews though. They say 4GB is the limiting factor


----------



## Vesimas

Quote:


> Originally Posted by *Alatar*
> 
> The age of well priced dual cards was over after the 6990 and 590.
> 
> These days if you have room you're better off with two individual cards since the dual versions are priced as niche luxury cards instead of value propositions like they were in the past.


Don't know how to translate the Italian saying, but: hope it does not cost anything







Btw I'll be waiting for Skylake, then I'll decide what to buy (Z170 or X99, AMD or Nvidia, G-Sync or FreeSync)


----------



## maarten12100

Quote:


> Originally Posted by *$ilent*
> 
> Look at the reviews though. They say 4GB is the limiting factor


He is a reviewer though.


----------



## Cyclonic

The biggest question is how it will perform in DX12. Will it get some extra boost because of the HBM, or will it all just stay the same in DX12 games? It would be more future-proof than a 980 Ti, but if you upgrade every year anyway there is no point, and you can buy next year's Nvidia chip with HBM2.


----------



## CasualCat

Quote:


> Originally Posted by *$ilent*
> 
> Look at the reviews though. They say 4GB is the limiting factor


Are they proving it though or is that their assumption? I suspect it is the latter.


----------



## Olivon

http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html

VRM are really hot for a reference AMD card.


----------



## TheMentalist

*Added a poll to the thread, let's see what OCN thinks about this card!*


----------



## criminal

Some results still seem off. Hoping future drivers help with some performance issues. I will definitely wait for the regular Fury before making a decision. I mean Fury X is barely faster than 980 in some situations, so how much worse is the Fury going to perform? Card should be $599 tops. I am disappointed.


----------



## BigMack70

It's really weird they are not letting board partners make custom variants, given that they are competing with the 980 Ti (which allows custom boards) and not the Titan X (which doesn't). Hopefully they will at least bring some voltage control to the table.


----------



## Ha-Nocri

Soon we'll have win10 and dx12 games to check
Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot for a reference AMD card.


Damn, that's Titan X hot. Both 100°C+. Expected more from water-cooled VRMs. With extra voltage it could end badly quickly.


----------



## Rickles

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot completely normal for a reference AMD card.


Fixed that for you.

Wow, and I thought the 980 Ti had some hot VRMs.


----------



## 47 Knucklehead

So basically, after all the wait and all the hype.

Titan X is still king.

Fury X and GTX 980Ti trade blows, for the same price, with most of the games that I would play, the GTX 980Ti edging out the Fury X by 1-2 FPS.


----------



## SKYMTL

Quote:


> Originally Posted by *$ilent*
> 
> Look at the reviews though. They say 4GB is the limiting factor


I did look at the reviews. 4GB isn't a limiting factor.
Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot for a reference AMD card.


*Removes backplate*

*Complains about temperatures*










The backplate allows for proper dissipation through the PCB of some of that heat. Not a great situation but the "heat rises" rule is in effect here.


----------



## PureBlackFire

This card looks very inconsistent: in some titles beating or matching the 980 Ti, and in others falling way behind, where this trend doesn't happen with other AMD cards. Drivers? BIOS? Hardware problems? By my estimate this card should flat out be 100-120% faster than the R9 285/R9 380, and it is in some titles. But what's making this card so slow in others?


----------



## TheMentalist

Quote:


> Originally Posted by *47 Knucklehead*
> 
> So basically, after all the wait and all the hype.
> 
> Titan X is still king.
> 
> Fury X and GTX 980Ti trade blows, for the same price, with most of the games that I would play, the GTX 980Ti edging out the Fury X by 1-2 FPS.


Yep, so true..


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot for a *WATER COOLED* reference AMD card.


Fixed for you.


----------



## Serandur

Quote:


> Originally Posted by *BigMack70*
> 
> It's really weird they are not letting board partners make custom variants, given that they are competing with the 980 Ti (which allows custom boards) and not the Titan X (which doesn't). Hopefully they will at least bring some voltage control to the table.


I think they were trying to establish their own Titan X equivalent and maybe even price it almost as high (like that rumored $850), but that doesn't make sense because it would assume AMD couldn't guess Nvidia had a cut-down GM200 waiting. But then, if they knew that, it doesn't make sense why they tried to establish a Titan-like, no-number moniker and product with reference cooling only. Nothing makes sense.


----------



## maarten12100

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot for a reference AMD card.


I don't even...
How?
There is a waterblock/pipe running over them I don't get it...


----------



## Blameless

Quote:


> Originally Posted by *BinaryDemon*
> 
> Wow it's amazing how accurate Nvidia was at judging Fury X performance a month before it was released.


No it's not.

If they had halfway accurate specs, they could easily have predicted final performance. After all, they do design GPUs themselves and have a firm grasp of what makes them work.
Quote:


> Originally Posted by *Serandur*
> 
> TL;DR: The 980 Ti is superior; *4GB of VRAM is actually limiting Fiji* in some of those weird outlier results like GTA V and Dying Light.


Does seem to be that way.
Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.hardware.fr/articles/937-8/temperatures-nuisances-sonores.html
> 
> VRM are really hot for a reference AMD card.


That is hotter than I would have expected given the copper pipe they have soldered over the VRM on the cooling plate.

Hopefully that's an anomaly with that particular sample. Perhaps loose screws, or a bad solder/TIM interface?

I tighten down all the mounting screws on any GPU I use or test...many have come to me quite loose.
Quote:


> Originally Posted by *47 Knucklehead*
> 
> Fixed for you.


Well, it's not a full cover block. It's a GPU block and a plate with the exit flow directed through a copper pipe that attaches over the VRM area.


----------



## rt123

Quote:


> Originally Posted by *maarten12100*
> 
> Do we have any submissions yet from those bench squads that go kingpin-style on their cards? Shattering records would do a great deal of image repair for this not-so-great launch. If it can't do well there, then how is this my dream card?


No Pro OC results yet.

On the other hand, Tom's Hardware really liked the power consumption & compared it to Maxwell.
Quote:


> We're pleasantly surprised at the power results we recorded. Going in, the assumption was that a larger, more complex GPU would make Hawaii look tame in comparison. But it seems like a combination of HBM, plus a refined approach to PowerTune, domesticates the GCN architecture and, at least under average workloads, helps Fury X put up impressive numbers against GeForce GTX 980 Ti. Even under load, our sophisticated power setup makes it quite clear that Fiji is a different beast entirely. It's only when you launch into a stress test that Fiji really chugs power. You just aren't going to see that when you're playing games, though.


----------



## provost

I will be getting one too for my SFF build, which should be fun. However, I'm not sure if these will replace my main build, consisting of always-crashing, continually-downgrading, sub-optimized (thanks to Nvidia drivers) quad Titans... Lol
But I will wait to see about the voltage unlock or whatnot before making the final determination......


----------



## RagingCain

Quote:


> Originally Posted by *Alatar*
> 
> My personal opinion of this card after looking at the performance numbers and considering the memory amount is that it should be priced at $599 and then it'd be at the perfect spot for the average high end buyer.
> 
> Only real exception being SFF where even higher pricing would be a non issue.


That's if they weren't desperate for market share. This card should be priced $549 or $499.


----------



## MonarchX

Just as I've argued before: VRAM speed (HBM) does not improve performance as much as a good GPU, like the one in the Titan X and 980 Ti. HBM helps in 4K situations, but 4K is still beyond reach for AMD cards because a single Fury X won't sustain 60fps in most games, at least not at Ultra settings. You would need several Fury X cards for 60fps at 4K, but we all know how much CrossFire sucks due to drivers...

All I can hope is that AMD was targeting the cheapo market and never attempted to compete with the Titan X and GTX 980 Ti. However, the Fury X is only about 10% or so faster than an overclocked GTX 980, which costs less than the Fury X and even has HDMI 2.0 support. It makes little sense to buy a Fury X over a GTX 980 at this point.

The 4GB limit is also quite pathetic, considering that some games already go beyond 4GB usage at 1080p! It's as if AMD does not live in the real world, but in the world they wish it would be... I was HOPING that Fury X would be a Titan X rival, and if such were the case, then I could possibly switch from Nvidia to AMD, but after seeing these reviews, I'd have to say: NO WAY!

OC potential at the moment is also pathetic. The cooler could've been MUCH better and could've covered the entire card, including the VRMs and other chips.

Doesn't AMD have some insight into how fast Nvidia cards are going to be? Why not develop and release a GPU that actually COMPETES with Nvidia's top cards?


----------



## criminal

Quote:


> Originally Posted by *PureBlackFire*
> 
> This card looks very inconsistent: in some titles beating or matching the 980 Ti, and in others falling way behind, where this trend doesn't happen with other AMD cards. Drivers? BIOS? Hardware problems? By my estimate this card should flat out be 100-120% faster than the R9 285/R9 380, and it is in some titles. But what's making this card so slow in others?


My thoughts exactly. Something is off. And I am not just saying that because I wanted Fury X to be awesome.


----------



## Kuivamaa

Quote:


> Originally Posted by *Valenz*
> 
> Wow, seems that the X has problems with GTA V and Battlefield in every review. Very sad news.


Quote:


> Originally Posted by *SKYMTL*
> 
> Makes you wonder what will happen with Battlefront...


Nothing. BF4 is meant to be run with Mantle anyway, I doubt it receives any DX11 care at all for radeons.


----------



## Cool Mike

I was ready to press the buy button when available. I have serious doubts now.


----------



## Casey Ryback

Quote:


> Originally Posted by *criminal*
> 
> My thoughts exactly. Something is off. And I am not just saying that because I wanted Fury X to be awesome.


+1

Also, the 390X looks pretty amazing in these reviews. Not sure how it can be performing that much better than a 290X.

Maybe I should just get a 290X, OC it, and be happy with the old price/performance ratio of what is now the midrange.


----------



## zalbard

Quote:


> Originally Posted by *p4inkill3r*
> 
> I feel that it will only get more competitive as driver maturation sets in and being within 8-10% of nvidia's hottest card is good enough for me.


People said the same thing about drivers when Radeon 69x0 came out. Magic drivers are not going to happen.


----------



## PureBlackFire

Quote:


> Originally Posted by *criminal*
> 
> My thoughts exactly. Something is off. And I am not just saying that because I wanted Fury X to be awesome.


I'm thinking it's either memory bandwidth or ROP limited, like Tahiti. It's the only thing that makes sense. This card has an even bigger advantage in SP count and bandwidth over Tonga than Hawaii has over Pitcairn, and yet the performance scaling is worse. Seems like they could benefit from doubling the ROP count.


----------



## BoredErica

Quote:


> Originally Posted by *CasualCat*
> 
> Are they proving it though or is that their assumption? I suspect it is the latter.


http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V

If you look at the frame consistency of the Fury X at 4K in GTA V, it's very bad, and there's even a crazy FPS dip in the benchmark.


----------



## zGunBLADEz

So: poor overclocks, and even poorer overclock scaling. Good out of the box and that's it...


----------



## rt123

Man did Toms get a golden sample..?

*Lower power Consumption than 980Ti...?*


----------



## ZealotKi11er

Disappointed. I have a feeling we got the Fury X now, but the drivers will come in a year.


----------



## batmanwcm

Disappointing to say the least. There was so much hype behind this card. I was planning to make the Fury non-X my next upgrade but I guess I'll just wait it out until Pascal releases unless I get a good deal on a 295x2 w/ a block.


----------



## p4inkill3r

Quote:


> Originally Posted by *zalbard*
> 
> People said the same thing about drivers when Radeon 69x0 came out. Magic drivers are not going to happen.


Except that GCN _has_ matured along with its drivers.


----------



## maarten12100

Quote:


> Originally Posted by *rt123*
> 
> Man did Toms get a golden sample..?
> 
> *Lower power Consumption than 980Ti...?*


Apparently it ramps up power consumption, then drops off. The average is therefore not all bad. Are they maybe using a frame limiter?


----------



## benbenkr

"Overclocker's dream".

Lol.


----------



## upload420

That, and I keep hearing drivers this, drivers that. I never remember my Titan X suffering from driver issues on release. Nor do I remember having any problems overclocking it. I actually remember going "holy crap, I hit over 1300MHz core with no voltage increase." Also, back to the drivers: what's to say Nvidia can't churn out more performance with drivers as well? Not like they have been pushed to in the last several months. I am not happy about the results of the Fury X in any way. I only buy Nvidia cards, but I need AMD to stay competitive or else my prices never drop and tend to go up. So much for catching a Titan X on sale for cheap. Anyways, here's to hoping the Fury X has some hidden power somewhere, be it OC or driver improvements.


----------



## Deepblue77

I wouldn't be surprised if this is their last high-end GPU release. The Fury X will not sell when it's priced at parity with a 980 Ti.


----------



## Xuper

hmm , Impressive if it's true.

http://techreport.com/r.x/radeon-r9-fury-x/power-load.gif

Edit: Fixed Wrong link


----------



## maarten12100

Quote:


> Originally Posted by *p4inkill3r*
> 
> Except that GCN _has_ matured along with its drivers.


The 7970 pretty much matured to GTX 780 level, but is that AMD's drivers maturing or Nvidia's neglect?








Quote:


> Originally Posted by *zGunBLADEz*
> 
> So: poor overclocks, and even poorer overclock scaling. Good out of the box and that's it...


stock voltage


----------



## Casey Ryback

Quote:


> Originally Posted by *rt123*
> 
> Man did Toms get a golden sample..?
> 
> *Lower power Consumption than 980Ti...?*


That's odd.
Quote:


> Originally Posted by *benbenkr*
> 
> "Overclocker's dream".
> 
> Lol.


Disappointing statement right there.


----------



## RagingCain

I honestly can't recommend this card at its current price; it's not the best choice for the money.


----------



## criminal

Fury X cards popped up at this link for a second:

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&N=8000&Order=BESTMATCH&Description=PPSSPYMKCFJSSG&icid=319818

Anybody still getting one right off the bat might find this link useful.









They are there!


----------



## 364901

Quote:


> Originally Posted by *BigMack70*
> 
> Most reviews are showing that AMD vastly overstated how well this card will overclock... unless a lot of the press just got dud samples, this does not look like a very good overclocker at least with initial silicon.


I'm hearing whispers about a new driver a month or so out from now that will include voltage adjustments. Or at least, other OC software will have that exposed eventually.

I'm wondering about all the weird results in GTA V and Project Cars, and the lower-than-expected tessellation performance. It may look like GCN 1.2 on the surface, but I'm pretty sure it isn't identical to Tonga. ROP counts may also be holding things back a bit, according to TechReport's results at 4K.


----------



## Xtreme21

Quote:


> Originally Posted by *Casey Ryback*
> 
> +1
> 
> Also, the 390X looks pretty amazing in these reviews. Not sure how it can be performing that much better than a 290X.
> 
> Maybe I should just get a 290X, OC it, and be happy with the old price/performance ratio of what is now the midrange.


+1

Something doesn't seem right.


----------



## zalbard

Quote:


> Originally Posted by *p4inkill3r*
> 
> Except that GCN _has_ matured along with its drivers.


So did Kepler and Maxwell.


----------



## DiceAir

So let me get this straight: compared to a reference 980 Ti it's slower, doesn't come with HDMI 2.0, and has no DVI. This is just sad. For my Korean monitor I need DVI, otherwise I need a new monitor, and I'm super happy with my Qnix [email protected]. I thought HBM would make the thing so much faster, but I was expecting memory not to change it that much; you need more GPU power than memory power. I guess it's the first card to come with HBM. I'm still very sad that they don't have dual-link DVI on the card.

My local PC shop has a special pre-order for the 980 Ti HOF air-cooled edition, and I know for a fact that thing will be much better than a reference 980 Ti and will allow for some great overclocking as well. But maybe with some driver tweaks and/or DX12 it might just beat Nvidia in all aspects. We will have to wait and see. I will still wait for DX12 games to come out to see the performance.


----------



## Blameless

Quote:


> Originally Posted by *SKYMTL*
> 
> I did look at the reviews. 4GB isn't a limiting factor.
> *Removes backplate*
> 
> *Complains about temperatures*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The backplate allows for proper dissipation through the PCB of some of that heat.


Is the backplate in physical contact with the VRM area via thermal pads/gap filler? I haven't seen good images yet, so this is an honest question.
Quote:


> Originally Posted by *SKYMTL*
> 
> Not a great situation but the "heat rises" rule is in effect here.


Heat rises in fluids because hot fluids expand and become less dense, thus they float.

Heat doesn't "rise" in solids (solids don't flow); it's conducted in all directions pretty evenly.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Deepblue77*
> 
> I wouldnt be surprised if this is their last high-end GPU release. The Fury X will not sell when it is priced at parity with a 980 ti.


The Fury (air-cooled) will be $50 more than the GTX 980 and faster. That should sell. There are so many things AMD got right with this card and so many things wrong at the same time.


----------



## upload420

Yeah, it needed to overclock well out of the box. It is a high-end GPU aimed at enthusiasts, with a water cooler again aimed at enthusiasts. I would never buy anything water-cooled if it didn't overclock decently out of the box.


----------



## ondoy




----------



## mav451

Quote:


> Originally Posted by *Blameless*
> 
> Fury seems to be ROP and perhaps TMU limited.
> 
> Hawaii and Fiji both have 64 ROPs. In purely fill rate limited scenarios they should perform pretty similarly at similar clocks.
> 
> Everyone focuses on SPs, but fill rate still matters. *There is a reason why GM200 has 96 ROPs*.
> 
> If Fury had the 128 ROPs mentioned in early rumors, it would have a dominating lead. With 64 ROPs, that memory bandwidth from HBM is going to be mostly wasted.


If we can whittle it down to costs, what would it have cost AMD to make Fury X with 96 or 128 ROPs?


----------



## maarten12100

*FX* -> *F*ury *X*
Coincidence?

I think not; this reeks of FX branding.


----------



## rt123

Quote:


> Originally Posted by *maarten12100*
> 
> Apparently it ramps up power consumption, then drops off. The average is therefore not all bad. Are they maybe using a frame limiter?


Unfortunately they don't specify; they just say gaming power consumption.


----------



## tajoh111

As I've said since the beginning of the year, you can't expect the card to scale linearly with components and clock speed.

I.e., you can't just take 45% more shaders, add about 5% more clocks, and get your overall speed the way wccftech or people like Raghu were claiming (on top of the additions from improvements to GCN). We are not getting a 55-65% increase over the 290X; we are getting more along the lines of 40%, and maybe a tad less. The reason is that you can't scale up indefinitely and expect linear scaling; otherwise, manufacturers could stay on the same architecture forever and get basically a 100% increase with every node shrink. Companies have to change architecture to maintain scaling.

AMD is getting to the point where they have too many shaders, and adding more is going to yield way less return on investment in performance. The good news is that the cut-down Fury should perform only about 5-8% slower clock for clock, rather than the 14.6% the shader difference suggests.

From the reviews, at best it is matching the GTX 980 Ti at 4K (within 2 percent), consistently losing to it at 1440p, and in most reviews losing to the GTX 980 Ti overall. Add in the overclocking headroom advantage and it's obvious Nvidia has the faster card.

Although it's kind of a moral victory that AMD has come so close to Nvidia's GTX 980 Ti, it's not a financial one. Nvidia's Titan X is likely cheaper for Nvidia to produce than a Fury X, but AMD has to sell the Fury X at GTX 980 Ti pricing, and this is where they are going to run into trouble. With the public's branding preference for Nvidia, the only reason the Fury X is going to sell out is short stock. I expect price drops on AMD's lineup within a couple of months if supply is sorted out.
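A quick sketch of that non-linear-scaling argument, using the public shader/clock specs; the ~40% observed uplift is the rough figure circulating from launch reviews, not a measurement of mine:

```python
# Naive linear-scaling estimate for Fury X over the 290X, illustrating
# why "more shaders * more clock" overstates real gains. Specs are the
# public ones (4096 vs 2816 shaders, 1050 vs 1000 MHz); the ~40% observed
# uplift is the rough launch-review figure, treated here as an assumption.
shaders_fury, shaders_290x = 4096, 2816
clock_fury, clock_290x = 1050, 1000

naive_gain = (shaders_fury / shaders_290x) * (clock_fury / clock_290x) - 1
observed_gain = 0.40  # approximate launch-review average (assumption)

efficiency = observed_gain / naive_gain
print(f"Naive prediction: +{naive_gain:.0%}, observed: +{observed_gain:.0%}")
print(f"Scaling efficiency: {efficiency:.0%}")
```

On those assumptions the naive estimate predicts a gain in the low 50s of percent, while the cards deliver roughly three quarters of that.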


----------



## TheMentalist

Quote:


> Originally Posted by *maarten12100*
> 
> *FX* -> *F*ury *X*
> Coincidence?
> 
> I think not this reeks of FX branding.


Same story again...


----------



## Clukos

So let me get this straight, it is as powerful as a 980ti in the best case scenario, it overclocks poorly in comparison (maxwell overclocks like a beast), has 2gb less ram than the 980ti and costs as much as a regular 980ti. So at this point the only benefit of getting the Fury X over the 980ti is for the watercooling kit which keeps it cooler. So if you are thinking of overclocking your card you are much better off just grabbing a 980ti with an aftermarket cooling solution on it like the G1.

This should have been priced at least $50-100 less to be competitive. The 980 Ti also comes with features such as ShadowPlay, and it has lower overall power consumption. I don't really understand what AMD is trying to do with this card.


----------



## Rickles

One thing we can all agree on is that the backplate does some serious work.



Whoops, helps to have the right picture


----------



## Vegasus

I think AMD did well considering they went for new tech. That 4 GB of HBM is nice at 4K and about equal to the 6 GB on the 980 Ti. It's not future-proof, but what is? Titan X 3-way SLI? I'm not that crazy.
Can't wait for next gen.


----------



## maltamonk

Brand new tech...check
History of increasing performance after launch....check
History of decreasing prices after launch....check

Seems pretty good to me

Here's hoping that the plain fury will have 3rd party vendor editions at the oh so nice $550ish price point that retain performance.


----------



## Vesku

We can see why AMD waited, looks like they still have more driver work to do to get full use out of Fury X. Glad I don't need to consider a GPU upgrade until Fallout 4.


----------



## bvsbutthd101

Quote:


> Originally Posted by *Rickles*
> 
> One thing we can all agree on is that the backplate does some serious work.


Is that even the backplate? Looks like the underside of the card.


----------



## rt123

Quote:


> Originally Posted by *zalbard*
> 
> So did *Kepler* and Maxwell.


Yup. Dropping the 780Ti down to equal or less than 290 (non-X) level of performance is called "Driver Maturity".


----------



## 1337LutZ

However disappointed I am, I'm impressed by how it scales at high resolutions; it's barely flinching.


----------



## tconroy135

Quote:


> Originally Posted by *upload420*
> 
> that and i keep hearing drivers this drivers that. I never remember my titan x suffering from driver issues on release. Nor do i remember having any problem overclocking it. I actually remember going holy crap i hit over 1300mhz core with no voltage increase. Also back to the drivers what is to say nvidia can't churn out more performance with drivers as well. Not like they have been push to in the last several months. I am not happy about the results of the fury x in anyway. I only buy nvidia cards but i need amd to stay competitive or else my prices never drop and tend to go up. So much for catching a titan x on sale for cheap. Anyways here is to hoping the fury x has some hidden power somewhere if it be oc or driver improvements.


Hell I don't even overclock my Titan and it almost hits 1400 with kboost


----------



## Alatar

Quote:


> Originally Posted by *maarten12100*
> 
> Do we already have submissions of those bench squads that go kingpin style on their cards. Shattering records would do a great deal of image repair to this not so great launch. If it can't do good there then how is this my dream!


The page to follow is this one:

http://hwbot.org/hardware/videocard/radeon_r9_fury_x/
Quote:


> Originally Posted by *bvsbutthd101*
> 
> Is that even the backplate? Looks like the underside of the card.


Yes it's the front side of the card. Backplate isn't visible in that picture.


----------



## Casey Ryback

Quote:


> Originally Posted by *maltamonk*
> 
> Brand new tech...check
> History of increasing performance after launch....check
> History of decreasing prices after launch....check
> 
> Seems pretty good to me
> 
> Here's hoping that the plain fury will have 3rd party vendor editions at the oh so nice $550ish price point that retain performance.


+1

Hopefully those positives shine through, going on recent history they should.

Buying one right now instead of a 980ti is arguably not the best move, then again you'll still have a very nice GPU and be supporting the underdog.


----------



## illusive snpr

Looks like the Fury X almost doubles what the 290X can do in most places... guess it's time to move on from my 280X CrossFire setup, since one card can now beat it.


----------



## ThePath

What a huge disappointment. After all this hype about HBM, this is what we get!!

Fury is the most overhyped GPU, just like Bulldozer was the most overhyped CPU.

Let us hope Zen won't be overhyped as well.


----------



## BigMack70

Quote:


> Originally Posted by *maltamonk*
> 
> Here's hoping that the plain fury will have 3rd party vendor editions at the oh so nice $550ish price point that retain performance.


Yeah... based on these lackluster reviews and the fact that there will be no custom Fury X cards, I think that custom Fury Pro cards are going to be the sweet spot... will probably do just as good a job competing with custom 980 Ti cards and will hold a $100 cheaper price point.
Quote:


> Originally Posted by *ThePath*
> 
> What huge disappointment. After all this hype about HBM, this what we get !!
> 
> Fury is the most overhyped GPU, just like how bulldozer was most overhyped CPU
> 
> Let us hope Zen won't be overhyped as well.


This is not really comparable to faildozer.... remember, Phenom II was *faster* than bulldozer when it launched. If Fury X were slower than the 290X, then we could compare the two. As it is, this is infinitely more of a success for AMD.


----------



## 1337LutZ

Quote:


> Originally Posted by *ThePath*
> 
> What huge disappointment. After all this hype about HBM, this what we get !!
> 
> Fury is the most overhyped GPU, just like how bulldozer was most overhyped CPU
> 
> Let us hope Zen won't be overhyped as well.


Tbh, HBM is showing it can keep up at 4K resolutions even with only 4 GB of memory. Not sure how that is disappointing?


----------



## PureBlackFire

hardocp:
Quote:


> Limited VRAM for a flagship $649 video card, sub-par gaming performance for the price, and limited display support options with no HDMI 2.0 and no DVI port. To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for? The AMD Radeon Fury X is a confusing product, like a technology demo not fully realized, a showcase for HBM only but with no real substance. The AMD Radeon Fury X looks to be a great marketing showcase, but its prowess starts waning when you consider its value to gamers and hardware enthusiasts.


brutal conclusion.


----------



## 1337LutZ

Quote:


> Originally Posted by *PureBlackFire*
> 
> hardocp:
> brutal conclusion.


Haha, guess they won't get any future AMD cards for testing.


----------



## rt123

@TheMentalist Where did the poll go? I was curious about the results.


----------



## joeh4384

Guru3d's imaging doesn't show the same results.

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


----------



## p4inkill3r

Quote:


> Originally Posted by *PureBlackFire*
> 
> hardocp:
> brutal conclusion.


Yep, they don't like it that much.


----------



## darkwizard

Has anybody taken a look at this? It may give a different perspective; they test more games there, and the card seems to be faring better in some of them.

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/


----------



## AmericanLoco

Quote:


> Originally Posted by *PureBlackFire*
> 
> this card looks very inconsistent. in some titles beating or matching the 980ti and in others falling way behind, where this trend doesn't happen to other AMD cards. drivers? BIOS? hardware problems? by my estimate this card should flat out be 100-120% faster than the R9 285/R9 380 and it is in some titles. but what's making this card so slow in others?


It's gotta be drivers. TPU had some instances where it pulled ahead of the TitanX @ 4K. I think the horsepower is there, but the software is letting it down. Hopefully those 15.20 drivers bring some life to it.


----------



## TheMentalist

Quote:


> Originally Posted by *rt123*
> 
> @TheMentalist Where did the poll go.? I was curious about the results.


Poll had to be removed, this is still a news article(polls are not allowed in those).


----------



## michaelius

Quote:


> Originally Posted by *benbenkr*
> 
> "Overclocker's dream".
> 
> Lol.


I never understood why people are taking AMD marketing promises seriously after all those years.


----------



## MadRabbit

Quote:


> Originally Posted by *PureBlackFire*
> 
> hardocp:
> brutal conclusion.


Wow... subpar gaming performance when it mostly sits within 5% of the Ti... Another site to strike off my list. Thanks.


----------



## Alatar

Quote:


> Originally Posted by *rt123*
> 
> @TheMentalist Where did the poll go.? I was curious about the results.


News threads (especially big and important ones like review threads) shouldn't have opinion polls. The other sections of the forum are for that purpose.


----------



## Blameless

Quote:


> Originally Posted by *mav451*
> 
> If we can whittle it down to costs, what would it have cost AMD to make Fury X with 96 or 128 ROPs?


I couldn't really even guess how many transistors this would take, but it would probably be costly.
Quote:


> Originally Posted by *Rickles*
> 
> One thing we can all agree on is that the backplate does some serious work.


Yeah, but is it cool because it's a good heatsink or a good insulator?


----------



## BigMack70

Quote:


> Originally Posted by *MadRabbit*
> 
> Wow...subpar gaming performance and it mostly sits 5% range of the Ti...Another site to strike off my list. Thanks.


HardOCP hasn't done quality GPU review analysis in a long time... their reviews are nice to check for some different data points since they have a somewhat unique testing methodology, but their conclusions are often really silly and frequently contradicted by their own data.


----------



## Rickles

Quote:


> Originally Posted by *bvsbutthd101*
> 
> Is that even the backplate? Looks like the underside of the card.


Yea, linked the wrong one, good catch.


----------



## maarten12100

Quote:


> Originally Posted by *MadRabbit*
> 
> Wow...subpar gaming performance and it mostly sits 5% range of the Ti...Another site to strike off my list. Thanks.


In reality, price and OC performance will make or break this card. For the ordinary bloke, price more than OC performance.


----------



## bvsbutthd101

Quote:


> Originally Posted by *Rickles*
> 
> Yea, linked the wrong one, good catch.


haha, someone posted a link to the page. Runs cool on the backplate side too.


----------



## Casey Ryback

Quote:


> Originally Posted by *PureBlackFire*
> 
> hardocp:
> brutal conclusion.


Yet if it was $50 cheaper, had unlocked voltage right now, and there was proof of it overclocking nicely... there'd be waves of people stepping over each other to get one.
Quote:


> Originally Posted by *michaelius*
> 
> I never understood why people are taking AMD marketing promises seriously after all those years.


AMD has only stated that memory will be locked, so for all we know voltage could be unlocked for these cards very shortly.


----------



## Alatar

Quote:


> Originally Posted by *MadRabbit*
> 
> Wow...subpar gaming performance and it mostly sits 5% range of the Ti...Another site to strike off my list. Thanks.


Well H looked at their own tests where it did seem somewhat underwhelming for the price considering that the competing card sits at the same price point, has 50% more memory and comes in custom versions that are actually quite a bit faster than the reference model.
Quote:


> Originally Posted by *Blameless*
> 
> I couldn't really even guess how many transistors this would take, but it would probably be costly.
> Yeah, but is it cool because it's a good heatsink or a good insulator?


Air is a good insulator and there's air under the front plate


----------



## Xuper

Quote:


> Originally Posted by *ThePath*
> 
> What huge disappointment. After all this hype about HBM, this what we get !!
> 
> Fury is the most overhyped GPU, just like how bulldozer was most overhyped CPU
> 
> Let us hope Zen won't be overhyped as well.


What do you mean by huge?


----------



## p4inkill3r

Quote:


> Originally Posted by *michaelius*
> 
> I never understood why people are taking AMD marketing promises seriously after all those years.


You realize that there aren't any tools in the wild yet to unlock voltage, yes?


----------



## sugalumps

A few people in here are eating crow; won't name any names.


----------



## $ilent

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fury (air cooled) will be $50 more than the GTX 980 and be faster. That should sell. There are so many things AMD got wrong with this card, and so many things wrong at the same time.


To me this is the same story as the 290X release. Some half-decent reviews put it on par with NV, but it was priced high with poor driver support. Then the 290 came out at a much lower price point and it was good value for money. The Fury will probably be the same.

Thankfully I'm not going to be stung this time around, since I have no intention of picking up a Fury X.


----------



## joeh4384

Is Fiji made at GF or TSMC? Also, what kind of double-precision (DP) performance does it have? I am curious how HBM works in professional applications; Fiji could make some good inroads there, since Maxwell does not do DP.


----------



## forthedisplay

Yeah, dunno.

It's on par, with a smaller size, but only 4GB VRAM and no HDMI 2.0.

I can even buy a GTX 980 Ti for a little bit cheaper in the EU atm. They're both still >700€, though, so it's a no-go for me at this point. I'll keep the R9 290 for now; these cards are just too expensive for me to justify buying in the summer.

It comes down to external factors, like FreeSync/G-Sync... in most cases it'll be hard to tell the cards apart.


----------



## Blameless

Quote:


> Originally Posted by *AmericanLoco*
> 
> It's gotta be drivers.


Possibly, but I think some of these scenarios may simply be reflective of a fill rate limitation.

ROP count hasn't increased and thus raw pixel fill rate is the same as a 290(X) or 390(X).
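The fill-rate point can be made concrete with a rough sketch (assuming the reference specs: 64 ROPs on both Hawaii and Fiji, at 1000 MHz and 1050 MHz respectively):

```python
# Rough pixel fill rate comparison: fill rate = ROPs * core clock.
# Reference specs assumed: both Hawaii (290X) and Fiji (Fury X) carry 64 ROPs.

def pixel_fill_gpix_s(rops, clock_mhz):
    """Peak pixel fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

hawaii = pixel_fill_gpix_s(64, 1000)  # R9 290X: 64.0 Gpix/s
fiji   = pixel_fill_gpix_s(64, 1050)  # Fury X:  67.2 Gpix/s

print(f"290X: {hawaii:.1f} Gpix/s, Fury X: {fiji:.1f} Gpix/s "
      f"(+{fiji / hawaii - 1:.0%})")
```

So by this estimate the Fury X's raw pixel fill rate is only ~5% higher than Hawaii's, entirely from the clock bump, which would explain ROP-bound scenarios behaving like a 290(X).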
Quote:


> Originally Posted by *Alatar*
> 
> Air is a good insulator and there's air under the front plate


Guru3D's backplate image is also only ~20C over ambient.


----------



## Rickles

Quote:


> Originally Posted by *Xuper*
> 
> What do you mean by Huge ?


He probably means that it doesn't meet his expectations.

I was fairly certain it would land where it did, though I didn't expect it to lose to a 780 Ti in Project CARS. It is a great card for SFF builds that can't fit a 980 Ti; outside of that, I wouldn't recommend it over a 980 Ti.

Quote:


> Originally Posted by *p4inkill3r*
> 
> You realize that there aren't any tools in the wild yet to unlock voltage, yes?


We also don't know how HBM will interact with added voltage to the core, there could be a reason that voltage is currently locked... though this is probably an overly pessimistic viewpoint.


----------



## tconroy135

Quote:


> Originally Posted by *ThePath*
> 
> What huge disappointment. After all this hype about HBM, this what we get !!
> 
> Fury is the most overhyped GPU, just like how bulldozer was most overhyped CPU
> 
> Let us hope Zen won't be overhyped as well.


AMD having to use their next-gen tech to compete with Nvidia's current gen, kinda sad...


----------



## iLeakStuff

Why is the noise level the same as the 980 Ti's? I get that the radiator is outside the case while the 980 Ti's cooler is inside, but I had hoped that water cooling would bring lower noise. Lower temperatures are great and all, but 60°C vs 80°C means little since both are acceptable. :/


----------



## ZealotKi11er

So the only way for this card to be a success was to beat the three-week-old GTX 980 Ti; a close race is a failure. Last time I checked, when the HD 4870 got GDDR5 it was not faster than Nvidia's GDDR3 flagship, nor was the GDDR3 HD 4850 that much slower. If you just look at this card's GPU alone, it is not that impressive. The R9 290X vs the HD 7970 was a huge bump, from 32 ROPs to 64 ROPs, and GCN 1.0 to 1.1 spanned two years. GCN 1.2 has been out since Tonga, so nothing new and impressive here. I think AMD put too much faith in HBM1. Maybe Fury X is just a stepping stone for them.


----------



## joeh4384

Quote:


> Originally Posted by *ThePath*
> 
> What huge disappointment. After all this hype about HBM, this what we get !!
> 
> Fury is the most overhyped GPU, just like how bulldozer was most overhyped CPU
> 
> Let us hope Zen won't be overhyped as well.


I still think it is a solid product. The internet hyped it up too much. It is pretty close to the 980ti in roughly the same power envelope. I say it is a good product but not the home run AMD really needed with their current market share.


----------



## Pantsu

Disappointing oc results, but I suppose it was to be expected out of GCN. I was kinda hoping against hope that AMD would've done more to improve the core besides HBM, but it doesn't seem to be the case.

They do need to price this lower than 980 Ti which isn't good for competition. I'd really like to get a faster card for my 4K monitor, but the 980 Ti just didn't feel good enough, and now Fury is even more disappointing. Looks like I'll just wait for 16 nm... *sigh*


----------



## Alatar

Quote:


> Originally Posted by *Blameless*
> 
> Guru3D's backplate image is also only ~20C over ambient.


So does anyone know if the backplate makes any sort of thermal connection to any of the hot parts? Are there thermal pads on the backside of the VRM or the GPU package?


----------



## edo101

Quote:


> Originally Posted by *p4inkill3r*
> 
> I'll be ordering one today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel that it will only get more competitive as driver maturation sets in and being within 8-10% of nvidia's hottest card is good enough for me.


Yep. I would too but I am not in the market....unless of course that sweet sexy Nano just tugs at me too much. But yeah I would, if nothing else just to support AMD. People want better performance but nobody buys their cards, and you wonder why they continue to struggle

Card will get better, and we'll get around to proper overclocks


----------



## ryanrenolds08

Hmmm. I was hoping for something a hair faster than a 980 Ti, so I might have had something I could sink my teeth into. Been wanting to upgrade for a while now... guess the "old" 780 Ti OC will easily keep me going until next year.


----------



## Cool Mike

Newegg has sold out. I came close to purchasing one of the last remaining Fury Xs. After reading a few reviews, the 980 Ti is definitely superior. I will stick with the 980 Ti.


----------



## MadRabbit

Quote:


> Originally Posted by *Alatar*
> 
> Well H looked at their own tests where it did seem somewhat underwhelming for the price considering that the competing card sits at the same price point, has 50% more memory and comes in custom versions that are actually quite a bit faster than the reference model.
> Air is a good insulator and there's air under the front plate


Whatever fits their agenda, I guess.

It's what people want to see: when AMD wasn't as good at 4K, people complained; now that it is, 4K magically doesn't matter anymore. It's around 5% slower than the Titan X while being around $300 cheaper. People whine no matter what. Like it's been said before, AMD can't win no matter what.


----------



## Rickles

Quote:


> Originally Posted by *Alatar*
> 
> So does anyone know if the backplate makes any sort of thermal connection to any of the hot parts? Are there thermal pads on the backside of the VRM or the GPU package?


Well, I'd be willing to bet that 8 screws aren't going to dissipate ~50c of heat. I'd guess that thing has a big fat thermal pad on the back.


----------



## 970Rules

Reviews highlights:

"In terms of performance there wasn't any game where the AMD Radeon R9 Fury X was faster than the GeForce GTX 980 Ti. "

"Limited VRAM for a flagship $649 video card, sub-par gaming performance for the price, and limited display support options with no HDMI 2.0 and no DVI port. To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for? "

"Overclocking potential of the card is slim, and memory overclocking has been disabled completely. "
" The GM200 GPU on GeForce GTX 980 Ti and Titan X overclocks much better, which means that with both cards overclocked to the max, GTX 980 Ti will have a large performance lead over an overclocked Fury X."

"AMD's pricing of $650 for the Fury X matches that of GeForce GTX 980 Ti exactly. I'm not fully convinced that Fury X can win at this price level. "

" it struggles to keep its head above water in Battlefield 4, Shadow of Mordor, Dying Light and GTA V. In some cases it loses quite spectacularly. "

"Overclocking headroom wasn't anything to write home about either as there seems to be just 8-10% of additional core speed headroom and we've heard of some samples hitting a wall at a paltry 5%."

"you'll find that the Fury X struggles to live up to its considerable potential. Unfortunate slowdowns in games like The Witcher 3 and Far Cry 4 drag the Fury X's overall score below that of the less expensive GeForce GTX 980. What's important to note in this context is that these scores aren't just numbers. They mean that you'll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X."

"GPU voltage is going to be trivial here though as we could only mildly tweak the GPU to 1125 MHz (+75 MHz), after that it started to become unstable. "

"With these settings, we were only able to add 80MHz to the clock speed for a final reading of 1,130MHz before we ran into stability issues. This equates to just under 8 percent; not that impressive if we're being honest. There's also no ability to overclock the memory. "
" The 980 Ti can increase performance by a good 20 percent overclocked"

"If you're gaming at 1080p or 1440p, the GTX 980 Ti offers better value for money. The two cards have price parity but Nvidia's is faster and significantly better at overclocking, too. "

"Nvidia's pre-emptive launch of the GeForce GTX 980 Ti uses a more efficient core and regular GDDR5 memory to achieve benchmark performances that are, in our opinion, a little better than the latest Radeon's, perhaps helped in small part by having a larger framebuffer. Partner GTX 980 Ti's are faster still and overclock better than Fury X."

Hi guys, with 650 bucks burning a hole in your pocket, go buy a 980 Ti and overclock it to 1400-1500 MHz on the core, and you can OC the memory too!
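The headroom percentages quoted in those snippets check out against the reference clock (a small sketch; the 1050 MHz base is the Fury X's published reference spec, and 1125/1130 MHz are the OC results the reviews above report):

```python
# Sanity check on the overclocking-headroom figures quoted above.

BASE_MHZ = 1050  # Fury X reference core clock

def headroom_pct(oc_mhz, base_mhz=BASE_MHZ):
    """Overclock headroom as a percentage of the base clock."""
    return (oc_mhz - base_mhz) / base_mhz * 100

# +75 MHz and +80 MHz results from the reviews quoted in this post
for final in (1125, 1130):
    print(f"{final} MHz => +{headroom_pct(final):.1f}% over stock")
```

Both land in the 7-8% range, matching the "just under 8 percent" wording, versus the ~20% the same reviews report for an overclocked 980 Ti.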


----------



## Forceman

Quote:


> Originally Posted by *AmericanLoco*
> 
> It's gotta be drivers. TPU had some instances where it pulled ahead of the TitanX @ 4K. I think the horsepower is there, but the software is letting it down. Hopefully those 15.20 drivers bring some life to it.


At some point though, that stops becoming an excuse and starts becoming a problem. How do they not have drivers ready for their flagship release?


----------



## Exilon

Quote:


> Originally Posted by *joeh4384*
> 
> I still think it is a solid product. The internet hyped it up too much. It is pretty close to the 980ti in roughly the same power envelope. I say it is a good product but not the home run AMD really needed with their current market share.


Lol don't blame the internet for this. Blame AMD's shill group RED TEAM PLUS and their feet dragging ever since Computex.


----------



## Ha-Nocri

AMD needs to greatly improve GCN before Pascal comes out. They compete against NV now only because HBM frees up power headroom for the GPU itself.


----------



## cstkl1

Put Nvidia aside... where's the 50% faster than the 290X?
In the presentation, that was the baseline GPU comparison.


----------



## Blameless

Quote:


> Originally Posted by *Alatar*
> 
> So does anyone know if the backplate makes any sort of thermal connection to any of the hot parts? Are there thermal pads on the backside of the VRM or the GPU package?


This is what I want to know.

I haven't seen any good disassembly images yet.


----------



## Pentdragon

It's just a bad deal for the price, and also a huge disappointment considering the hype. They had water cooling; idk why they didn't clock it 150 or 200 MHz higher.


----------



## jmcosta

Quote:


> Originally Posted by *Blameless*
> 
> Yeah, but is it cool because it's a good heatsink or a good insulator?


they removed the backplate in the other review


----------



## sugalumps

Same noise levels despite having an awkward AIO.









The reason you put up with an AIO is the better noise/temps. Same power draw in benches, 2 GB less VRAM, and late to the party while performing about the same if not worse. Was hoping for a lot better; Nvidia must have known the performance averages and timed and priced that 980 Ti to perfection.

Considering the performance is worse and the VRAM is already limiting (and won't stretch as far), why would you invest in this card? You would have to be seriously deep into one camp to throw all the competitor's advantages out the window. It will be great for small form factor builds, but you may as well wait for the Nano. Then let's not forget you will be gimped in a good number of games by Nvidia and their tactics.


----------



## Olivon

Quote:


> Originally Posted by *SKYMTL*
> 
> I did look at the reviews. 4GB isn't a limiting factor.
> *Removes backplate*
> 
> *Complains about temperatures*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The backplate allows for proper dissipation through the PCB of some of that heat. Not a great situation but the "heat rises" rule is in effect here.


According to Damien Triolet, the backplate serves only an aesthetic purpose and is not used for better cooling (like almost all backplates, FYI):
Quote:


> This backplate is not participating in the cooling, the temperature is probably a little higher when it is in place.


----------



## AmericanLoco

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So the only way for this card to be a success was to beat the 3 weeks old GTX980 Ti. A close race is a failure. Last time I checked when HD 4870 got GDDR5 it was not faster then Nvidia's GDDR3 flagship nor was HD 4850 with GDDR3 that much slower. I think if you just look at this card [GPU] only it is not that impressive. R9 290X vs HD 7970 was a huge bump from 32 ROPs to 64 ROPs. Also GCN 1.0 To 1.1 was a 2 years difference. GCN 1.2 has been out since Tonga so nothing new and impressive. I think AMD put too much faith on HBM1. Maybe Fury X is just a step for them.


The HD 4870 was successful because it was a small, power-efficient core that performed nearly as well as Nvidia's flagship while significantly undercutting it on price. The R9 Fury X right now performs about equal to or slightly slower than Nvidia's 980 Ti, yet costs the same and is months late.

With the expense of HBM and how late Fury is, I don't know if AMD can actually afford to drop its prices.


----------



## ebduncan

I don't get why people are posting such negative reviews. The Hardocp review is just crazy.

"Limited VRAM for a flagship $649 video card, sub-par gaming performance for the price, and limited display support options with no HDMI 2.0 and no DVI port. To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for? The AMD Radeon Fury X is a confusing product, like a technology demo not fully realized, a showcase for HBM only but with no real substance. The AMD Radeon Fury X looks to be a great marketing showcase, but its prowess starts waning when you consider its value to gamers and hardware enthusiasts."

It's almost as fast as the 980 Ti, has better cooling, is smaller, and uses around the same power. It will overclock well once voltage control is released. Makes you wonder if DX12 will change some of its fortunes (performance relative to the 980 Ti).

Oh well, I like the Fury X; I think it's decent value compared to the 980 Ti, especially for small form factor builds.


----------



## bossie2000

Wow, I'm impressed. Good work, AMD. Didn't think it would beat the Titan X in some games.


----------



## Blameless

Quote:


> Originally Posted by *Rickles*
> 
> Well, I'd be willing to bet that 8 screws aren't going to dissipate ~50c of heat. I'd guess that thing has a big fat thermal pad on the back.


50C isn't a measure of heat, just temperature.

A VRM doesn't need to dissipate much heat, but with just the PCB as a heatsink it can still get extremely hot.

Anyway, if there isn't a thermal pad between the metal of the backplate and the VRM, then the VRM could well be running at 110C while the backplate is cold.
Quote:


> Originally Posted by *jmcosta*
> 
> they removed the backplate in the other review


Yes, I saw this, but I never saw any good pictures of the underside of the backplate.

The backplate is either doing wonders for temps, or it's hurting temps.

It would be nice to know which.


----------



## VSG

Isn't GPU-z detecting VRM temps here? Why is everyone doing thermal pics?


----------



## Tivan

Reminder that OC3D has supposedly been using 15.5 drivers, saying so in the video.


----------



## Alatar

Quote:


> Originally Posted by *MadRabbit*
> 
> What ever fits their agenda I guess.
> 
> It's what people want to see, when AMD wasn't as good on 4K people complained, now that it is 4K doesn't magically matter anymore. It's around 5% slower than the Titan X while being around 300 cheaper. People whine no matter what. Like it's said before, AMD can't win no matter what.


4K is clearly more important than ever looking at these reviews and I don't really remember a time when AMD was worse at 4K while everyone considered 4K important. If anything 4K started being considered important during the 290X/290 launch and 4K was the strongest part for those cards.

If anything 4K is given a disproportionate amount of attention when even most of the OCN-style people are on 1440p etc.


----------



## SpeedyVT

Quote:


> Originally Posted by *Kane2207*
> 
> I don't get the OC3D review. I've looked through most of the benches and looks like it's being kicked around by a 980 in some scenarios, yet they sing it's praises at the end giving it a gold award?
> 
> Maybe I just expected more, I dunno...


Also check its frame variance; it's not always about the highest frame rate but the most consistent one. Something Nvidia has been messing up as of late.


----------



## Casey Ryback

Quote:


> Originally Posted by *tconroy135*
> 
> AMD going to their next-gen tech to compete with nVidia current gen, kinda sad...


You obviously don't know anything about HBM, your comment is void


----------



## Shadymort

Seems to me the Fury X is exactly where it was rumored to be: in the same market bracket as a gtx 980ti , winning some benchmarks and losing in others, and performing better at 4K than 1440p and lower compared to the other cards. So it's hard to be disappointed at this point. It may not be the hyped monster some people wanted it to be, but it's a quite good card performing in the ballpark of the gtx 980ti and priced accordingly.

I concur with some reviewers and users on the forum that the performance consistency is pretty odd. Some reviews note that for older titles (Metro 2033, etc), which are more optimized on the driver side on both nvidia and amd cards, the performance of the Fury X is better and more consistent, often nearing the R9 295X2 level; but in newer titles, which are less optimized, the performance is definitely odd, as even the gtx 980 seems to do better than the Fury X (which is strange, given the near R9 390X performance). While i don't expect miraculous drivers, i am pretty sure some of those results are caused by drivers not mature enough, rather than other causes.

All those benchmarks made me more curious about the Fury Nano performance. I expect a product witch the same or better performance as the gtx 980, and given the information we have on the Fury Dual, i bet the Nano chip will be the basis of this dual card. Once we know its performance, we can definitely estimate how far the dual card of this generation can push the performance limit








P.S.: One thing is for sure, the R9 295X2 is definitely one monster of a card.


----------



## Vesku

Quote:


> Originally Posted by *Forceman*
> 
> At some point though, that stops becoming an excuse and starts becoming a problem. How do they not have drivers ready for their flagship release?


Having to rewrite low level driver code to take into account an entirely new type of memory.


----------



## AmericanLoco

Quote:


> Originally Posted by *Forceman*
> 
> At some point though, that stops becoming an excuse and starts becoming a problem. How do they not have drivers ready for their flagship release?


I'm not excusing them, I think it's pretty ridiculous honestly, especially given how late this thing is. However, the recent beta driver showed significant performance increases, and the upcoming 15.20 driver is supposed to deliver even more.

There are instances where this card just barely edges out an R9 290x, which doesn't make any sense at all. AMD really just needs to throw out their drivers and start over.


----------



## maarten12100

Quote:


> Originally Posted by *SpeedyVT*
> 
> Also check it's frame variance, it's not always about the highest frame rate but the most consistent.


That paints an even worse picture though


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> You obviously don't know anything about HBM, your comment is void


But he is a good soldier, bleeding on the battlefield for the glory of Nvidia. That, or he really is the American Psycho.


----------



## Casey Ryback

Quote:


> Originally Posted by *iLeakStuff*
> 
> Why is the noise level the same as 980Ti? I get that the radiator is on the outside while 980Ti is on the inside of the desktop, but I had hope that water cooling would bring lower noise. Lower temperatures is great and all but 60C or 80C means little since they are both acceptable. :/


In the review I read..........

The peak for the Fury was 32 dBA,

and the peak for the 980 Ti was 40 dBA.


----------



## TheMentalist

Funny how the R9-295X2 is still on top...


----------



## criznit

I just realized that the majority of the reviews are done on the 15.5 driver and not the 15.15 driver that was made specifically for the fury x. I wonder if that will make much of a difference?


----------



## dkizzy

So what is the solution? Increase the HBM memory clock speed? Improve drivers? The gap doesn't seem daunting here. I would imagine the 15.15 driver will help.


----------



## Alatar

Quote:


> Originally Posted by *Blameless*
> 
> This is what I want to know.
> 
> I haven't seen any good disassembly images yet.


TPU has images:



No thermal pads on the backplate. Purely for looks and protection.


----------



## rt123

Quote:


> Originally Posted by *edo101*
> 
> American Psycho


That is a very good movie.


----------



## kingduqc

http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/30.html

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html

A reference 980 Ti is faster than a Fury X by 9% at 1440p, and a custom G1 980 Ti is faster than a reference 980 Ti by 15%. The 980 Ti also overclocks more. R.I.P AMD?
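As a rough sanity check on how those two gaps compound (my own arithmetic on the percentages quoted above, not additional review data):

```python
# Compounding the two deltas: reference 980 Ti ~9% ahead of the Fury X
# at 1440p, and a custom G1 ~15% ahead of the reference 980 Ti.
# Percentage leads multiply, they don't add.
ref_vs_fury = 1.09      # reference 980 Ti vs Fury X
g1_vs_ref = 1.15        # G1 980 Ti vs reference 980 Ti

g1_vs_fury = ref_vs_fury * g1_vs_ref
print(f"G1 980 Ti vs Fury X: ~{(g1_vs_fury - 1) * 100:.0f}% faster")
```

So by these two reviews the custom G1 would sit roughly a quarter ahead of a stock Fury X, before either card is overclocked.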


----------



## edo101

Quote:


> Originally Posted by *criznit*
> 
> I just realized that the majority of the reviews are done on the 15.5 driver and not the 15.15 driver that was made specifically for the fury x. I wonder if that will make much of a difference?


Gonna have to check. Wouldn't be surprised; I don't think these reviewers know enough to find the actual drivers, especially with AMD switching them around between OSes. As for me, I'm not disappointed. Excited, really, to see how well the card gets juiced up by driver and OC tool upgrades.

Just hope more people buy so we can continue to get some competition


----------



## Blameless

Quote:


> Originally Posted by *Vesku*
> 
> Having to rewrite low level driver code to take into account an entirely new type of memory.


The driver doesn't need to take into account the type of memory, just how much of it there is and its topology.

Regardless, this is something they should have been working on for the last 9-12 months.
Quote:


> Originally Posted by *AmericanLoco*
> 
> There are instances where this card just barely edges out an R9 290x, which doesn't make any sense at all.


It makes perfect sense, if these are fill rate limited scenarios, as the Fury X has barely more pixel fill rate than the 290X.
Quote:


> Originally Posted by *maarten12100*
> 
> That paints an even worse picture though


Indeed.


----------



## Noufel

If only the Fury was at a $550 price tag


----------



## Serandur

Quote:


> Originally Posted by *Forceman*
> 
> At some point though, that stops becoming an excuse and starts becoming a problem. How do they not have drivers ready for their flagship release?


AMD's chronic lateness is a huge issue.

Past 10 months: Just wait for Fiji. It's coming... sometime.

3 weeks ago: Just wait a little longer for the announcement. Coming at E3

E3: Just wait another week for the release date and the NDA to lift. Wait four more for the plain Fury.

Now: Just wait for drivers to mature.

All of the delays were justified with making sure the product is perfect and ready, and that still doesn't appear to be the case.


----------



## BoredErica

Quote:


> Originally Posted by *Blameless*
> 
> Driver doesn't need to take into account the type of memory, just how much of it there is and it's topology.
> 
> Regardless, this is something they should have been working on for the last 9-12 months.
> It makes perfect sense, if these are fill rate limited scenarios, as the Fury X has barely more pixel fill rate than the 290X.
> Indeed.


When is pixel fill rate important?


----------



## Blameless

Quote:


> Originally Posted by *Alatar*
> 
> TPU has images:
> 
> 
> 
> No thermal pads on the backplate. Purely for looks and protection.


Thanks for the image.

Looks like VRM cooling is just crap then, barring anything defective about the sample showing the 104C temp.

At least I have my own thermal pads to use, should I get a discount Fury X.


----------



## SpeedyVT

My understanding of the situation is that when Windows 10 hits and there are DX12 games to play, the Fury card is going to blow some socks off. Bear in mind that AMD GPUs lose 20-30% to their NVidia counterparts in DX11.


----------



## sugalumps

Quote:


> Originally Posted by *kingduqc*
> 
> http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/30.html
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
> 
> A reference 980ti is faster then a fury x by 9% in 1440p and a custom g1 980ti is faster the a reference 980ti by 15%. The 980ti also overclock more. R.I.P AMD?


The G1 really is the best card around; anything beyond that and you are going to see very minimal gains (HOF / Classy etc). Think I am going to have to sell my 980 and spare 780 and grab a G1 980 Ti.

Unless AMD brings out a wonder driver, though it's never "right now" with AMD. Especially in here: first it was "they're going to destroy Nvidia", now it's "wait on drivers", then it's "eh, they might be better at DX12!!! in the future". ^ Speak of the devil, just as I write this there is a DX12(tm) post.


----------



## Vesku

Quote:


> Originally Posted by *maarten12100*
> 
> That paints an even worse picture though


Guru3D throws around "can't perform any better really" and "almost perfect" in pretty much every FCAT run they did. Granted, FCAT is apparently limited in resolution, but that's on Nvidia for not updating it once AMD started doing better in "smoothness".


----------



## TheMentalist

Quote:


> Originally Posted by *Noufel*
> 
> If only the Fury was at a $550 price tag


Just wait for it, it's bound to happen...


----------



## criznit

I just double checked, and [H] is using the new driver. So unless this card drops by $50 OR the drivers in Win10 are just great, I can't see myself getting this.


----------



## Kedas

Not bad performance against the 980 Ti, but the price should've been a bit lower, even if just $600 instead of $650, and they're still not able to OC.


----------



## Ganf

Only thing that could save it for me is VR performance, but no one does VR benchmarks, so I have the choice of buying the Fury and risking the hype behind its VR performance being just as average as the 980 Ti's, or picking up 980 Ti Lightnings, having amazeballs OCing, and being able to brute force through any VR disadvantages they may have.

I really need to see frame timing comparisons (Tom's Hardware's comparison is so smashed together that you can't tell anything, what the hell is up with that?) at high framerates to be able to determine if there's anything on the VR side due to HBM, even if they aren't VR tests.


----------



## Alatar

Quote:


> Originally Posted by *Noufel*
> 
> If only the Fury was at a $550 price tag


Yeah this definitely isn't the "let's start a price war to gain some marketshare" strategy that some people predicted.

In fact personally I'd argue that 980Tis are better price/perf due to custom models and 50% extra VRAM.
Quote:


> Originally Posted by *Vesku*
> 
> Guru3D throws around "can't perform any better really" and "almost perfect" in pretty much every FCAT run they did. Granted, FCAT is apparently limited in resolution, but that's on Nvidia for not updating it once AMD started doing better in "smoothness".


PCPer has mixed results for frame times on the fury. GTA V being really bad with spikes (possible vram limitations?)

Either way measuring frame times hasn't required Nvidia software in years. They offered the first overlays, but aren't the only ones doing so anymore.


----------



## SpeedyVT

Quote:


> Originally Posted by *criznit*
> 
> I just double checked, and [H] is using the new driver. So unless this card drops by $50 OR the drivers in Win10 are just great, I can't see myself getting this.


Windows 10 drivers are great; unfortunately that won't fix older games. Project CARS running in Windows 10 gains literally 30-40% more fps, nearly doubling in some cases. I'm not saying Windows 10 is the magic bullet, but nearly every problem with AMD's cards is not the card itself but either developers' choices to support AMD or NVidia, or AMD's poor DX11 support.

Hell, even the 290 has more physical power than the 980, spec for spec. This isn't an arch thing; it's completely down to bad DX11 support in older Windows.


----------



## texni

I made my choice... just ordered an Inno3D 980 Ti


----------



## Pawelr98

I think I will still purchase this card.

I still remember when the HD7970 came out. At the beginning the card was slower than my HD6990. After a few months, thanks to driver updates, the performance increased and the card bypassed my HD6990 easily.


----------



## rt123

NVM.


----------



## Ashura

Forbes - Radeon R9 Fury X Review
http://www.forbes.com/sites/jasonevangelho/2015/06/24/amd-radeon-r9-fury-x-review-amd-at-their-best/
Quote:


> I recently had a conversation with AMD CEO Dr. Lisa Su, and I asked what her primary goal was when she first assumed leadership of the company. "It's to clarify our mission," she said confidently. "AMD has often been defined by other people. We're defined by much larger competitors. My goal is to say 'this is what we're going to do.' With the technology we have, it's about how we differentiate ourselves in the market. The idea of being the second guy and clawing for a little bit of share isn't what you build a sustainable company around."


----------



## provost

Quote:


> Originally Posted by *criminal*
> 
> Fury X cards popped up at this link for a second:
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&N=8000&Order=BESTMATCH&Description=PPSSPYMKCFJSSG&icid=319818
> 
> Anybody still getting one right off the bat might find this link useful. :thumb
> 
> They are there!


Thanks. I put auto notify on the Sapphire. Not sure why Asus and some others (mfg I can recognize.. lol) are more than the msrp.


----------



## ebduncan

why are people saying it has no overclock room?

the reviewers cannot adjust the voltage yet, so of course there isn't going to be any real gains in the overclocking department until voltage control is out.

I'd fully expect them to reach 1300mhz or so with voltage control.


----------



## Cakewalk_S

I think the big improvement for this card is the HBM... its pretty sweet new tech. Implement that with a smaller tech <20nm and we should have some pretty good performance gains. For the first generation of HBM it's not bad. Wonder what the air cooling cards will end up looking like...


----------



## Serandur

Quote:


> Originally Posted by *Pawelr98*
> 
> I think I will still purchase this card.
> 
> I still remember when the HD7970 came out. At the beginning the card was slower than my HD6990. After a few months, thanks to driver updates, the performance increased and the card bypassed my HD6990 easily.


Not to dissuade you from doing what you want, but just as a warning, the 7970 featured a brand new architecture whereas the Fury X is using that same architecture modified. Driver gains like those on the 7970 are unlikely for the Fury X.


----------



## BinaryDemon

It's just a huge shame that AMD didn't do a limited run of an 8GB Fury X for reviews, even if they wouldn't have enough production units to sell until 3 months from now.


----------



## undeadhunter

Quote:


> Originally Posted by *Cool Mike*
> 
> *Newegg has sold out.* I came close to purchasng one of the last remaining Fury's. After reading a few reviews the 980Ti is definitely superior. I will stick with the 980Ti.


That's the sad part of it... people bad-mouth it and are sooo disappointed, yet meanwhile it's sold out... LOL. That's why companies still pull this crap off; people will buy it regardless of fails like no HDMI 2.0, less RAM than the competition, and being late. Hype attracts the blind sheep.


----------



## Blameless

Quote:


> Originally Posted by *Darkwizzie*
> 
> When is pixel fill rate important?


Resolution and MSAA or SSAA are the main things that will depend on raw pixel fill rate, but it's nearly always important. The question is whether or not something else is the bottleneck at the time.

Pixel fill rate used to be the prime factor in video card performance, to the extent it was the only spec you needed to know. However, over the last decade, shader performance has come to dominate.

Still, pixel fill rate can be a hard limit in some scenarios. At the very least you cannot have a frame rate higher than total pixel fill rate divided by the total number of pixels displayed each frame, but any non-shader based AA method, lack of good occlusion culling, most forms of transparency, etc, can still eat into available pixel fill rate.

Might be hard to tell the difference between a pixel fill rate limitation and a memory capacity limitation as almost everything that accentuates the former tends to need more VRAM as well.
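To make that frame-rate ceiling concrete, here is a back-of-the-envelope sketch. The ROP count and core clock are the Fury X's published specs; the overdraw factor is purely an assumed number for illustration:

```python
# Hard frame-rate ceiling from pixel fill rate, as described above:
# max_fps <= fill_rate / (pixels per frame * average writes per pixel).
rops = 64                       # Fury X render back-ends (published spec)
core_mhz = 1050                 # Fury X core clock (published spec)
fill_rate = rops * core_mhz * 1e6      # ~67.2 Gpixels/s theoretical

pixels_4k = 3840 * 2160                # pixels displayed per frame at 4K
overdraw = 4.0                         # assumed average writes per pixel

max_fps = fill_rate / (pixels_4k * overdraw)
print(f"~{max_fps:.0f} fps ceiling at 4K with {overdraw:.0f}x overdraw")
```

The raw ceiling is high, which is why fill rate only becomes the bottleneck once MSAA, transparency, and poor occlusion culling multiply the effective writes per displayed pixel well beyond that assumed factor.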


----------



## maltamonk

Quote:


> Originally Posted by *Blameless*
> 
> Thanks for the image.
> 
> Looks like VRM cooling is just crap then, barring anything defective about the sample showing the 104C temp.


Uhmm.....can you elaborate on that please?


----------



## Vesku

Quote:


> Originally Posted by *Alatar*
> 
> Yeah this definitely isn't the "let's start a price war to gain some marketshare" strategy that some people predicted.
> 
> In fact personally I'd argue that 980Tis are better price/perf due to custom models and 50% extra VRAM.
> PCPer has mixed results for frame times on the fury. GTA V being really bad with spikes (possible vram limitations?)
> 
> Either way measuring frame times hasn't required Nvidia software in years. They offered the first overlays, but aren't the only ones doing so anymore.


Does anything measure it at the physical output level like Nvidia's FCAT though? Fraps had frametime capture well before FCAT but it's done at the driver level and so is actually a bit "fuzzy" compared to the actual video output.


----------



## Serandur

Quote:


> Originally Posted by *BinaryDemon*
> 
> It's just a huge shame that AMD didn't do a limited run of an 8GB Fury X for reviews, even if they wouldn't have enough production units to sell until 3 months from now.


Someone brought up a good point as to why 8 GB Fury Xs might not even be possible. I'm not sure if it's the truth, but the claim is that the interposer is already about as large as it can get and they simply can't fit 8 stacks on it.


----------



## Alatar

Quote:


> Originally Posted by *ebduncan*
> 
> why are people saying it has no overclock room?
> 
> the reviewers cannot adjust the voltage yet, so of course there isn't going to be any real gains in the overclocking department until voltage control is out.
> 
> I'd fully expect them to reach 1300mhz or so with voltage control.


Hawaii fell barely short of Tahiti's max frequencies, so I'd expect Fiji to fall barely short of Hawaii's max frequencies.

That's unless HBM or the interposer is limiting core clocks somehow.


----------



## CasualCat

Quote:


> Originally Posted by *Ashura*
> 
> Forbes - Radeon R9 Fury X Review
> http://www.forbes.com/sites/jasonevangelho/2015/06/24/amd-radeon-r9-fury-x-review-amd-at-their-best/


Kind of weird seeing a GPU review on Forbes. Not the place I would have expected to find one.


----------



## edo101

Quote:


> Originally Posted by *undeadhunter*
> 
> That's the sad part of it... people bad-mouth it and are sooo disappointed, yet meanwhile it's sold out... LOL. That's why companies still pull this crap off; people will buy it regardless of fails like no HDMI 2.0, less RAM than the competition, and being late. Hype attracts the blind sheep.


Or, you know, they don't need HDMI 2.0 or 4K, and they are fine with the performance they get, granted there will be better drivers and OC tools later. But hey, everybody has to think like you, right?


----------



## p4inkill3r

Quote:


> Originally Posted by *ebduncan*
> 
> why are people saying it has no overclock room?
> 
> the reviewers cannot adjust the voltage yet, so of course there isn't going to be any real gains in the overclocking department until voltage control is out.
> 
> I'd fully expect them to reach 1300mhz or so with voltage control.


Why, you ask?
Rhetorical question, I'm sure.


----------



## Alatar

Quote:


> Originally Posted by *Vesku*
> 
> Does anything measure it at the physical output level like Nvidia's FCAT though? Fraps had frametime capture well before FCAT but it's done at the driver level and so is actually a bit "fuzzy" compared to the actual video output.


I was under the assumption that rivatuner basically had a 3rd party version of all of NV's capabilities?


----------



## rt123

Quote:


> Originally Posted by *undeadhunter*
> 
> That's the sad part of it... people bad-mouth it and are sooo disappointed, yet meanwhile it's sold out... LOL. That's why companies still pull this crap off; people will buy it regardless of fails like no HDMI 2.0, less RAM than the competition, and being late. Hype attracts the blind sheep.


People still buy 970s. Kind of the same thing.


----------



## Casey Ryback

Quote:


> Originally Posted by *rt123*
> 
> People still buy 970s. Kind of the same thing.


hear hear............


----------



## blue1512

Quote:


> Originally Posted by *undeadhunter*
> 
> That's the sad part of it... people bad-mouth it and are sooo disappointed, yet meanwhile it's sold out... LOL. That's why companies still pull this crap off; people will buy it regardless of fails like no HDMI 2.0, less RAM than the competition, and being late. Hype attracts the blind sheep.


Or maybe you are the blind one.
Why are AMD fans disappointed about FuryX when it's mostly ahead of 980Ti in non Gameworks titles?


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> Yup, they definitely dropped the ball on this one, they dind't learn anything from the bulldozer fiasco. If you create a lot of hype, you gotta deliver, if not people are going to be pissed.


AMD didn't create any hype on this card. The internet did. Bulldozer was a different story.


----------



## MapRef41N93W

Quote:


> Originally Posted by *blue1512*
> 
> Or maybe you are the blind one.
> Why are AMD fans disappointed about FuryX when it's mostly ahead of 980Ti in non Gameworks titles?


Really? Because the graphs show a different story, where the Fury X is only ahead in AMD titles like FC4. Meanwhile, 1500-core 980 Tis are absolutely tearing the Fury a new one in any title.


----------



## provost

Quote:


> Originally Posted by *rt123*
> 
> People still buy 970s. Kind of the same thing.


Or, maybe it's because of their recent experience with the incumbent...








That's why I am buying the Furyx anyway... Lol


----------



## Forceman

Quote:


> Originally Posted by *Casey Ryback*
> 
> Anyway your expectations were too much if you expected it to 'smoke' the 980ti, that's just ridiculous.


Problem is, that's exactly what some posters have been saying for weeks/months. "Just wait for Fury." "Fury is going to destroy Titan." "Nvidia is trying to grab quick 980 Ti sales because they know Fury is coming." That ridiculous Hunting Titans poster. Etc. All that stuff adds up to create unrealistic expectations.


----------



## SpeedyVT

Quote:


> Originally Posted by *maltamonk*
> 
> Uhmm.....can you elaborate on that please?


I don't think he realizes how hot VRMs get on average. 104C is cool for a VRM.


----------



## carlhil2

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9777989&CatId=7387 At this price, great deal...


----------



## Sleazybigfoot

This is *WITH* the backplate.


This is *WITHOUT* the backplate.


----------



## BinaryDemon

Quote:


> Originally Posted by *Serandur*
> 
> Someone brought up a good point as to why 8 GB Fury Xs might not even be possible. I'm not sure if it's the truth, but the claim is that the interposer is already about as large as it can get and they simply can't fit 8 stacks on it.


Hmmm. I thought, the way it was described, that the interposer could link 2x HBM stacks sitting side-by-side. I didn't think they had to stack the whole thing.


----------



## Assirra

Quote:


> Originally Posted by *ebduncan*
> 
> why are people saying it has no overclock room?
> 
> the reviewers cannot adjust the voltage yet, so of course there isn't going to be any real gains in the overclocking department until voltage control is out.
> 
> I'd fully expect them to reach 1300mhz or so with voltage control.


Maybe it's me, but considering my 980 Strix reaches 1450 without a voltage adjustment, I find 1300 with voltage a bit low.


----------



## Circlemage8

Consider Hexus's review (with 15.15 beta) for The Witcher 3 versus most of the others, who used 15.5 and got much worse performance. Wonder how much that is affecting other results.


----------



## Tivan

Did a little write up what drivers were used in what tests. Why can't they just all use 15.15 c;

TechPowerUp
- *15.5* Beta
PC Perspective
- *15.15* beta
Tom's hardware
- *14.2/15.4*beta
Hardware Canucks
- *15.15*
OC3D
- *15.5*
HardOCP
- *15.15*
Guru3D
?
Bit-Tech
- *15.15* launch driver
Hexus
- *15.15* beta
VMod
- *15.15*?
TechFrag
Not a review.
Hardware FR
?
Techreport
- *15.5* beta
PCWorld
?
PC Gamer
- *15.15*


----------



## Peybol

Here another one!

https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url


----------



## SKYMTL

Quote:


> Originally Posted by *PureBlackFire*
> 
> this card looks very inconsistent. in some titles beating or matching the 980ti and in others falling way behind, where this trend doesn't happen to other AMD cards. drivers? BIOS? hardware problems? by my estimate this card should flat out be 100-120% faster than the R9 285/R9 380 and it is in some titles. but what's making this card so slow in others?


ROP limitation could be some of the reason behind this. Remember, AMD didn't expand their back-end functions alongside their shader count.

Quote:


> Originally Posted by *Kuivamaa*
> 
> Nothing. BF4 is meant to be run with Mantle anyway, I doubt it receives any DX11 care at all for radeons.


What? BF4 had Mantle tacked on months after launch.

Quote:


> Originally Posted by *mav451*
> 
> If we can whittle it down to costs, what would it have cost AMD to make Fury X with 96 or 128 ROPs?


Likely nothing, considering the core still has space taken up by TrueAudio DSPs, a technology that's as good as dead in the water...









Quote:


> Originally Posted by *Olivon*
> 
> According to Damien Triolet, backplate has only aesthetics purpose and is not use for a better cooling (like almost all backplates FYI) :


In that case...ouch.


----------



## Vesku

Quote:


> Originally Posted by *Alatar*
> 
> I was under the assumption that rivatuner basically had a 3rd party version of all of NV's capabilities?


FCAT accuracy can't be mimicked in just software, afaik. It involves analyzing the physical output with a combination of hardware and software. Software based frametimes are getting their information from DirectX which does not 100% correspond to output timings.

http://us.hardware.info/reviews/4164/4/frametime-tests-20-our-take-on-the-latest-developments-nvidia-fcat


----------



## SpeedyVT

Quote:


> Originally Posted by *Tivan*
> 
> Did a little write up what drivers were used in what tests. Why can't they just all use 15.15 c;
> 
> TechPowerUp
> - 15.5 Beta
> PC Perspective
> - 15.15 beta
> Tom's hardware
> - 14.2/15.4beta
> Hardware Canucks
> - 15.15
> OC3D
> - 15.5
> HardOCP
> - 15.15
> Guru3D
> ?
> Bit-Tech
> - 15.15 launch driver
> Hexus
> - 15.15 beta
> VMod
> - 15.15?
> TechFrag
> Not a review.
> Hardware FR
> ?
> Techreport
> - 15.5 beta
> PCWorld
> ?
> PC Gamer
> - 15.15


I want them to use Windows 10.


----------



## Alatar

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> AMD didn't create any hype on this card. The internet did. Bulldozer was a different story.


AMD definitely created a lot of hype for this card. Probably the most for any GPU I've ever seen....

-Short "it's coming" teasers weeks before launch
-HBM NDA lift weeks before launch
-Their own E3 presentation
-Fiji unveil at computex
-All the talk about being a premium brand
*-At least one week of waiting before the reviews where the only available info of the card (benches) came from AMD themselves*

If you want to see a card launch with no hype created by the company themselves just look at the 980Ti launch. No public event for the release, no pre-launch announcements, and reviews just popped out of nowhere on a random sunday.

The whole 300 series launch has been massively hyped by AMD themselves as well as the public.


----------



## Serandur

Quote:


> Originally Posted by *BinaryDemon*
> 
> Hmmm. I thought the way it was described that the interposer could link 2x HBM stacks sitting side-by-side. I didnt think they had to stack the whole thing.


I meant that there's not enough space on the interposer to stack them side-by-side and they can't make a bigger interposer.


----------



## Alatar

Quote:


> Originally Posted by *Vesku*
> 
> FCAT accuracy can't be mimicked in just software, afaik. It involves analyzing the physical output with a combination of hardware and software. Software based frametimes are getting their information from DirectX which does not 100% correspond to output timings.
> 
> http://us.hardware.info/reviews/4164/4/frametime-tests-20-our-take-on-the-latest-developments-nvidia-fcat


Nvidia never offered hardware either. Their part of the tech was the overlay that was used with the 3rd party capture cards and ultra fast SSD storage.

And afaik rivatuner has been able to replicate that overlay part (the software) for ages now.


----------



## sugalumps

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> AMD didn't create any hype on this card. The internet did. Bulldozer was a different story.


The dude got up on stage at AMD's press conference and flat out said they brought you the single best GPU. "You asked for the single best GPU, and we have brought you that." That is hype.
Quote:


> Originally Posted by *Forceman*
> 
> Problem is, that's exactly what some posters have been saying for weeks/months. "Just wait for Fury." "Fury is going to destroy Titan." "Nvidia is trying to grab quick 980 Ti sales because they know Fury is coming." That ridiculous Hunting Titans poster. Etc. All that stuff adds up to create unrealistic expectations.


Yup, I remember the people saying that: "oh, if Nvidia are trying to sell the 980 Ti before AMD at such a low price, it's because the Fury is going to destroy it".


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Or maybe you are the blind one.
> Why are AMD fans disappointed about FuryX when it's mostly ahead of 980Ti in non Gameworks titles?


Which benches are you looking at????


----------



## maarten12100

Quote:


> Originally Posted by *carlhil2*
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9777989&CatId=7387 At this price, great deal...


At that price I would buy one. It is 700 euros here though...


----------



## provost

Quote:


> Originally Posted by *carlhil2*
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9777989&CatId=7387 At this price, great deal...


I am on my phone's mobile browser, but I don't see the Furyx?


----------



## Casey Ryback

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Really? Because the graphs show a different story, where the Fury X is only ahead in AMD titles like FC4. Meanwhile, 1500-core 980 Tis are absolutely tearing the Fury a new one in any title.


Shadows of Mordor - the Fury X beats the 980 Ti at 1440p, and both the 980 Ti and Titan X at 4K.

Can't compare OC scores yet, as the Fury doesn't have voltage unlocked; anything else is speculation.


----------



## TheMentalist

Quote:


> Originally Posted by *Alatar*
> 
> AMD definitely created a lot of hype for this card. Probably the most for any GPU I've ever seen....
> 
> -Short "it's coming" teasers weeks before launch
> -HBM NDA lift weeks before launch
> -Their own E3 presentation
> -Fiji unveil at computex
> -All the talk about being a premium brand
> *-At least one week of waiting before the reviews where the only available info of the card (benches) came from AMD themselves*
> 
> If you want to see a card launch with no hype created by the company themselves just look at the 980Ti launch. No public event for the release, no pre-launch announcements, and reviews just popped out of nowhere on a random sunday.
> 
> The whole 300 series launch has been massively hyped by AMD themselves as well as the public.


I still remember their awful space countdown hype; it turned out to be just a mediocre APU... so sad...


----------



## Vesku

Quote:


> Originally Posted by *Alatar*
> 
> Nvidia never offered hardware either. Their part of the tech was the overlay that was used with the 3rd party capture cards and ultra fast SSD storage.
> 
> And afaik rivatuner has been able to replicate that overlay part (the software) for ages now.


You don't seem to know much about how FCAT was distributed to review sites by Nvidia, nor how it works:
Quote:


> A number of US sites including Tech Report, PC Perspective, Anandtech and Tom's Hardware Guide received extremely fancy *hardware* and software from Nvidia to try out the new method.


http://us.hardware.info/reviews/4164/4/frametime-tests-20-our-take-on-the-latest-developments-nvidia-fcat

I haven't heard about anyone recreating FCAT capabilities with standard video capture hardware.


----------



## Ashura

Quote:


> Originally Posted by *Alatar*
> 
> If you want to see a card launch with no hype created by the company themselves just look at the 980Ti launch. No public event for the release, no pre-launch announcements, and reviews just popped out of nowhere on a random sunday.


AMD should've done this with the 3xx series about a year back...


----------



## AlvaJonathan

Hi,
Some Indonesian Fury X review and a bit of overclocking test here.

But for more important stuff: some said that the Fury X is missing voltage control. With really minimal knowledge of the VRM controller, I'm guessing that it is similar to what we found on the reference R9 290X, the IR IR3567B.
I'm using MSI Afterburner to probe the controller and do an i2c dump on it.

i2c dump




Probing i2c bus 6, device 30, register 92 gives me this:


Then I tried adding some settings in the MSI Afterburner hardware profile (like the one I used to have in Afterburner for the R9 290X):


And there's the voltage option:


But at this point it didn't change anything yet; it actually makes things worse, because when you try to set the voltage the GPU behaves 'strangely', and sometimes gets 'stuck' in the power-saving P-state (300 MHz GPU).







Maybe my settings are wrong and I need to use the 'third party' voltage control support in recent Afterburner versions.

I also cannot measure the voltage on the card directly, as I'm forbidden to open it.

Hopefully someone can soon confirm whether the R9 Fury X is indeed using the same controller as the R9 290X, so we can program the voltage using tools like Afterburner in the future.
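For anyone following along, here's a minimal sketch of the voltage math involved, assuming the controller uses the commonly cited AMD SVI2 VID encoding (1.55 V base, 6.25 mV per step). Whether the Fury X's controller actually uses this encoding is exactly the open question above, so treat it as illustrative only.

```python
# Hedged sketch: decode/encode SVI2-style VID bytes, like the values an
# i2c dump of a voltage controller register would show in Afterburner.
# Assumption: V = 1.55 V - VID * 6.25 mV (common AMD SVI2 scheme); the
# Fury X controller may well differ.

def vid_to_volts(vid: int) -> float:
    """Convert an 8-bit VID code to volts."""
    if not 0 <= vid <= 0xFF:
        raise ValueError("VID must fit in one byte")
    return round(1.55 - vid * 0.00625, 5)

def volts_to_vid(volts: float) -> int:
    """Inverse: nearest VID code for a target voltage, clamped to one byte."""
    vid = round((1.55 - volts) / 0.00625)
    return max(0, min(0xFF, vid))

# Example: a hypothetical register reading of 0x44 (68 decimal)
print(vid_to_volts(0x44))   # 1.125
print(volts_to_vid(1.2))    # 56
```

If the controller turns out to be something other than an SVI2-speaking IR part, only the constants change; the register-probing workflow shown above stays the same.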


----------



## CasualCat

Quote:


> Originally Posted by *Circlemage8*
> 
> Considering Hexus's review (with 15.15 beta) for Witcher 3 versus most of the others who use 15.5 and have much worse performance. Wonder how much that is effecting other results.


It is also setting differences. They're using high presets others are using ultra. Compare Hardware Canucks for example which uses the same driver but higher settings.

edit: Also Tivan has a nice summary of the reviews and what driver version they're using. Maybe OP will add that.


----------



## undeadhunter

Quote:


> Originally Posted by *provost*
> 
> Thanks. I put auto notify on the Sapphire. Not sure why Asus and some others (mfg I can recognize.. lol) are *more than the msrp*.


That's common practice on Newegg with fresh releases (price gouging).


----------



## Cool Mike

Was going to order from Tiger. Says it will ship in 7-10 days.


----------



## PureBlackFire

Quote:


> Originally Posted by *Ashura*
> 
> Forbes - Radeon R9 Fury X Review
> http://www.forbes.com/sites/jasonevangelho/2015/06/24/amd-radeon-r9-fury-x-review-amd-at-their-best/
> Quote:
> 
> 
> 
> I recently had a conversation with AMD CEO Dr. Lisa Su, and I asked what her primary goal was when she first assumed leadership of the company. "It's to clarify our mission," she said confidently. "AMD has often been defined by other people. We're defined by much larger competitors. My goal is to say 'this is what we're going to do.' With the technology we have, it's about how we differentiate ourselves in the market. The idea of being the second guy and clawing for a little bit of share isn't what you build a sustainable company around."
Click to expand...

At least the CEO has her head on straight. Let's see if this can have a positive impact going forward. Personally, I'm tired of AMD's bungling of everything they possibly can. The GPU is clearly being bottlenecked somewhere, whether it's on the back end or the 4GB frame buffer. The lack of day-one voltage control, and possibly of a decent driver, is inexcusable at this point.

The I/O having no legacy DVI-I support (one of the reasons I switched to a 970 from a 290; I couldn't use one of my monitors on the 290) and no HDMI 2.0 (which I currently need for 4K gaming) is just pushing me to stay with Nvidia, whose cards I've had almost nothing but problems with, and I ain't giving them any more of my money. Guess I'm gonna be a console gamer by the fall. Back to sub-1080p and 30 fps with dips.


----------



## Bandalo

They hunted, they found one, and it beat them.


----------



## Alatar

Quote:


> Originally Posted by *Vesku*
> 
> You don't seem to know much about how FCAT was distributed to review sites by Nvidia nor how it works:
> http://us.hardware.info/reviews/4164/4/frametime-tests-20-our-take-on-the-latest-developments-nvidia-fcat


Yes, Nvidia sent some of the hardware to the sites, but it's not Nvidia hardware, it's 3rd party hardware. Anyone can technically order the stuff. The hardware (storage and capture card) isn't offered by NV as some sort of product. All they did was buy the stuff and ship it out to reviewers.

The only actual Nvidia part of FCAT measuring was the software, which should now be functional as a part of different 3rd party software suites.
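For those unfamiliar with how the FCAT analysis side works: the overlay tints a thin bar on every rendered frame with a repeating colour sequence, and the analysis walks the captured video scanline by scanline, turning colour-band lengths into per-frame display times. A toy sketch of that idea (the colour names and the 60 Hz / 1080-line capture parameters are illustrative assumptions, not the real tool's values):

```python
# Minimal sketch of the FCAT analysis idea: every colour change along the
# captured overlay column marks a new frame, and the number of scanlines a
# colour occupies is proportional to that frame's on-screen time.

REFRESH_HZ = 60
LINES_PER_FRAME = 1080  # scanlines per captured frame (assumed 1080p capture)

def frame_times_ms(column):
    """column: per-scanline overlay colours from the captured video.
    Returns the display time of each frame in milliseconds."""
    ms_per_line = 1000.0 / REFRESH_HZ / LINES_PER_FRAME
    times, run = [], 1
    for prev, cur in zip(column, column[1:]):
        if cur == prev:
            run += 1
        else:
            times.append(run * ms_per_line)
            run = 1
    times.append(run * ms_per_line)
    return times

# Synthetic capture: three frames occupying 540, 1080 and 540 scanlines
column = ["red"] * 540 + ["lime"] * 1080 + ["blue"] * 540
print([round(t, 2) for t in frame_times_ms(column)])  # [8.33, 16.67, 8.33]
```

The Nvidia-supplied part is essentially the overlay plus scripts like this; the expensive bit reviewers needed was the lossless capture card and storage to record the output.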


----------



## MunneY

They aren't in stock yet at Amazon but they are coming today, I confirmed with a rep.

Sapphire - $649 - http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-21246-00-40G/dp/B01012TLSS

Gigabyte - $669 - http://www.amazon.com/Gigabyte-FURY-4096-Graphics-GV-R9FURYX-4GD-B/dp/B0106B8UAY/

XFX - $679 - http://www.amazon.com/XFX-RADEON-Graphics-Cards-R9-FURY-4QFA/dp/B0106IJXX0/

VisionTek - $699 - http://www.amazon.com/VisionTek-Radeon-Express-Graphics-900814/dp/B010A7V6KA/


----------



## SKYMTL

Quote:


> Originally Posted by *SpeedyVT*
> 
> I want them to use Windows 10.


Sure. Just make sure you ask AMD nicely to send us reviewers a Windows 10 driver for Fury.....which doesn't exist right now.
Quote:


> Originally Posted by *Casey Ryback*
> 
> shadows of mordor - Fury X beats 980ti at 1440p, and both 980ti and titan X at 4K.


In the in-game benchmark yes. In the game itself, no.


----------



## Circlemage8

Quote:


> Originally Posted by *CasualCat*
> 
> It is also setting differences. They're using high presets others are using ultra. Compare Hardware Canucks for example which uses the same driver but higher settings.


Assuming Hexus and Hardware Canucks were using the same settings for their Fury vs 980 Ti tests, that just raises the question of which setting within the Ultra preset causes the Fury to fall behind at lower resolutions.


----------



## Kane2207

Quote:


> Originally Posted by *SpeedyVT*
> 
> I want them to use Windows 10.


You can't keep moving the goal posts:

They should be using an unreleased OS
Wait for the drivers
DX12 is going to change everything in AMD's favour

What really matters is the here and now. I'm not looking to base my purchase on performance that _might_ be achieved several months down the line; I want to know what I can buy _right now_, and I would guess many PC builders feel exactly the same.


----------



## rainzor

Quote:


> Originally Posted by *Tivan*
> 
> Did a little write up what drivers were used in what tests. Why can't they just all use 15.15 c;


TPU used 15.15 for sure. I don't think 15.5 supports the 300 series and Fury cards.


----------



## undeadhunter

Quote:


> Originally Posted by *edo101*
> 
> or you know they don't need HDMI 2.0 or 4k, and they are fine with the performance they get granted their will be better drivers and OC tools later. But hey, everybody has to think like you right?


No need to be like me; enjoy spending your cash on whatever fills your needs (even if that means paying the same or close to it for less, right? lol). To each their own, I guess.


----------



## CrazyElf

I'm disappointed but not surprised. I think that AMD will have to lower the price of the Fury X soon.

Overall we are seeing:


- At 2560x1440, it is about 7-8% slower than a 980 Ti
- At 3840x2160, it is about 2-4% slower than a 980 Ti
- They've made huge gains in power efficiency with this architecture, but it's still 15-20% less power efficient than a 980 Ti; relative to the 290X vs the 780 Ti, not much has changed here

We'll see if future drivers can improve things. Historically, AMD drivers have done a better job of "aging" the card than Nvidia's, as the 680 vs 7970 and 780 Ti vs 290X have demonstrated.

Since the 290X, AMD's Crossfire also seems to scale better than Nvidia's SLI, so we might see the Fury X win at 4K there.

The big problem I see is that although the massive bandwidth does seem to improve things at 4K, the fact that the card only has 4GB of VRAM puts it in a bad spot: even if drivers push the performance over the 980 Ti and Titan X, you might find yourself running out of VRAM before the shader cores run out of headroom to deliver a good enough frame rate. Crossfire of course also faces these issues.

Edit:
A quick search of the reviews and I'm alarmed at the temperature of the VRMs too: >100°C? Can anyone confirm this?

I think in that case, that could seriously affect the lifespan of the card.

I should also mention that AMD has floated the idea of physically pooling VRAM across GPUs. If that's true, that would be a huge step forward. At this point, though, that has not happened.


----------



## DEW21689

I will still wait to see the Fury Nano (I like SFF Systems) but man I was expecting a lot more... I guess it isn't bad, just not at all what I had expected.

Correct me if I'm wrong, but didn't they make a big fuss in some video that these cards had 1.5x the performance per watt of the R9 290X? Unless it's just that early and I'm being that bad at math, these don't seem even remotely close to 1.5x the performance per watt, which makes me highly doubt that the Nano will hit the 2x performance per watt they claimed...
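The claim is easy to sanity-check with back-of-the-envelope math. A hedged sketch; the FPS and wattage figures below are illustrative round numbers, not measured review data:

```python
# Quick sanity check of a "1.5x performance per watt" marketing claim.
# All inputs are illustrative placeholders, not review measurements.

def perf_per_watt_gain(fps_old, watts_old, fps_new, watts_new):
    """Ratio of new perf/watt to old perf/watt."""
    return (fps_new / watts_new) / (fps_old / watts_old)

# e.g. ~30% more FPS at 4K while drawing slightly less board power
gain = perf_per_watt_gain(fps_old=40, watts_old=290, fps_new=52, watts_new=275)
print(round(gain, 2))  # 1.37 -- short of a claimed 1.5x
```

With numbers in that ballpark the gain lands well under 1.5x, which matches the gut feeling above; plug in any review's actual FPS and power draw to check properly.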


----------



## Blameless

Quote:


> Originally Posted by *maltamonk*
> 
> Uhmm.....can you elaborate on that please?


The only image I've seen posted of the back of the Fury X's PCB while under load shows ~104C in the vicinity of the VRMs.

I wanted to know if the backplate was responsible for cooling the VRM, or if it was merely concealing them from the IR images in other reviews. TPU has images of the backplate that show there is no connection between the VRM and the backplate, which implies the latter.

The backplate does not seem to contribute to cooling at all, it just hides the hot components on the rear of the card.
Quote:


> Originally Posted by *SpeedyVT*
> 
> I don't think he realizes how hot vrms get on average. 104 is cool for a VRM.


I'm well aware of how hot VRMs can get and well aware that most are rated for 125-150C (depending on current demands...more current, lower the temp needs to be).

And 104C is _not_ cool for a VRM that has a copper pipe filled with flowing water attached to it. It's quite a bit higher than I'd expect, and worse than the reference air cooler/plate on prior generations.

In the same gaming scenarios, the VRM on a reference 290X won't even reach 80C. Some of the non-reference air coolers creep into the 100C range under gaming loads.

A full cover waterblock would have kept the VRM within 10C of core temperature. 104C is quite hot by comparison, and does not leave much headroom if you are going to be OCing and overvolting these parts.


----------



## Tivan

Kinda sad that of the video reviews, only PC Perspective and Hardware Canucks had the 15.15 driver going.

As for the cooling issues, it's definitely not a good design to have it run at 100°C on the back of the card. I wonder if they caught a poorly mounted sample or if this is by design.


----------



## carlhil2

Quote:


> Originally Posted by *provost*
> 
> I am on my phone's mobile browser, but I don't see the Furyx?


----------



## Kane2207

Quote:


> Originally Posted by *Bandalo*
> 
> 
> 
> They hunted, they found one, and it beat them.












Just wow, someone really needs to take AMD's marketing department out back like Old Yeller. Everybody (including AMD) would be better for it...


----------



## MerkageTurk

Titan x it is then


----------



## Offler

I really don't understand why AMD hasn't increased the number of ROPs.

The number of GCN cores increased, and TMUs increased, but not ROPs.

Nvidia has gone the exact opposite way: they stripped down the core count but increased raw fill rate performance with more ROPs and higher-clocked GDDR5 memory.

I guess Nvidia's products will age faster in newer games as the number of shaders used keeps increasing; it's slightly visible in the 4K tests done at Hardware Canucks. But even a mere 72 ROPs instead of 64 would have helped beat the Titan X.
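Peak pixel fill rate is just ROPs × core clock, which makes the point easy to quantify. A quick sketch (the clocks are nominal reference boost clocks, so treat the results as rough):

```python
# Peak pixel fill rate = ROPs x core clock. Shows why Fury X's 45% extra
# shaders buy almost no extra fill rate over the 290X: the ROP count is
# unchanged and only the clock moved.

def gpix_per_s(rops, mhz):
    """Peak pixel fill rate in Gpixels/s."""
    return rops * mhz / 1000.0

cards = {
    "R9 290X":    gpix_per_s(64, 1000),   # 64.0
    "Fury X":     gpix_per_s(64, 1050),   # 67.2 -> only +5% over 290X
    "GTX 980 Ti": gpix_per_s(96, 1075),   # 103.2
}
print(cards)
```

That ~5% fill-rate bump over the 290X lines up with how ROP-bound workloads scale, while the 980 Ti's 96 ROPs give it a much larger raw fill-rate budget.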


----------



## SpeedyVT

Quote:


> Originally Posted by *Mozz13*
> 
> LTT youtube review is up. https://www.youtube.com/watch?v=-CDFNOTZy8o


I like their review; they must know that Windows 10 makes some significant performance difference, to hold off on "the most official review ever".


----------



## TheMentalist

What I'm wondering now is what nVidia's response to the Fury-X will be. Not that they need one right now...


----------



## Klocek001

I like the HardOCP review the most. It is tough, but fair. The short version of today's reviews would be:

But I still have hope that AMD will improve the Fury X with drivers as time goes by. It just goes to show how good the 980 Ti is in DX11 titles.


----------



## th3illusiveman

Nice try from AMD, but they were too late and a little too slow.

At 4K the card gives the Ti and TX some competition, but at lower resolutions it gets in serious trouble. This is most likely caused by AMD's DX11 driver overhead issue. Nvidia has substantially lower DX11 driver overhead, which is very important in achieving high frame rates, and since lower resolutions produce higher FPS than higher ones, the Fury is slower there. I have a feeling that once DX12 games launch this card will get more competitive at lower resolutions, but right now that doesn't help.

The $650 price tag was a mistake, but with the included water cooler I don't think they had much choice in the matter. The cooler alone adds a substantial cost to the product, and pricing any lower would mean poor margins, but again it seems like they will need to drop the price.

It's a shame really, but we have been here before. The GTX 680 was slightly faster than the 7970, the GTX 780 Ti was slightly faster than the 290X, and now the GTX 980 Ti is slightly faster than the Fury. But look what happened to the GTX 680 and 780 Ti: they are both slower than their AMD counterparts now, and I'm sure the same will happen to the Ti in the future.

It's an interesting product and HBM is neat, but 28nm is officially played out and 14nm cards will make these look like dinosaurs.


----------



## Blameless

Quote:


> Originally Posted by *Sleazybigfoot*
> 
> 
> This is *WITH* the backplate.
> 
> 
> This is *WITHOUT* the backplate.


Yes, this is what you'd expect to see by blocking the VRM with a plate. I could achieve a similar effect by setting a sheet of cardboard on the back of the GPU, or just holding my hand in front of my IR camera.

The VRM itself is probably running warmer with the plate, but since the plate is not in direct contact with the VRM, the plate is considerably cooler.


----------



## ssgtnubb

Placeholders on the egg


----------



## Clocknut

I find it rather strange that Nvidia didn't rate the Titan X at a 300 W TDP with closed-loop cooling, use 8 GHz GDDR5, and clock the GPU at 1200-1300 MHz out of the box. That would easily justify the $999 price tag and kill the Fury X.


----------



## Tivan

Quote:


> Originally Posted by *SpeedyVT*
> 
> I like their review, they must know that Windows 10 gives some significant performance difference to hold off on the most official review ever.


And they used graphs from techreport which were using 15.5 driver.
Hope they make a review of their own sometime, either way!


----------



## sugalumps

Quote:


> Originally Posted by *Kane2207*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wow, someone really needs to take AMDs marketing department out the back like Old Yeller. Everybody (including AMD) would be the better for it...


That was a fan-made one, but they did have their own slogans as well: "Don't just upgrade. Revolutionize with the Fury X". You can put in all the fancy revolutionary memory you want, but if it doesn't beat your competition at the same price, what's the point?


----------



## Kane2207

Quote:


> Originally Posted by *TheMentalist*
> 
> What I'm wondering now is what nVidia's response to the Fury-X will be. Not that they need one right now...


Something like this?


----------



## CasualCat

Quote:


> Originally Posted by *TheMentalist*
> 
> What I'm wondering now is what nVidia's response to the Fury-X will be. Not that they need one right now...


TBH I don't think they need one unless AMD lowers pricing. I do still think the 980 is overpriced though, but that really has nothing to do with FIJI's release.


----------



## rt123

Quote:


> Originally Posted by *AlvaJonathan*
> 
> Hi,
> Some Indonesian Fury X review and a bit of overclocking test here.
> 
> But for more important stuff, some said that Fury X is missing voltage control. With really minimum knowledge of the VRM Controller, I'm trying to guess that the controller is similar to what we found on R9 290X reference, the IOR IR3567B.
> I'm using MSI Afterburner to probe the controller and doing some i2c dump for it.
> 
> i2c Dump
> 
> 
> 
> 
> probing the i2c bus 6, device 30, register 92 give me this:
> 
> 
> Then, I tried adding some settings in the msi afterburner hardware profile (like the on I used to have on afterburner for R9 290X)
> 
> 
> And there's the voltage option:
> 
> 
> But at this point it didn't change anyhing yet, it makes it worse because when you tried to set the voltage, the GPU behaves 'strangely', and sometimes 'stuck' in the power saving pState (300Mhz GPU)
> 
> 
> 
> 
> 
> 
> 
> , I also cannot measure the volt on the card as I'm forbidden to open it.
> 
> Hopefully soon someone can confirm if the R9 Fury X indeed using same controller as R9 290X, and we can program the voltage using tools like afterburner in the future.


How well does it bench with Tess off?
It seems to do pretty well with Tess on.

Please post some results on the BOT when you get the time.


----------



## Alatar

Quote:


> Originally Posted by *Kane2207*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wow, someone really needs to take AMDs marketing department out the back like Old Yeller. Everybody (including AMD) would be the better for it...


The slide was 100% fake. Can't blame that one on AMD at all even if AMD did create a lot of hype.

Quote:


> Originally Posted by *TheMentalist*
> 
> What I'm wondering now is what nVidia's response to the Fury-X will be.


Rolling in money


----------



## Xuper

http://www.hardocp.com/images/articles/14351085919S0HOOZkGA_8_2.gif

I really want to know why the Fury X is only 7% faster than the 290X despite having 45% more cores than the R9 290X. Just technically, what causes this bottleneck?

Is it bad drivers, low peak pixel fill rate and peak rasterization rate (compared to the GeForce 980 Ti), unoptimized game engines, or does BF4 just love Maxwell... etc.?

Look at this chart :

http://techreport.com/review/28499/amd-radeon-fury-x-architecture-revealed


----------



## Vesku

Quote:


> Originally Posted by *Alatar*
> 
> Yes Nvidia sent some of the hardware to the sites but it's not Nvidia hardware but 3rd party hardware. Everyone can technically order the stuff. The hardware (storage and capture card) isn't offered by NV as some sort of product. All they did was buy the stuff and ship them out for reviewers.
> 
> The only actual Nvidia part of FCAT measuring was the software part that should now be functional on as a part of different 3rd party software suites.


I see now that the key is that it's a professional capture card; Nvidia sent reviewers the $1500-2000 capture card and other assorted bits. Why hasn't Nvidia updated it for 4K? Guru3D said the capture card is capable: "Single Channel 4 lane PCI Express bus with maximum data rate of 650MB/sec and support for a maximum canvas of 4kx4k HD video"

http://www.guru3d.com/articles_pages/fcat_benchmarking_review,4.html


----------



## provost

Quote:


> Originally Posted by *carlhil2*


Cool, but I've never heard of VisionTek; maybe they're more mainstream as an AMD AIB?


----------



## carlhil2

Quote:


> Originally Posted by *Cool Mike*
> 
> wAS GOING TO ORDER from Tiger. sAYS WILL ship in 7-10 days


And, you are saving at LEAST a hundred dollars..


----------



## Blameless

Quote:


> Originally Posted by *TheMentalist*
> 
> What I'm wondering now is what nVidia's response to the Fury-X will be. Not that they need one right now...


A $10 price increase to the 980Ti.
Quote:


> Originally Posted by *Xuper*
> 
> http://www.hardocp.com/images/articles/14351085919S0HOOZkGA_8_2.gif
> 
> I really want to know Why Fury Is 7% faster than 290x despite having 45% core over R9 290x? Just technically , What cause this Bottleneck?


Fury X has 5% more pixel fill rate than the 290X.
Quote:


> Originally Posted by *provost*
> 
> Cool, but never heard of vision tek, may be more mainstream as an AMD AIB?


Nothing wrong with VisionTek, well, not compared to any other AIB really.


----------



## kingduqc

Sorry AMD.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Clocknut*
> 
> I find it rather strange that why Nvidia didnt go rated Titan X @ 300w tdp with close loop cooling, use 8GHz GDDR5 and clock the GPU @ 1200-1300MHz out of the box. That would easily justify the $999 price tag and kill Fury X.


Because the Titan X released in March while Fury came out in June? Because the Titan X doesn't even need any of that to just be outright faster than Fury X along with having triple the VRAM. Fury isn't even faster than a 980ti and you expect NVIDIA to cut their margins on a Titan for what exactly?


----------



## Offler

Quote:


> Originally Posted by *sugalumps*
> 
> That was a fan made one, but they did have their slogans aswell. "Don't just upgrade. Revolutionize with the fury x". You can put all the fancy revolutionary memory you want in but if it doesnt beat your competition at the same price whats the point.


I would wait for the next drivers (about 6 months out). If they increase the performance of the card by 5-7 percent, which is a realistic expectation, it would be a really nice competitor to the 980 Ti.

The question is how much 4K matters to gamers.


----------



## p4inkill3r

Quote:


> Cool, but never heard of vision tek, may be more mainstream as an AMD AIB?


VisionTek is a great company.


----------



## GMcDougal

I put a lot of weight on HardOCP reviews and wanted to wait and see what they said before making this statement... but the Fury X is borderline a flop in its current state. I'm not really sure what to think right now. Unless AMD can come up with a driver to really increase performance, and I mean quickly, then this is gonna be a major flop... at least at its current price.


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> That was a fan made one, but they did have their slogans aswell. "Don't just upgrade. Revolutionize with the fury x". You can put all the fancy revolutionary memory you want in but if it doesnt beat your competition at the same price whats the point.


That's the whole point: they didn't lie about anything in that slogan.

You are revolutionizing with HBM memory; they never said it would be faster than Nvidia's flagship.

It is faster than their previous cards, though.


----------



## Kane2207

Quote:


> Originally Posted by *sugalumps*
> 
> That was a fan made one, but they did have their slogans aswell. "Don't just upgrade. Revolutionize with the fury x". You can put all the fancy revolutionary memory you want in but if it doesnt beat your competition at the same price whats the point.


Quote:


> Originally Posted by *Alatar*
> 
> The slide was 100% fake. Can't blame that one on AMD at all even if AMD did create a lot of hype.


Ah, I stand corrected then - thanks









I still stand by my statement that AMD would be no worse off without their marketing department though


----------



## maltamonk

Quote:


> Originally Posted by *Blameless*
> 
> Yes, this is what you'd expect to see by blocking the VRM with a plate. I could achieve a similar effect by setting a sheet of cardboard on the back of the GPU, or just holding my hand in front of my IR camera.
> 
> The VRM itself is probably running warmer with the plate, but since the plate is not in direct contact with the VRM, the plate is considerably cooler.


They have a copper pipe that is used for it. Check out the 14:30 mark in oc3d's video


----------



## SlackerITGuy

Just checked most of the reviews out there.

So the card ended up being significantly slower than the GTX 980 Ti; it wins in some cases (Crysis 3 and Metro Last Light) by a very small margin, but on average it loses by a good ~10 to 15% vs the GTX 980 Ti.

*Not only that, and maybe it was just me, but I was expecting some sort of new driver optimization feature that would leverage the advantages of HBM vs GDDR5, maybe improving Min FPS? maybe doing some sort of clever caching to improving stuttering in non low level API games?*

I don't know, maybe I was expecting a lot more from this release, but to not come closer to the GTX 980 Ti is definitely a huge loss, at $650 I don't know who would pick this up instead of a GTX 980 Ti to be honest (I understand you get liquid cooling and HBM, but at the end of the day performance is performance). Overclocking ability was also a huge question mark, and it ended up disappointing in that regard as well.

It excels at build quality and thermals, I will give AMD that, and like both Tahiti and Hawaii it will probably age extremely well.

I was really hoping it would beat GTX 980 Ti, not because I wanted AMD at the top (I could not care less about that) but because I wanted a fierce competition at that price range, maybe forcing one party to lower prices? but as it stands now that will certainly not be the case.

NVIDIA is going to take their sweet, sweet time with Pascal, as they should after this release, and that's not a good thing for those of us playing the upgrade waiting game. Hopefully 14/16nm + HBM2 is damn worth it.


----------



## BoredErica

Quote:


> Originally Posted by *TheMentalist*
> 
> What I'm wondering now is what nVidia's response to the Fury-X will be. Not that they need one right now...


It's the 980ti and 980 price drop. Preemptive but they already knew what they were up against.


----------



## MerkageTurk

This may be the same scenario as before

I.e. 7970 slower than a 680 and now on par with 780


----------



## Bandalo

Quote:


> Originally Posted by *kingduqc*
> 
> 
> 
> Sorry AMD.


You should be the sorry one. You're stuck with owning Batman: Arkham Knight.


----------



## staryoshi

If you were to make me choose between the Fury X and a 980 Ti, I'd pick the 980 Ti. However, I do see the Fury X as an interesting product. Not enough to earn my purchase, but I am not disappointed with the launch.
Quote:


> Originally Posted by *Sleazybigfoot*
> 
> 
> This is *WITH* the backplate.
> 
> 
> This is *WITHOUT* the backplate.


Backplates do not assist in cooling any components except for _perhaps_ exposed memory modules if they were to make contact with the backplate. They are purely aesthetic. That card is still roasting underneath the backplate, as you can see by blips of red on the edges of the backplate. (The heat is not transferred from the back of the card to the backplate as there is a gap between the plate and the PCB)


----------



## pompss

I'm pretty sure my old GTX 980 Strix with the modded BIOS will be the same as or even faster than the Fury X.
Disappointing after all the waiting.
When the Kingpin, HOF, and Strix GTX 980 Ti come out, it's game over for AMD.


----------



## kingduqc

Quote:


> Originally Posted by *Bandalo*
> 
> You should be the sorry one. You're stuck with owning Batman: Arkham Knight.










giving it away or selling off cheap for sure...


----------



## Casey Ryback

It doesn't matter what brand you buy your Fury from; they are all the same card lol.

VisionTek, McDonald's-brand Fury X, whatever.

Warranty may differ though.


----------



## MerkageTurk

Not really; how about customer service or warranty?


----------



## BoredErica

Quote:


> Originally Posted by *Casey Ryback*
> 
> It doesn't matter what brand you buy your fury from they are all the same card lol.
> 
> visiontek, mcdonalds brand fury X whatever.
> 
> warranty may differ though


I would love a Mcdonalds Fury X.


----------



## sugalumps

Quote:


> Originally Posted by *Casey Ryback*
> 
> That's the whole point, they didn't lie about anything in that comment.
> 
> You are revolutionizing with HBM memory, they never said it would be faster than nvidia's flagship.
> 
> It is faster than their previous cards though.


Again, once more for those who apparently missed my post and the press conference: the guy from AMD got up on stage and said "you wanted the single fastest card and we brought you that". They hyped.

Quote:


> Originally Posted by *MerkageTurk*
> 
> This may be the same scenario as before
> 
> I.e. 7970 slower than a 680 and now on par with 780


Ye m8 you only need two years for that to happen though.................


----------



## provost

Quote:


> Originally Posted by *p4inkill3r*
> 
> VisionTek is a great company.


Got it. Went to order on the Tiger Direct site (it does seem like a good price), and then saw this small qualifier about a 7-21 day expected shipping timeframe... lol.
Sounds like a pre-order, which I tend never to do... will wait for others.


----------



## th3illusiveman

Quote:


> Originally Posted by *GMcDougal*
> 
> I put a lot of weight into HardOCP reviews and wanted to wait and see what they said before I said this statement....but this Fury X is borderline a flop in its current state. I'm not really sure what to think right now. Unless AMD can come up with a driver to really increase performance, and I mean quickly then this is gonna be a major flop....Atleast at its current price.


The HardOCP review is one of the worst I've seen for the Fury. In other reviews it's much closer to the 980 Ti than it is on their site. It's been like that for most AMD cards too.


----------



## Rickles

Quote:


> Originally Posted by *Shadymort*
> 
> Seems to me the Fury X is exactly where it was rumored to be: in the same market bracket as a gtx 980ti , winning some benchmarks and losing in others, and performing better at 4K than 1440p and lower compared to the other cards. So it's hard to be disappointed at this point. It may not be the hyped monster some people wanted it to be, but it's a quite good card performing in the ballpark of the gtx 980ti and priced accordingly.
> 
> I concur with some reviewers and users on the forum that the performance consistency is pretty odd. Some reviews note that for older titles (Metro 2033, etc), which are more optimized on the driver side on both nvidia and amd cards, the performance of the Fury X is better and more consistent, often nearing the R9 295X2 level; but in newer titles, which are less optimized, the performance is definitely odd, as even the gtx 980 seems to do better than the Fury X (which is strange, given the near R9 390X performance). While i don't expect miraculous drivers, i am pretty sure some of those results are caused by drivers not mature enough, rather than other causes.
> 
> All those benchmarks made me more curious about the Fury Nano performance. I expect a product witch the same or better performance as the gtx 980, and given the information we have on the Fury Dual, i bet the Nano chip will be the basis of this dual card. Once we know its performance, we can definitely estimate how far the dual card of this generation can push the performance limit
> 
> 
> 
> 
> 
> 
> 
> .
> 
> P.S.: One thing is for sure, the R9 295X2 is definitely one monster of a card.


IMO the Fury Nano is going to be at or below the performance of the 290X. You have to remember the Nano is marketed at 2x the perf/watt and is a low-wattage card, and 2x the perf/watt of the 390X (@1080p) is something the *GTX 960* can also claim. I really don't think it's going to be touching the 980.


----------



## eXe.Lilith

Pretty sure everybody here dissing the card and how it stands vs the 980 ti will be sorry once people start pushing these to their limits.

Personally I got only one thing to say:



Gief EKWB blocks nao!


----------



## SpeedyVT

Quote:


> Originally Posted by *Darkwizzie*
> 
> I would love a Mcdonalds Fury X.


You mean McDonalds Flury X.


----------



## Tivan

Quote:


> Originally Posted by *sugalumps*
> 
> Again, once again, for those who missed my post and the press conference, apparently. The guy from AMD got up on stage and said "you wanted the single fastest card and we brought you that". They hyped.
> Ye m8 you only need two years for that to happen though.................


Maybe he meant the dual Fiji = D


----------



## ondoy

Fury X CF vs Titan X SLI....


----------



## MerkageTurk

Agents of shield came to mind


----------



## lajgnd

Lol.

Haven't been a fan of AMD since the 9700 pro a decade ago.

Their products are now garbage.

Fury X, hyped as AMD's return to form, *Surprise* is a total flop.

Literally, a day late and a dollar short.

Same price as 980Ti, worse performance... Not even factoring in late driver updates for day 1 games.

At this point if you are on AMD side, well... Dunno what to really say without making some people angry.

But they don't even have an argument on price/performance. Just worse all around. If Fury X was $399 or $499 would be a different story, but it's not. Poor AMD.


----------



## tpi2007

Quote:


> Originally Posted by *Kuivamaa*
> 
> Ok someone to bench BF4 with mantle now, it still is my main game and the reviews tell me absolutely nothing vs my 290X, same for DA:I.


I don't know if this has been addressed already (haven't read the whole thread), so here it goes just in case it hasn't:
Quote:


> Initially, I tested BF4 on the Radeons using the Mantle API, since it was available. Oddly enough, the Fury X's performance was kind of lackluster with Mantle, so I tried switching over to Direct3D for that card. Doing so boosted performance from about 32 FPS to 40 FPS. The results below for the Fury X come from D3D.


http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/11

If I remember correctly the same thing happened to the 285.


----------



## TheMentalist

Yeah, true, nVidia will take their time now for sure. They can lower the prices of the 980/980 Ti whenever they need to.

I still think AMD should have gone all out with this release. They need to work on their drivers (and the release time frame), and on better pacing between card releases (don't know how that would go for them, research is expensive and hard).
The Fury-X is not a bad card, but really not something with impact. It destroyed nothing in the end; the Titan X is still on top and the 980 Ti is beating it on pricing.


----------



## MapRef41N93W

Quote:


> Originally Posted by *eXe.Lilith*
> 
> Pretty sure everybody here dissing the card and how it stands vs the 980 ti will be sorry once people start pushing these to their limits.
> 
> Personally I got only one thing to say:
> 
> 
> 
> Gief EKWB blocks nao!


Pretty sure you're going to be pretty disappointed when your card isn't breaking 1200MHz even with voltage. The card has 14.9M transistors per mm². Dense-die cards never OC well (see: the 290X, which isn't even as dense as Fury).

Meanwhile you could buy GM200 and put it under water and get 1500+ core clocks
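That density figure checks out against the commonly cited die specs (Fiji: roughly 8.9 billion transistors on a ~596 mm² die; GM200: roughly 8.0 billion on ~601 mm²). A quick check:

```python
# Transistor density check using commonly cited die figures.
def density_m_per_mm2(transistors: float, die_area_mm2: float) -> float:
    """Millions of transistors per square millimetre of die area."""
    return transistors / die_area_mm2 / 1e6

fiji = density_m_per_mm2(8.9e9, 596.0)   # AMD Fiji
gm200 = density_m_per_mm2(8.0e9, 601.0)  # NVIDIA GM200

print(f"Fiji:  {fiji:.1f} M/mm^2")   # ~14.9
print(f"GM200: {gm200:.1f} M/mm^2")  # ~13.3
```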


----------



## lacrossewacker

IMO - with the way the Fury was hyped... just matching the 980 Ti would've been a fail. It appears that even THAT is beyond the capabilities of the Fury....

God help the air cooled version released later....


----------



## p4inkill3r

Quote:


> Originally Posted by *ondoy*
> 
> Fury X CF vs Titan X SLI....


Nice!


----------



## Hawk777th

Totally a disappointment. AMD does not have the R&D budget that Nvidia has and can't compete at the same level. They really should have priced this thing a lot lower to gain back some market share. I guess it's a product for AMD fanboys where the hype is real. But I have to laugh at all the people on OCN who said it was going to KILL THE TITAN X. GG NVIDIA! I don't know how many times I had to read that Nvidia had to RUSH the Titan X so AMD didn't smoke them! I like AMD, but you have to be realistic in your expectations: AMD is a smaller company in quite a bit of turmoil. The chance of them putting out an end-all product with the budgets they have is very slim. It was the same with Bulldozer, which was going to be the END ALL CPU! AMD fanboys are still blaming Windows and devs for not using code that would make it faster.

This card keeps being compared to a 980 Ti, yeah, but if you look at the reviews, that's at stock clocks with no OC. An OC'd Ti would run away with it, it's not even close, while this one can't be overclocked at this point in time. But wait guys! In the future we will be able to overclock!!!!

I can't wait until people who don't read specs get these home and realize it doesn't even have HDMI 2.0; the rage will be strong with this one. It's a crippled product: low VRAM, and even if it's fast VRAM, newer games will continue to demand even more.

AMD needs to price accordingly to survive, and they are not doing that. Yes, the first run of these will sell out due to HYPE!!! But for the performance, 4GB, no HDMI 2.0, etc., it is priced way too high. If it was cheaper I would consider it for a small build, but they are dreaming.


----------



## Cool Mike

Ordered the sapphire version from Amazon. None are available yet, but thinking they will be later today.

Thinking new drivers will bring it to 980Ti levels soon.


----------



## maltamonk

Quote:


> Originally Posted by *sugalumps*
> 
> The dude got up on stage at amds press conference and flat out said we brought you the single best gpu. "You asked for the single best gpu, and we have brought you that". - That is hype.


Afaik "World's best graphics performance" is what he said; to be fair, in that regard the dual-GPU card in the Quantum does deliver that.


----------



## NoDoz

Quote:


> Originally Posted by *Mozz13*
> 
> LTT youtube review is up. https://www.youtube.com/watch?v=-CDFNOTZy8o


They gave a reviewer a DOA card


----------



## th3illusiveman

Quote:


> Originally Posted by *lajgnd*
> 
> Lol.
> 
> Haven't been a fan of AMD since the 9700 pro a decade ago.
> 
> Their products are now garbage.
> 
> Fury X, hyped as AMD's return to form, *Surprise* is a total flop.
> 
> Literally, a day late and a dollar short.
> 
> Same price as 980Ti, worse performance... Not even factoring in late driver updates for day 1 games.
> 
> At this point if you are on AMD side, well... Dunno what to really say without making some people angry.
> 
> But they don't even have an argument on price/performance. Just worse all around. If Fury X was $399 or $499 would be a different story, but it's not. Poor AMD.


Their products are not garbage by any stretch, and while the Fury launch may be disappointing to some, it's still within 10% of the GTX 980 Ti and it does have really innovative technology in it.

It's late and it's priced a little too high, but it's not a bad card at all.


----------



## Derp

I buy cards from both companies and this card just doesn't interest me at all. Mediocre performance combined with inefficiency has been the AMD way for too long on both the CPU and GPU sides.


----------



## Wishmaker

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Pretty sure you're going to be pretty disappointed when your card isn't breaking 1200MHz even with voltage. The card has 14.9m transistors per mm^2. Dense die cards never OC well (see: 290x which isn't even as dense as Fury).
> 
> Meanwhile you could buy GM200 and put it under water and get 1500+ core clocks


He had the cash, he pulled the trigger, and that's all she wrote. Ifs, buts, coconuts, no biggie. Can't wait for his feedback!


----------



## Tivan

Quote:


> Originally Posted by *tpi2007*
> 
> I don't know if this has been addressed already (haven't read the whole thread), so here it goes just in case it hasn't:
> http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/11
> 
> If I remember correctly the same thing happened to the 285.


It's kind of hard to take anything useful out of a review that uses 15.5 drivers, but agreed. I wonder if they fixed the issues with 15.15 yet or if it's still due.


----------



## forthedisplay

Quote:


> Originally Posted by *lajgnd*
> 
> Lol.
> 
> Haven't been a fan of AMD since the 9700 pro a decade ago.
> 
> Their products are now garbage.
> 
> Fury X, hyped as AMD's return to form, *Surprise* is a total flop.
> 
> Literally, a day late and a dollar short.
> 
> Same price as 980Ti, worse performance... Not even factoring in late driver updates for day 1 games.
> 
> At this point if you are on AMD side, well... Dunno what to really say without making some people angry.
> 
> But they don't even have an argument on price/performance. Just worse all around. If Fury X was $399 or $499 would be a different story, but it's not. Poor AMD.


The 4000 and 5000 series were better than anything nVidia had to offer at the time. The 7000 series was not bad either, and has certainly been a longer-lasting alternative than the 600 series ever was.

They are behind now, but the gap isn't that massive. Hopefully 2016 changes things for the better for the sake of all of us: we're getting a new process node, and the second generation of HBM will alleviate the current 4GB shortcoming.


----------



## Devnant

Well, what can I say? I'm disappointed. I was expecting more. Maybe new drivers could change the picture, but I guess I'll have to stick with team green.


----------



## Gobigorgohome

My big question is, second hand Titan X, new 980 Ti or Fury X? I am gaming at 4K, the higher settings the better. I would need two cards either way.


----------



## raghu78

It's pretty much game over for AMD. The reviews have proven that the Fury X cannot compete with the 980 Ti, and AMD is going to bleed further market share over the next 12-15 months. I doubt now if AMD can even keep their market share at 20%. Maxwell sweeps AMD across the board in a show of unbeatable performance and efficiency. It's sad that the AMD of today cannot compete with Nvidia even with a similar die size, whereas earlier they used to compete with smaller die sizes.

What's damning is that there seems to be some major design issue, as the R9 Fury X is just not able to scale performance over the R9 390X in so many cases/games. There are so many instances of only 10-15% improvement over the R9 390X. With 45% more shaders and the same clocks, that's miserable scaling. AMD really needs a clean-sheet design, as the Fury X has proven that the current architecture is not scalable. Very disappointing overall.


----------



## Vesku

Quote:


> Originally Posted by *sugalumps*
> 
> Again, once again, for those who missed my post and the press conference, apparently. The guy from AMD got up on stage and said "you wanted the single fastest card and we brought you that". They hyped.
> Ye m8 you only need two years for that to happen though.................


It actually only took about 9 months to be fairly clearly the better choice. That said, the longevity argument is less important at this point in time since the next generation of GPUs will be on a node shrink.


----------



## Tivan

Quote:


> Originally Posted by *raghu78*
> 
> Its pretty much game over for AMD. The reviews have proven that Fury X cannot compete with 980 Ti and AMD is going to bleed further marketshare over the next 12-15 months. I doubt now if AMD can even keep their marketshare at 20% . Maxwell sweeps AMD across the board in a show of unbeatable performance and efficiency. Its sad that the AMD of today cannot compete with Nvidia even with a similar die size whereas earlier they used to compete with smaller die sizes.
> 
> Whats damning is there seems to be some major design issue as R9 Fury X is just not able to scale performance over R9 390X in so many cases/ games. There are so many instances of 10-15% improvement over R9 390X. With 45% more shaders and the same clocks thats miserable scaling. AMD really need a clean sheet design as Fury X has proven that the current architecture is not scalable. Very disappointing overall


Considering Fury X CrossFire beats Titan X SLI in some benches, I wouldn't say it's done yet. Also, it's really confusing that some reviewers use the 15.5 drivers.


----------



## lajgnd

Quote:


> Originally Posted by *th3illusiveman*
> 
> Their products are not garbage by any stretch and while the Fury launch may be disappointing to some it's still within 10% of the GTX 980 Ti and it does have really innovative technology in it.
> 
> It's late and it's priced aliitle too high but it's not a bad card at all.


I'm sorry, but there's absolutely no reason to buy this.

-Same price as the competition
-Worse performance than the competition
-Worse customer support than the competition (late drivers for day-one games, which translates into even worse performance and compatibility)

The only reason to buy this is if you have some sort of bizarre hatred of NVidia, because there's no logical reason to own this product over a 980 Ti.

AMD needed to either crush NVidia on performance or on price; it did neither. There's absolutely no reason for anyone to buy this over an NVidia product.

This is worse ownage than Intel CPUs over AMD CPUs. At least AMD can try to say they have some sort of price leverage with CPUs.


----------



## Kuivamaa

Quote:


> Originally Posted by *SKYMTL*
> 
> What? BF4 had Mantle tacked on months after launch.


What does that have to do with what I said? Testing BF4 under DX11 has been nearly pointless ever since Mantle came out. It is painfully obvious that almost every game that supports Mantle gets no DX11 care from AMD at the driver level. I play both BF4 and DA:I on a daily basis (DA:I less so with TW3 around, admittedly), and DX11 numbers tell me absolutely nothing about how the Fury X performs vs my 290X.


----------



## Noufel

I wanted to go with Fury X CFX to replace my 980 G1 SLI, but after seeing all those benchmarks I can't justify spending $1300 on Fury X knowing it gets beaten by the reference 980 Ti in almost all scenarios. I'm sure it will improve with drivers (like the 7970 eventually performing like a 780), but how much time will that take, 6-8 months? By then Pascal will be out.
I've made up my mind: it will be 980 Ti G1 SLI for me. Sorry AMD, next time perhaps.


----------



## MapRef41N93W

Quote:


> Originally Posted by *ondoy*
> 
> Fury X CF vs Titan X SLI....


XDMA CrossFire has generally scaled better than SLI. Back when the 780 Ti was about 10% faster than a 290X, 290X CrossFire would often win by 5-10% vs 780 Ti SLI. However, that's not nearly enough benches from enough testing sources to outright claim it's faster straight up, stock vs stock.


----------



## provost

Quote:


> Originally Posted by *Cool Mike*
> 
> Ordered the sapphire version from Amazon. None are available yet, but thinking they will be later today.
> 
> Thinking new drivers will bring it to 980Ti levels soon.


Yep, ordering mine from Amazon too, from the link that Munney attached earlier.

If something doesn't work out, Amazon is great with their return/exchange policy.... lol


----------



## Tivan

Quote:


> Originally Posted by *ondoy*
> 
> Fury X CF vs Titan X SLI....
> 
> ...


+102% performance from the second card in Battlefield 4 (DX) at 4K, waow.
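The multi-GPU scaling figures in these charts are just the ratio of dual-card to single-card fps, minus one. A small sketch (the fps inputs are placeholders, not the review's measured values):

```python
# Multi-GPU scaling expressed as the percentage gain from the second card.
# The fps inputs below are placeholders, not the review's measured values.
def second_card_gain(single_fps: float, dual_fps: float) -> float:
    """Percentage of performance added by the second GPU."""
    return (dual_fps / single_fps - 1.0) * 100.0

print(f"{second_card_gain(40.0, 80.8):.0f}%")  # 102% with these inputs
```

Gains over 100% usually indicate run-to-run variance, or the single-card run being held back by something else (e.g. thermals or a CPU hitch), rather than genuinely super-linear scaling.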


----------



## tpi2007

Quote:


> Originally Posted by *Tivan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> I don't know if this has been addressed already (haven't read the whole thread), so here it goes just in case it hasn't:
> http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/11
> 
> If I remember correctly the same thing happened to the 285.
> 
> 
> 
> It's kind of hard to take anything useful out of a review that uses 15.5 drivers, but agreed. I wonder if they fixed the issues with 15.15 yet or if it's still due.

Scott (a.k.a. Damage in the forums) addressed that in the comments; he did use the proper drivers:


----------



## SpeedyVT

AMD was right to say the fastest 4K GPU!


----------



## Forceman

Quote:


> Originally Posted by *tpi2007*
> 
> I don't know if this has been addressed already (haven't read the whole thread), so here it goes just in case it hasn't:
> http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/11
> 
> If I remember correctly the same thing happened to the 285.


Wonder if that is because Mantle is designed/optimized for GCN 1.1 and was never updated for GCN 1.2?
Quote:


> Originally Posted by *tpi2007*
> 
> Scott addressed that in the comments, he did use the proper drivers:


And yet he says 15.5 again? Was that a typo, or does he not know that 15.15 is the Fury driver?


----------



## wholeeo

Quote:


> Originally Posted by *Blameless*
> 
> *A $10 price increase to the 980Ti.*
> Fury X has 5% more pixel fill rate than the 290X.
> Nothing wrong with VisonTek, well not compared to any other AIB really.


----------



## Casey Ryback

Quote:


> Originally Posted by *lajgnd*
> 
> I'm sorry, but there's absolutely no reason to buy this.


I'm sorry but they are selling fast.

We just don't understand why yet









Price drop - Lock it in.


----------



## criznit

Quote:


> Originally Posted by *SpeedyVT*
> 
> You mean McDonalds Flury X.


That would be a great buy if it didn't release the Fury on my stomach


----------



## Newbie2009

Disappointed.

-Late
-Underpowered
-Overpriced.

I don't see why anyone would buy this over a 980ti.


----------



## Alatar

Anand's review isn't up yet, but their benches are in the database:


----------



## Noobism

Quote:


> Originally Posted by *ondoy*
> 
> Fury X CF vs Titan X SLI....


Why is it that the Digital Storm reviews seem better than everything else lol.


----------



## Rickles

Quote:


> Originally Posted by *DEW21689*
> 
> I will still wait to see the Fury Nano (I like SFF Systems) but man I was expecting a lot more... I guess it isn't bad, just not at all what I had expected.
> 
> Correct me if I'm wrong, but didn't they make a big fuss in some video that these cards had 1.5x the performance per watt of the R9 290X? Unless it's just that early and I'm being that dumb at math, these don't seem even remotely close to 1.5x the performance per watt, which makes me highly doubt that the Nano will hit the 2x performance per watt they claimed...


The gtx 960 can claim 2x the performance per watt of the 390x so don't hold your breath waiting for the nano.

Quote:


> Originally Posted by *Blameless*
> 
> Yes, this is what you'd expect to see by blocking the VRM with a plate. I could achieve a similar effect by setting a sheet of cardboard on the back of the GPU, or just holding my hand in front of my IR camera.
> 
> The VRM itself is probably running warmer with the plate, but since the plate is not in direct contact with the VRM, the plate is considerably cooler.


Very unfortunate that it is solely a heat (temperature?) shield; perhaps they will never unlock voltage if those things are pushing 105C already.


----------



## Aaron_Henderson

Seems to me they are pushing the components on these cards too hard; VRM temps are way out of line. There is a reason they used water cooling, and it doesn't seem to be anything other than that these cards literally need it to keep from burning up, in their current iteration at least. Truly disappointing... not that I expected any different... but still. If the performance was higher, maybe the design could be overlooked. But these cards with the closed-loop cooling just seem ridiculous, not revolutionary.

HBM was the only really interesting thing about this card, and unfortunately it doesn't help the performance as much as some would have hoped. I was hoping these cards might force Nvidia to respond with something other than a chuckle at how they really can't be touched.

AMD should have designed these cards around air cooling, and left the HBM alone if that is what is driving the price to what it is. Had AMD released these cards at around the $500 mark, with the same performance, only air cooled without HBM, no one would care what size the PCB was, and they could charge a premium for the Nano. They really needed to match/beat the 980 Ti with a more attractive price point if they really wanted people to consider these oddball GPUs.


----------



## fcman

"While AMD's Fiji GPU looks good on paper in most aspects, its current implementation in the AMD Radeon Fury X leaves a lot to be desired. AMD's GPU program for the first time has truly reminded us of its CPU program."
-HardOCP

That's just mean.


----------



## Rookie1337

Quote:


> Originally Posted by *lajgnd*
> 
> Lol.
> 
> Haven't been a fan of AMD since the 9700 pro a decade ago.
> 
> Their products are now garbage.
> 
> Fury X, hyped as AMD's return to form, *Surprise* is a total flop.
> 
> Literally, a day late and a dollar short.
> 
> Same price as 980Ti, worse performance... Not even factoring in late driver updates for day 1 games.
> 
> At this point if you are on AMD side, well... Dunno what to really say without making some people angry.
> 
> But they don't even have an argument on price/performance. Just worse all around. If Fury X was $399 or $499 would be a different story, but it's not. Poor AMD.


? What? It's within 980 Ti levels, either above it or below it by less than 5fps, in most of the reviews I saw once you get past 1440p. The 1440p-and-below performance, though, was pretty strange to see. But you, my sir, are trolling, or only looking at the charts you want to see to form your opinion. I do have to wonder why the sub-4K performance isn't competitive while its 4K performance is.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *raghu78*
> 
> Its pretty much game over for AMD. The reviews have proven that Fury X cannot compete with 980 Ti and AMD is going to bleed further marketshare over the next 12-15 months. I doubt now if AMD can even keep their marketshare at 20% . Maxwell sweeps AMD across the board in a show of unbeatable performance and efficiency. Its sad that the AMD of today cannot compete with Nvidia even with a similar die size whereas earlier they used to compete with smaller die sizes.
> 
> Whats damning is there seems to be some major design issue as R9 Fury X is just not able to scale performance over R9 390X in so many cases/ games. There are so many instances of 10-15% improvement over R9 390X. With 45% more shaders and the same clocks thats miserable scaling. AMD really need a clean sheet design as Fury X has proven that the current architecture is not scalable. Very disappointing overall


Agreed.

Especially when you throw in some other picky issues, like only using HDMI 1.4a and not 2.0... 104C temps on the memory on what many were saying was going to be a "full cover" water block (maybe that is why AMD made it so you CAN'T overclock the memory, or at least part of the reason why).

Don't get me wrong, it's a good card, about on par with the same-priced GTX 980 Ti, but it was SUPPOSED to be better... "A Titan Killer", and it fails.

Couple that with the still horrible mess that is FreeSync, and I see no good reason for anyone from nVidia to "switch teams" and get a Fury X over a GTX 980Ti.


----------



## Horsemama1956

Another thing I forgot to add: why did AMD give up on boost? Personally I hate it, but it seems to explain the difference in these reviews, and in reviews in recent years in general. In "stock" situations the nVidia cards are getting 100+ MHz increases in clock speed, which is obviously going to show in benchmarks.


----------



## raghu78

Quote:


> Originally Posted by *SpeedyVT*
> 
> AMD was right to say the fastest 4K GPU!


Not really. It's barely matching the 980 Ti at stock. Custom GTX 980 Ti cards will thrash it by a good 10-15% margin.

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/#diagramm-watch-dogs-3840-2160_2
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives-22.html
http://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html


----------



## sugalumps

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Seems to me they are pushing the components on these cards too hard, VRM temps are way out of line...*there is a reason they used water cooling...and it doesn't seem to be for any other reason other than these cards literally need it to keep from burning up, in their current iteration, at least.* Truly disappointing...not that I expected any different...but still. If the performance was higher, maybe the design could be overlooked. But these cards with the closed loop cooling just seem ridiculous, not revolutionary. HBM was the only really interesting thing about this card, and unfortunately, doesn't really help the performance as much as some would have hoped. I was hoping these cards might force Nvidia to respond with something other than a chuckle at how they really can't be touched. AMD should have designed these cards around air cooling, and left the HBM alone if that is what is driving the price to what it is. Had AMD released these cards at around the $500 mark, with the same performance, only air cooled without HBM...no one would care what size the PCB was, and they could charge a premium for the Nano. They really needed to match/beat the 980 Ti with a more attractive price point if they really wanted people to consider these oddball GPUs.


Think how bad the air-cooled Fury is going to be......................... Maybe that's why they delayed it until after the Fury X.

Quote:


> Originally Posted by *fcman*
> 
> "While AMD's Fiji GPU looks good on paper in most aspects, its current implementation in the AMD Radeon Fury X leaves a lot to be desired. AMD's GPU program for the first time has truly reminded us of its CPU program."
> -HardOCP
> 
> That's just mean.


Ouch


----------



## tpi2007

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> I don't know if this has been addressed already (haven't read the whole thread), so here it goes just in case it hasn't:
> http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/11
> 
> If I remember correctly the same thing happened to the 285.
> 
> 
> 
> Wonder if that is because Mantle is designed/optimized for GCN 1.1 and was never updated for GCN 1.2?
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Scott addressed that in the comments, he did use the proper drivers:
> 
> 
> 
> 
> And yet he says 15.5 again? Was that a typo, or does he not know that 15.15 is the Fury driver.

Mantle was just to get things going for DX12 and Vulkan, which should accommodate architecture changes better. Or maybe not; maybe AMD just had some driver work to do for GCN 1.2 to perform on par in that game, and they didn't bother, seeing as Mantle was only ever a ramp platform for the other mainstream APIs.

I would say that Scott must have checked with AMD about what drivers should be used. In any case, does the normal 15.5 (assuming there are two versions of it) even work with the Fury?


----------



## Sleazybigfoot

Quote:


> Originally Posted by *Blameless*
> 
> Yes, this is what you'd expect to see by blocking the VRM with a plate. I could achieve a similar effect by setting a sheet of cardboard on the back of the GPU, or just holding my hand in front of my IR camera.
> 
> The VRM itself is probably running warmer with the plate, but since the plate is not in direct contact with the VRM, the plate is considerably cooler.


Yeah, I realized that shortly after posting it. I thought there was thermal tape (whatever it's called) in between, but it's just insulated by air as far as I can tell.


----------



## lajgnd

Quote:


> Originally Posted by *Casey Ryback*
> 
> I'm sorry but they are selling fast.
> 
> We just don't understand why yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Price drop - Lock it in.


No doubt because of insane fanboys buying the limited stock of, say, 5 cards per retailer.

Because really, any sane person is going to look at the hard facts and benchmarks and see this is a dud.

Worse performance, same price, less RAM. LOL.

Everyone can just laugh at Fury X owners now. It's not even about the card itself at this point, it's just a reflection on them personally... Overpriced, underpowered, late to the party. Good lord, this is a disaster.

No doubt the reviews aren't even more blunt and scathing because sites are afraid AMD will cut them all off. They're so eloquent and gentle in saying it's a total flop of a card.

Some even go as far as to say "but it comes with water cooling!" LOL. The damn thing comes with water cooling stock because they had to push the clocks too high for air, otherwise it would have been owned even WORSE.


----------



## Clocknut

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Because the Titan X released in March while Fury came out in June? Because the Titan X doesn't even need any of that to just be outright faster than Fury X along with having triple the VRAM. Fury isn't even faster than a 980ti and you expect NVIDIA to cut their margins on a Titan for what exactly?


For shrinking AMD's market share by another half. Pretty simple.

Besides, if they did a closed-loop, 300W TDP, 1.2-1.3GHz version @ $999 and allowed aftermarket cards, it would sell a whole lot more than now. So the margin wouldn't be affected, since they'd sell more.

If I were an Nvidia shareholder, I would not be happy with Nvidia deliberately limiting the Titan X to a 250W TDP. It just makes no sense. Why is it taboo to go over 250W? AMD did it. All Nvidia needs to do is the same and kill AMD's market share further. Intel held 95% share with no antitrust regulators knocking at their door, so why should Nvidia stop @ 70%?


----------



## Kane2207

Quote:


> Originally Posted by *Casey Ryback*
> 
> I'm sorry but they are selling fast.
> 
> We just don't understand why yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Price drop - Lock it in.


How do we know they're selling fast? Is it based on them being out of stock? Surely that could also quite easily be attributed to extremely limited supply as per the rumours?


----------



## ondoy

AMD claimed it's the fastest GPU... where is it now?


----------



## BinaryDemon

Quote:


> Originally Posted by *Darkwizzie*
> 
> I would love a Mcdonalds Fury X.


I bet it would come with the best bundle.


----------



## sugalumps

Quote:


> Originally Posted by *ondoy*
> 
> amd claimed it's the fastest GPU.... where is it now ?


"Something something dx12, blah blah future wait just a year like free sync".


----------



## pompss

The Fury is on water and the GTX 980 Ti is on air, and still the 980 Ti is faster.
If they do a review with some GTX 980 Tis ON WATER, it's game over for AMD.
I don't understand how people can buy it.
Huge flop for me.


----------



## airisom2

I have mixed feelings about this card. I don't know if I was expecting too much or if this is actually a good card whose show was stolen by the 980Ti. I'm also pretty disappointed by how much AMD hyped this card. While they have taken great strides when it comes to innovation, it just doesn't seem as substantial when you read the benchmarks. The average performance seems to be between the 980 and the 980Ti, and for $100 less (more for non-reference variants), that's where it should be. Seeing as this card just released, I'm sure that future driver updates will help the card out. On the other hand, the 980 Ti can overclock extremely well and the performance increases you get when overclocking along with the other NV specific features may justify the cost over Fury X for some folks. To me, NV Pascal is looking better and better.

And about the 104C VRMs, I'm pretty sure that card wasn't screwed together well . Here are the shots from hardware.fr:



And Guru3D's shot


As G3D didn't remove the backplate, focus on the 8-pin connector area. On Hardware.fr's thermal image shot, they're hot, and on G3D's, they're far from it. There is no reason for VRMs that are actively cooled by an AIO unit with a copper tube going over them to get 100C+ under load unless there is little to no contact.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Clocknut*
> 
> for shrinking AMD market share another half. Pretty simple.
> 
> 
> 
> 
> 
> 
> 
> besides, if they do the close loop 300w tdp 1.2-1.3Ghz @ $999 & allow after market, it would sell a whole lot more than now. So the margin wouldnt be affecting since they sell more.
> 
> If I were Nvidia share holder, I will not be happy Nvidia deliberately limiting Titan X to 250w tdp only. It just make no sense. Why is it a taboo to go over 250w? AMD did it. All Nvidia need is do the same and kill AMD's market share further. Intel held 95% share no anti trust knocking their door, why Nvidia should stop @ 70%?


If you want to go over a 275w TDP you simply buy the superclocked/hybrid versions or put a custom BIOS on yourself. If you take a peek at the Titan X owners club you'll notice something like 15 custom BIOSes right in the OP that offer you basically whatever you want with a Titan X (except going over 1.278v, since the card is hard locked there and needs an actual mod to go over).

Titan's not allowing AIB versions is because NVIDIA controls them. AIBs are basically just distributors for the Titan series (though NVIDIA did allow cooler modifications this time at least).


----------



## zealord

I really had a lot of hope for the Fury X. AMD gambled with HBM, but they should've stuck with GDDR5 for the 28nm generation and released this card with 8GB of GDDR5 earlier this year, before the Titan X.

This is sadly no 649$/699€ card even if it comes with HBM and a hybrid cooling system. Overclocking (at least from the reviews I read so far) looks pretty bad.

The Fury X in my eyes should've been a 499$ air-cooled 8GB GDDR5 card, because in its current state the performance is alright, but not at 649$. AMD just can't release a card that is worse than a 980 Ti and ask the same price for it.

Also it looks like HBM is not helping much at all. Even at 4K the Titan X / 980 Ti are slightly faster. I mean HBM is better, but it isn't worth it from a gamer's perspective currently.

I am so disappointed in the Fury X that I would rather buy a 980 Ti with an 800$ G-Sync monitor instead of a Fury X and a 650$ FreeSync monitor. And that says a lot, since I value money very highly.

So what now until 14/16nm HBM2 cards come out? More than a year's wait for anything that can top the Titan X / 980 Ti / Fury X?

Well, at least I don't have to buy anything that came out lately. 28nm GPUs are too expensive, and good 1440p monitors as well. No reason for me to upgrade for 1500$ if we get downgrades like Witcher 3 and ports like Batman Arkham Knight.


----------



## Alatar

Quote:


> Originally Posted by *Clocknut*
> 
> for shrinking AMD market share another half. Pretty simple.
> 
> 
> 
> 
> 
> 
> 
> besides, if they do the close loop 300w tdp 1.2-1.3Ghz @ $999 & allow after market, it would sell a whole lot more than now. So the margin wouldnt be affecting since they sell more.
> 
> If I were Nvidia share holder, I will not be happy Nvidia deliberately limiting Titan X to 250w tdp only. It just make no sense. Why is it a taboo to go over 250w? AMD did it. All Nvidia need is do the same and kill AMD's market share further. Intel held 95% share no anti trust knocking their door, why Nvidia should stop @ 70%?


Nvidia probably wants to sell their reference cards to OEMs and lets the AIBs go over the TDP for the retail market.

Staying well within pci-e spec is probably important to some OEMs who care mostly about reliability, return rates, etc.


----------



## kingduqc

The [H] review was so hard on the card that it got me a bit scared for AMD's future.
Quote:


> AMD's GPU program for the first time has truly reminded us of its CPU program.


Kinda brutal to see that even watercooled their card is slower and doesn't OC much. The worst part is that the whole 300 series is rebranded and costs more than the older cards, while the Maxwell lineup is full Maxwell through and through.


----------



## Exilon

Quote:


> Originally Posted by *Horsemama1956*
> 
> Another thing I forgot to add: why did AMD give up on Boost? Personally I hate it, but it seems like the difference in these reviews and in reviews in recent years in general. In "stock" situations the nVidia cards are getting a 100+ MHz increase in clock speeds, which is obviously going to show in benchmarks.


AMD didn't give up on boost... remember the 290X throttling? AMD states maximum boost clock. Nvidia states minimum boost clock.


----------



## Forceman

Quote:


> Originally Posted by *sugalumps*
> 
> Think how bad the air cooled fury is going to be......................... Maybe that's why they have delayed it after the fury x.


Well, if that little cool pipe isn't making good contact with the VRMs that could explain the high temps in a way that wouldn't affect the air cooled cards. There is no airflow at all in there, so bad contact with the water pipe would be a sort-of-disaster. Hopefully the larger heatsink/cold-plate of the air cooled cards would keep the VRMs cooler.


----------



## Str8Klownin

They might wanna release some info about that R9 Nano quick, fast, and in a hurry. I still think that's the real gem in this batch.


----------



## Aonex

Well I sure hope the Nano will be priced accordingly in light of these results. Hopefully drivers will be a little more mature by the time it arrives.


----------



## SKYMTL

Quote:


> Originally Posted by *Kuivamaa*
> 
> What does that have anything to do with what I say. Testing BF4 under DX11 has been nearly pointless ever since mantle came out.It is painfully obvious that almost every game that supports mantle gets no DX11 care from AMD on a drivers level. I play both BF4 and DA:I on a daily basis (DA:I less so with TW3 around admittedly) and DX11 numbers tell me absolutely nothing on how Fury X performs vs my 290X.


Tell me again that DX11 is pointless when DX11 features BETTER performance in BF4 than Mantle....we experienced that and so did TechReport.


----------



## provost

Quote:


> Originally Posted by *zealord*
> 
> I really had a lot of hope for the Fury X. AMD gambled with HBM, but they should've sticked with GDDR5 for the 28nm generation and had released this card earlier this year before the Titan X with 8GB GDDR5.
> 
> This is sadly no 649$/699€ card even if it comes with HBM and a Hybrid cooling System. Overclocking (atleast from the reviews I read so far) looks pretty bad.
> 
> The Fury X in my eyes should've been a 499$ air-cooled 8GB GDDR5 card, because at its current state the performance is alright, but not at 649$. AMD just can't release a card that is worse than a 980 Ti and ask the same price for it.
> 
> Also it looks like HBM is not helping much at all. Even at 4K the Titan X / 980 Ti are slightly faster. I mean HBM is better, but it isn't worth it from a gamers perspective currently.
> 
> I am so disappointed in the Fury X that I would rather buy a 980 Ti with 800$ G-Sync monitor instead of a Fury X and a 650$ FreeSync monitor. And that says a lot since I value money very highly.
> 
> So what now until 14/16nm HBM2 cards come out? More than one year wait for anything that can top Titan X / 980 Ti / Fury X?
> 
> Well atleast I don't have to buy anything that came out lately. 28nm GPUs are too expensive and good 1440p monitors aswell. No reason for me to upgrade for 1500$ if we get downgrades like Witcher 3 and ports like Batman Arkham Knight.


Well, after reading Raghu's comments, there may not be any hope of a voltage unlock or driver scaling for the Fury X... may be time to take my finger off that "protest purchase" I was about to make... lol
No positives? Come on AMD... no, really... what happened to the "overclocker's dream" and all that?


----------



## geoxile

Well, at least AMD is good at dropping prices in a hurry. I expect to see these at $600 at least in a few months unless their marketing actually worked.


----------



## th3illusiveman

Quote:


> Originally Posted by *lajgnd*
> 
> I'm sorry, but there's absolutely no reason to buy this.
> 
> -Same price than competition
> -Worse performance than competition
> -Worse customer support than competition (drivers on release date for new games, which translates into even worse performance and compatibility)
> 
> The only reason to buy this is if you have some sort of bizarre hatred of NVidia, because there's no logical reason to own this product over a 980Ti.
> 
> AMD needed to either crush NVidia on performance or on price, it did neither. There's absolutely no reason for anyone to buy this over an NVidia product.
> 
> This is worse ownage than intel cpus over AMD cpus. At least AMD can try to say they have some sort of price leverage with CPUs.


A little dramatic, don't you think?

So in response to your bullet points:

- It is the same price as the competition, but remember that the 980 Ti reference blower card costs 650; any custom cards will have a premium attached to them, while the reference Fury comes with a water cooler, which means lower temps and quieter operation while still exhausting all the hot air out of the case.

- It has worse performance at lower resolutions but quickly catches up and sometimes exceeds the 980 Ti at 4K, which is what this card was marketed for. 2 of them will perform better than 2 980 Tis in SLI, and even Titan Xs in SLI (which cost $700 more).

- That is subjective. Nvidia might claim to have "game ready drivers" for a lot of new releases, but visit their forums when those drivers launch and you will see nothing but problems and complaints. At this point it's more of a marketing tactic than an actual tangible benefit.

There are a few reasons to buy a Fury X: if you want a very fast card for 4K gaming that runs cool and quiet, it is a worthy candidate, and if you are buying 2 cards to get high FPS at 4K it's even better. You really shouldn't need a Fury X at 1080p.

It's a rocky launch no doubt about it but the hardware itself is solid.


----------



## Alatar

I think the Fiji pro version (R9 Fury) is going to be the best priced one.

Hopefully non ref designs are allowed and only a couple of CUs are cut while the price drops close to $500. That'd make it a pretty good deal and would force the 980 to drop lower.

I don't get why everyone thinks the nano will be priced really well. I see it as a niche card that gets the best binned PRO GPUs or something. I'd expect to see it priced higher or on par with the normal Fury.


----------



## Clocknut

Quote:


> Originally Posted by *Alatar*
> 
> Nvidia probably wants to sell their reference cards to OEMs and lets the AIBs go over the TDP for the retail market.
> 
> Staying well within pci-e spec is probably important to some OEMs who care mostly about reliability, return rates, etc.


AFAIK AIB versions of the Titan X are an uncommon thing, unlike the 980 Ti. Seriously, Nvidia should open up and let the AIB makers build a super factory-clocked 1.2-1.3GHz Titan X.


----------



## MapRef41N93W

Quote:


> Originally Posted by *th3illusiveman*
> 
> Alittle dramatic don't you think?
> 
> So in response to your bullet points:
> 
> - It is the same price as the competition but remember that the 980 Ti reference blower card costs 650, any custom cards will have a premium attached to them and while the reference Fury comes with a water cooler which results in lower temps and quieter operation while still exhausting all the hot air out of the case.
> 
> - It has worse performance at lower resolutions but quickly catches up and sometimes exceeds the 980 Ti at 4K resolution which is what this card was marketed for. *2 of them will perform better then 2 980 Ti's in SLi and even Titan X's in Sli (which cost $700 more).*
> 
> - That is subjective. Nvidia might claim to have "game ready drivers" for alot of new releases but visit their forums when those drivers launch and you will see nothing but problems and complaints. At this point it's more of a marketing tactic then an actual tangible benefit.
> 
> There are afew reasons to buy a Fury X, if you want a very fast card for 4K gaming that runs cool and quiet then it is a worthy candidate if you are buying 2 cards to get high FPS at 4K it's even better.You really shouldn't need a Fury X at 1080p.
> 
> It's a rocky launch no doubt about it but the hardware itself is solid.


Way to base such a broad statement on a video of 3 benchmarks from a single source.
Quote:


> Originally Posted by *Clocknut*
> 
> AFAIK AIB for Titan X is a uncommon thing, unlike 980Ti. Seriously Nvidia should opening up let the AIB makers make a super factory clock 1.2-1.3Ghz titan X.


The EVGA superclocked version already boosts to 1300+ right out of the box.


----------



## ondoy

The performance isn't really that bad, it's just a few fps off the 980 Ti....
just that it's not the fastest GPU as AMD claimed...


----------



## HeadlessKnight

Based on those reviews, this card is one of the most disappointing AMD cards of the last decade. I think this card takes second place among the worst cards AMD has made, after the HD 2900 XT. I just hope its performance is caused by immature drivers and it's only a matter of time before we see substantial performance gains.


----------



## 47 Knucklehead

Honestly, I think HardOCP said it best ...
Quote:


> The new AMD Fiji GPU and Fury X video card looks awesome on paper, but has underwhelmed and disappointed us when it comes to real world gameplay. The AMD Radeon R9 Fury X feels like a proof of concept for HBM technology.
> 
> ...
> 
> There is a definite pattern that leads to one video card being the best value for the money, and it is GeForce GTX 980 Ti, not the AMD Radeon R9 Fury X.
> 
> Limited VRAM for a flagship $649 video card, sub-par gaming performance for the price, and limited display support options with no HDMI 2.0 and no DVI port. To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for?


----------



## sugalumps

Quote:


> Originally Posted by *Alatar*
> 
> I think the Fiji pro version (R9 Fury) is going to be the best priced one.
> 
> Hopefully non ref designs are allowed and only a couple of CUs are cut while the price drops close to $500. That'd make it a pretty good deal and would force the 980 to drop lower.
> 
> I don't get why everyone thinks the nano will be priced really well. I see it as a niche card that gets the best binned PRO GPUs or something. I'd expect to see it priced higher or on par with the normal Fury.


Yeah, most likely. Just like the 7950 and the 290, those were the best price-to-performance cards of their generations on both sides.


----------



## p4inkill3r

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Based on those reviews, this card is one of the most disappointing AMD cards of the last decade. I think this card takes the second place as one of the worst cards AMD made after the HD 2900 XT. I just hope it's performance is caused by immature drivers and it's only a matter of time before we see substantial performance gains.


Hottest take I've ever seen.

The GPU trades blows with Nvidia's flagship, yet somehow it is *A SUPREME DISAPPOINTMENT*.

Step away from the keyboard.


----------



## rt123

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Based on those reviews, this card is one of the most disappointing AMD cards of the last decade. I think this card takes the second place as one of the worst cards AMD made after the HD 2900 XT. I just hope it's performance is caused by immature drivers and it's only a matter of time before we see substantial performance gains.


LMAO.

Overstating much.


----------



## Tivan

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Way to base such a broad statement on a video of 3 benchmarks from a single source.


I'd just like to note that the Fury X is outperforming the 980 Ti in at least some games in every review that uses the 15.15 drivers. If they do some work on the current drivers, they might actually pull a convincing performance lead, but yeah, time to wait. = D


----------



## Exilon

Quote:


> Originally Posted by *Alatar*
> 
> I think the Fiji pro version (R9 Fury) is going to be the best priced one.
> 
> Hopefully non ref designs are allowed and only a couple of CUs are cut while the price drops close to $500. That'd make it a pretty good deal and would force the 980 to drop lower.
> 
> I don't get why everyone thinks the nano will be priced really well. I see it as a niche card that gets the best binned PRO GPUs or something. I'd expect to see it priced higher or on par with the normal Fury.


Fury Pro is going to be running at higher temperatures. As a result, expect lower clocks (transistors get slower as they heat up) at the same voltage and higher % of TDP consumed by leakage (transistors get leakier as they heat up). My GTX 780 goes from 80% TDP to 90% TDP (>30W) just while waiting for it to heat up from 40C (idle) to 80C (steady state load).

All of this really makes me want to get the MSI 980Ti and strap a Kraken G10 to it.
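The leakage observation above can be sketched with a toy model. Leakage current grows roughly exponentially with die temperature, so a simple fit is possible; note the 170 W "dynamic power" split and the exponential form are illustrative assumptions, and the only real data point here is the ~80% → ~90% TDP swing between 40C and 80C mentioned in the post:

```python
import math

# Rough illustrative numbers. The only observation taken from the post is
# the 40C -> 80C, ~80% -> ~90% of TDP swing on a GTX 780; everything else
# (the dynamic/leakage split, the exponential model) is assumed.
TDP = 250.0          # GTX 780 board power limit in watts
P_COLD = 0.80 * TDP  # draw right after load starts, die still at ~40C
P_HOT = 0.90 * TDP   # draw at ~80C steady state

def leakage_growth_per_degree(p_cold, p_hot, t_cold=40.0, t_hot=80.0,
                              dynamic_power=170.0):
    """Fit an exponential leakage model P_leak(T) = P0 * exp(k * (T - T0)).

    dynamic_power is the assumed temperature-independent switching power;
    whatever is left over at each temperature is treated as leakage.
    Returns the fitted growth constant k (per degree C).
    """
    leak_cold = p_cold - dynamic_power
    leak_hot = p_hot - dynamic_power
    # Solve leak_hot = leak_cold * exp(k * (t_hot - t_cold)) for k
    return math.log(leak_hot / leak_cold) / (t_hot - t_cold)

k = leakage_growth_per_degree(P_COLD, P_HOT)
print(f"leakage grows ~{(math.exp(k) - 1) * 100:.1f}% per degree C")
```

Under these assumed numbers the fit says leakage grows about 1.5% per degree, which is why a card held at 50-60C by an AIO keeps meaningfully more of its power budget for actual work than the same silicon at 80C+.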


----------



## CasualCat

Quote:


> Originally Posted by *ondoy*
> 
> Fury X CF vs Titan X SLI....


Do they run Heaven with Tessellation off? Appeared that way in the video at least.


----------



## hamzta09

So.. flop?


----------



## sugalumps

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Honestly, I think HardOCP said it best ...


*" To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for?"*

AMD apologists: "but but we are supporting the little guy". Yeah, good on you, you be the hero while others enjoy the better performance/card.


----------



## th3illusiveman

Quote:


> Originally Posted by *kingduqc*
> 
> [H] review was so hard on the card and it got me scared a bit for AMD'S future.
> Kinda brutal to see even watercooled their card is slower and does not OC much. Worst part of that is the whole 300 series are rebranded and cost more then the older cards while maxwell lineup is full maxwell through and through.


They are in a very tight spot right now, but come on. Would a $650 card really help them out? Very few people spend that much on GPUs, and even if the Fury were a home run it's not like AMD would suddenly start printing money. It's sad they can't claim to have the fastest single GPU, but it's not make or break for them. At the lower end they have competitive cards which offer great performance, which is what sells the most.

They still have the Fury non-X, which should be faster than the GTX 980 while priced $50 higher. I think that card will help them out when it drops.
Quote:


> Originally Posted by *hamzta09*
> 
> So.. flop?


Nah, people are just being over-dramatic because the card failed to live up to rumors. It's priced the same as a 980 Ti while offering slightly lower performance, yet people are making it out to be the end of the world. Nothing a little price drop can't fix.


----------



## lajgnd

Quote:


> Originally Posted by *th3illusiveman*
> 
> Alittle dramatic don't you think?
> 
> So in response to your bullet points:
> 
> - It is the same price as the competition but remember that the 980 Ti reference blower card costs 650, any custom cards will have a premium attached to them and while the reference Fury comes with a water cooler which results in lower temps and quieter operation while still exhausting all the hot air out of the case.
> 
> - It has worse performance at lower resolutions but quickly catches up and sometimes exceeds the 980 Ti at 4K resolution which is what this card was marketed for. 2 of them will perform better then 2 980 Ti's in SLi and even Titan X's in Sli (which cost $700 more).
> 
> - That is subjective. Nvidia might claim to have "game ready drivers" for alot of new releases but visit their forums when those drivers launch and you will see nothing but problems and complaints. At this point it's more of a marketing tactic then an actual tangible benefit.
> 
> There are afew reasons to buy a Fury X, if you want a very fast card for 4K gaming that runs cool and quiet then it is a worthy candidate if you are buying 2 cards to get high FPS at 4K it's even better.You really shouldn't need a Fury X at 1080p.
> 
> It's a rocky launch no doubt about it but the hardware itself is solid.


In response to your damage control spin

-It's the same price as competition and lower performance. My Titan X never gets loud enough for me to hear, the thing idles silently and when I game I am distracted by game sound. I don't stick my head up to my case to focus on card noise. It's not loud enough to be noticeable.

-It "sometimes" exceeds the 980Ti at 4K. But let's not forget that for pretty much every game released in the past year, they're both pretty close to unplayable at 4K, so who cares if it pulls ahead by a couple percentage points when they're both still crap...

-As far as SLI is concerned: AMD can barely produce drivers competent enough for single GPU (and when they do, they're LATE). Crossfire? Nope, not even going to bother.

-Runs cool? Yeah, it runs so cool it needs watercooling to operate at stock.

-NVidia drivers aren't subjective. Game Ready drivers ALWAYS come out before the release of a major game, and have way fewer problems than AMD's. Sure there are problems, but to even make the argument that "nvidia has problems too!!!" is absurd and grasping at straws. This is a fact, not an opinion.

-Believe it or not, even my Titan X isn't sufficient for EVERYTHING at 1080p. Something like GTAV doesn't hold 60 FPS maxed out at 1080p. Witcher 3 has occasional drops (not frequent though), and can't even manage a full 1080p 60 fps with Hairworks. How about those that want more than 60 FPS? "1080p resolution not mattering" is a BS lame excuse.

Card is a total failure, AMD is a failure. End of story really. Sad but true.


----------



## Clocknut

Quote:


> Originally Posted by *MapRef41N93W*
> 
> The EVGA superclocked version already boosts to 1300+ right out of the box.


I was talking about 1300 before the boost. Pretty sure those Maxwells can do that without problem if they were given another 50w of TDP. That should make enough room to bury the Fury X.
Quote:


> Originally Posted by *HeadlessKnight*
> 
> Based on those reviews, this card is one of the most disappointing AMD cards of the last decade. I think this card takes the second place as one of the worst cards AMD made after the HD 2900 XT. I just hope it's performance is caused by immature drivers and it's only a matter of time before we see substantial performance gains.


No need already.

The message is pretty clear to me. AMD was once at the very top with the 4870/5870/6970, and they ran the whole thing into the ground. Add to that no WHQL driver for 7 months, no VSR for old cards, no fps cap, no dynamic/adaptive vsync. It is about time I dump my 7790, swap it for a 750 Ti or just buy a used big Kepler, and go back to Nvidia like I once was (GTX 570 / 9800 GT).


----------



## Rickles

Quote:


> Originally Posted by *airisom2*
> 
> I have mixed feelings about this card. I don't know if I was expecting too much or if this is actually a good card whose show was stolen by the 980Ti. I'm also pretty disappointed by how much AMD hyped this card. While they have taken great strides when it comes to innovation, it just doesn't seem as substantial when you read the benchmarks. The average performance seems to be between the 980 and the 980Ti, and for $100 less (more for non-reference variants), that's where it should be. Seeing as this card just released, I'm sure that future driver updates will help the card out. On the other hand, the 980 Ti can overclock extremely well and the performance increases you get when overclocking along with the other NV specific features may justify the cost over Fury X for some folks. To me, NV Pascal is looking better and better.
> 
> And about the 104C VRMs, I'm pretty sure that card wasn't screwed together well . Here are the shots from hardware.fr:
> 
> 
> 
> And Guru3D's shot
> 
> 
> As G3D didn't remove the backplate, focus on the 8-pin connector area. On Hardware.fr's thermal image shot, they're hot, and on G3D's, they're far from it. There is no reason for VRMs that are actively cooled by an AIO unit with a copper tube going over them go get 100C+ on load unless there is little to no contact.


Curious to say the least, but do note that Guru3D only says they had the GPU at 100% for an hour and a half; I don't know if that necessarily means they were stressing the memory.

Also note that Tom's has a higher reading in what typically wouldn't be a hot spot










I think what we are seeing is that the backplate is actually doing nothing other than holding the heat in place, and it bleeds out near the PCI slot and power connectors. I'd imagine that French review site is spot on with the temps of the VRMs; these cards are literally going to cook themselves into blackscreens and it'll be the 7970 RMA carousel all over again.


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> *" To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for?"*
> 
> Amd apoligists, "but but we are supporting the little guy". Ye good on you, you be the hero while others enjoy better performance/card.


Why are you mad at the people supporting the little guy, when in theory it only helps you as a consumer? lol

Tall poppy syndrome?


----------



## Cakewalk_S

Quote:


> Originally Posted by *AlvaJonathan*
> 
> Hi,
> Some Indonesian Fury X review and a bit of overclocking test here.
> 
> But for more important stuff, some said that Fury X is missing voltage control. With really minimum knowledge of the VRM Controller, I'm trying to guess that the controller is similar to what we found on R9 290X reference, the IOR IR3567B.
> I'm using MSI Afterburner to probe the controller and doing some i2c dump for it.
> 
> i2c dump
> 
> 
> 
> 
> probing the i2c bus 6, device 30, register 92 give me this:
> 
> 
> Then, I tried adding some settings in the MSI Afterburner hardware profile (like the one I used to have in Afterburner for the R9 290X)
> 
> 
> And there's the voltage option:
> 
> 
> But at this point it didn't change anything yet; it actually makes things worse, because when you try to set the voltage the GPU behaves 'strangely', and sometimes gets 'stuck' in the power saving pState (300MHz GPU)
> 
> 
> 
> 
> 
> 
> 
> Maybe my setting is wrong and I need to use the 'third party' voltage control support in recent Afterburner builds.
> 
> I also cannot measure the volt on the card as I'm forbidden to open it.
> 
> Hopefully soon someone can confirm if the R9 Fury X is indeed using the same controller as the R9 290X, and we can program the voltage using tools like Afterburner in the future.











This is awesome. I will have to try this on my 970 strix!!!! Mega REP
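For anyone who wants to poke at a dump like the one quoted above outside of Afterburner, here's a minimal sketch that parses a textual i2c register dump into a register → value map. The dump layout (row address before a colon, hex bytes after) and the toy row below are assumptions for illustration only; the real IR3567B register map is not public:

```python
# Minimal sketch for making sense of an i2c register dump like the one in
# the quoted post. The row format and the sample bytes are assumptions;
# only the probed location (bus 6, device 0x30, register 0x92) comes from
# the post itself.

def parse_i2c_dump(text):
    """Turn a hex dump ('90: 1a 05 ...' per line) into {register: value}."""
    regs = {}
    for line in text.strip().splitlines():
        addr_part, _, byte_part = line.partition(":")
        base = int(addr_part, 16)
        for offset, token in enumerate(byte_part.split()):
            regs[base + offset] = int(token, 16)
    return regs

# Toy example row covering registers 0x90-0x93 (made-up values)
dump = "90: 1a 05 7b ff"

regs = parse_i2c_dump(dump)
print(hex(regs[0x92]))  # the register the post probed on bus 6, device 0x30
```

On a Linux box with i2c access the same lookup could be done live with something like smbus2's `read_byte_data`, but writing to an unknown VRM controller register is exactly the kind of thing that wedges the card in a 300MHz pState, as the post found.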


----------



## ondoy

Club3D Announces its Radeon R9 Fury X Graphics Card
PowerColor Announces its Radeon R9 Fury X Graphics Card
VisionTek Announces Radeon R9 Fury X, Alongside R9 300 and R7 300 Series


----------



## Dyius

Assuming my order doesn't get cancelled, I picked one up for less than 500 dollars. I feel like the price/performance is more justifiable there.


----------



## sugalumps

Quote:


> Originally Posted by *Casey Ryback*
> 
> Why are you mad at the people supporting the little guy? when in theory it only helps you as a consumer lol.
> 
> Short poppy syndrome?


Because if you don't support them at all, then they have no choice but to sell their CPU or GPU division and actually focus on one.


----------



## jologskyblues

HardOCP is going back to my RSS bookmarks toolbar. I thought they were AMD shills before but it turns out they just tell it like it is regardless of the vendor in question.


----------



## Alatar

Quote:


> Originally Posted by *p4inkill3r*
> 
> Hottest take I've ever seen.
> 
> The GPU trades blows with Nvidia's flagship, yet somehow it is *A SUPREME DISAPPOINTMENT*.
> 
> Step away from the keyboard.


Well it's not a 2900XT however I wouldn't really just go "it trades blows with NV's flagship" either.

-It's slightly trailing the 980Ti and Titan X in most reviews and the 980Ti custom cards are much faster than the Fury X.
-NV's cards have 50% or 200% more VRAM
-NV's cards are likely to OC better once the dust settles

The reason it's such a disappointment for some people is that the card was hyped to the extreme by both AMD and the public. The overall expectation, given the big die size and HBM advantage (as well as the extra time to tweak clocks against NV), was that this card had to beat the TX to be a success technology-wise.

This is pretty much where I expected it to land so I'm just here going "meh". It's a bit like the 6970 launch to me. Somewhat late, pretty hyped with new tech and in the end deeply average.


----------



## rcfc89

Quote:


> Originally Posted by *BigMack70*
> 
> This card is nothing more than AMD playing _"me too!!!!!"_ but with less VRAM, no HDMI 2.0, and a few months late to the party ....
> 
> Anyways on a more positive note... hope you guys who buy the card enjoy it!


*What a major fail by AMD. No HDMI 2.0 and slower than a 980 Ti. Add in the ugly arse radiator you have to find a place for.*


----------



## Newbie2009

Quote:


> Originally Posted by *hamzta09*
> 
> So.. flop?


I think flop is too strong. Problem child.


----------



## HeadlessKnight

Quote:


> Originally Posted by *p4inkill3r*
> 
> Hottest take I've ever seen.
> 
> The GPU trades blows with Nvidia's flagship, yet somehow it is *A SUPREME DISAPPOINTMENT*.
> 
> Step away from the keyboard.


We are Overclock.net. The 980 Ti / Titan X can OC a healthy 20-25%. The Fury X has proven to be a bad overclocker in those reviews. The Titan X is 13% faster according to TPU at 1440p and the 980 Ti is 9% faster. Now factor in the OCing headroom of GM200 and expect that percentage to increase even more. It might be on par with the 980 Ti at 4K and slightly behind the Titan X, but that is useless for most people.
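The compounding argument above can be made concrete with some toy arithmetic. The stock deltas are the TPU 1440p figures quoted in the post; the overclock headrooms (22% for GM200, 7% for the Fury X) are rough assumptions standing in for "healthy 20-25%" versus "bad overclocker", not measurements:

```python
# Toy arithmetic for the OC-headroom argument. Stock leads are the TPU
# 1440p numbers from the post; both OC headrooms are assumed values.

def relative_gap(stock_lead, their_oc, our_oc):
    """Return the fractional lead after both cards are overclocked.

    All arguments are fractions (0.09 == 9%). Assumes performance scales
    linearly with the overclock, which is optimistic but fine for a sketch.
    """
    return (1 + stock_lead) * (1 + their_oc) / (1 + our_oc) - 1

gap_980ti = relative_gap(0.09, 0.22, 0.07)   # 980 Ti: +9% stock, +22% OC
gap_titanx = relative_gap(0.13, 0.22, 0.07)  # Titan X: +13% stock, +22% OC
print(f"980 Ti lead once both are OC'd: ~{gap_980ti * 100:.0f}%")
print(f"Titan X lead once both are OC'd: ~{gap_titanx * 100:.0f}%")
```

Under those assumptions a single-digit stock gap balloons to the mid-20s once both cards are pushed, which is the whole point being made about GM200's headroom.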


----------



## Casey Ryback

Quote:


> Originally Posted by *Rickles*
> 
> I think what we are seeing is that back plate is actually doing nothing other than holding the heat in place and it bleeds out near the PCI slot and power connectors. I'd imagine that French review site is spot on with the temps of the VRMs, these cards are literally going to cook themselves into blackscreens and it'll be the 7970 RMA carousel all over.


Doesn't look great, as it's an odd spot to be getting so hot. Then again, why would people even need to run FurMark? (I'm assuming it's some stupid stress test like that?)

The MSI GTX 970 gets to 90C on the VRMs, but that is a different area for heat.


----------



## wholeeo

Quote:


> Originally Posted by *ondoy*
> 
> performance not really that bad it's just a few fps difference between 980Ti....
> just that it's not the Fastest GPU as AMD claimed...


I don't think they claimed that.


----------



## Cool Mike

Where?


----------



## Alatar

Quote:


> Originally Posted by *Exilon*
> 
> All of this really makes me want to get the MSI 980Ti and strap a Kraken G10 to it.


For me the whole generation just makes me want to wait for 16nm with the big pascal tape out rumors and VR being a 2016 product (at least close to it).


----------



## p4inkill3r

Quote:


> Originally Posted by *Alatar*
> 
> This is pretty much where I expected it to land so I'm just here going "meh".


That is a reasonable position to have.


----------



## CrazyElf

@OP

There is already a Crossfire review.

You may want to add the Digital Storm reviews:

http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/
Quote:


> Originally Posted by *From article*


With Crossfire, the situation doesn't look nearly as bad at 4K. I think Crossfire still seems to scale better than SLI.

The problem is the VRAM (only 4 GB), and that if the Fury X is limited to just 1250-1300 MHz on OCs versus what people could get with, say, a 980 Ti, it's not much of a "win".

Edit:
I think the fact that they did not add more ROPs may play a role: despite the shader additions and other improvements, the ROP count remained constant.

Quote:


> Originally Posted by *Alatar*
> 
> For me the whole generation just makes me want to wait for 16nm with the big pascal tape out rumors and VR being a 2016 product (at least close to it).


How long before we get a "big" Pascal?

Wouldn't it look like:

2016 - a ~300 mm^2 Pascal, with a performance jump maybe comparable to 580 -> 680
2017 - a 550 mm^2+ Pascal?

That's assuming no further delays in the next process.


----------



## sugarhell

Wow this is so slow..


----------



## provost

Quote:


> Originally Posted by *p4inkill3r*
> 
> Hottest take I've ever seen.
> 
> The GPU trades blows with Nvidia's flagship, yet somehow it is *A SUPREME DISAPPOINTMENT*.
> 
> Step away from the keyboard.


I think his post is a manifestation of this card falling way short of expectations (it does not matter who set those expectations). Trust me, this guy is as upset with Nvidia's shenanigans as some, ahem, cough, others, and was most likely rooting for the Fury X to be a real winner. His post is not so much a critique of AMD as an expression of utter disappointment at the card falling short of expectations.


----------



## rt123

Quote:


> Originally Posted by *HeadlessKnight*
> 
> We are Overclock.net. 980 Ti/ Titan X can OC to a healthy 20-25%. Fury X proven to be a bad overclocker from those reviews. Titan X is 13% faster according to TPU at 1440p and 980 Ti is 9% faster. Now factor in the OCing headroom of GM200 and expect that percentage to increase even more.


And apparently, when the Fury X gets additional voltage control to let the voltage go above stock, we at Overclock.net will run the core at 500MHz.
Dat 1:1 core-to-memory ratio is where it's at.


----------



## CasualCat

Quote:


> Originally Posted by *Dyius*
> 
> Assuming my order doesn't get cancelled, I picked one up for less than 500 dollars. I feel like the price/performance is more justifiable there.


How did you manage that? $500 makes that a really, really competitive card.

Edit: At $500 I'd say it would make choosing a 980Ti pretty difficult (especially for the people who just run it stock) unless you needed that last bit of performance.


----------



## NuclearPeace

It costs just as much as the 980 Ti, but it's often performing worse. It sometimes beats the 980 Ti, and rarely the Titan X, but then only just; nowhere near the margins that some of us here on OCN, and the implications from AMD's marketing, would have you believe. Performance is erratic, jumping from slightly ahead of the 980 in some games to beating the Titan X in others. It isn't an "overclocker's dream" either, topping out at 1150 MHz on stock voltage, which might be good for GCN but is lower than many Maxwell cards' stock boost clocks. It's coming with less memory, which is something enthusiasts crave now. For those who care, power consumption is still characteristically much higher than on equivalent Maxwell GPUs. Reviewers are also reporting coil whine and pump noise.

Sorry, but I don't see any reason to get it in its current state and at its current price. If drivers improve performance and/or prices drop, then maybe. Really disappointing to see that it trails a reference 980 Ti, which we all know thermally throttles. If you grab a G1 980 Ti or wait for the MSI GAMING 980 Ti, you can most likely overclock it to 1500 on the core, which the Fury X won't come close to.

It's all down to the Fury non-X and the Fury Nano now. Fiji seems to be bottlenecked by something other than stream processors, so perhaps a Fury with 87.5% of the Fury X's cores will be around 95% as fast.
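That 87.5% → ~95% estimate can be framed as a toy Amdahl-style model. The 40% shader-bound fraction below is a hypothetical number chosen to reproduce the post's estimate, not a measured figure:

```python
# Toy Amdahl-style model for the Fury (non-X) estimate: if only a
# fraction k of the workload is shader-bound, cutting shaders to a
# ratio r of the full part costs k * (1 - r) of the performance.
# The inputs are assumptions from the post, not measured results.

def relative_perf(core_ratio, shader_bound_fraction):
    """Estimated performance of a cut-down part relative to the full chip."""
    return (1.0 - shader_bound_fraction) + shader_bound_fraction * core_ratio

# Rumored Fury: 3584 / 4096 shaders = 87.5% of Fury X.
r = 3584 / 4096
# If ~40% of the workload is shader-limited (the rest ROP/front-end bound):
print(f"Estimated Fury vs Fury X: {relative_perf(r, 0.40):.1%}")
```

With those assumptions the cut-down part lands at 95% of the full chip, matching the estimate above; the real shader-bound fraction varies per game.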


----------



## p4inkill3r

Quote:


> Originally Posted by *HeadlessKnight*
> 
> We are Overclock.net. 980 Ti/ Titan X can OC to a healthy 20-25%. Fury X proven to be a bad overclocker from those reviews. Titan X is 13% faster according to TPU at 1440p and 980 Ti is 9% faster. Now factor in the OCing headroom of GM200 and expect that percentage to increase even more.


The Fury X has proven to OC about 100MHz on stock voltage in the reviews.

But you knew that before you dropped that big take, of course.


----------



## PureBlackFire

Quote:


> Originally Posted by *Xuper*
> 
> http://www.hardocp.com/images/articles/14351085919S0HOOZkGA_8_2.gif
> 
> I really want to know Why Fury Is 7% faster than 290x despite having 45% core over R9 290x? Just technically , What cause this Bottleneck?
> 
> It's bad Driver , low Peak pixel fill rate and Peak rasterization rate ( Compare to Geforce 980 TI) , Unoptimized game Engine or even BF4 loves maxwell ...etc ?
> 
> Look at this chart :
> 
> http://techreport.com/review/28499/amd-radeon-fury-x-architecture-revealed


Just think back to the GCN launch. The HD 7950 had a monster gap in specs (mostly core count) compared to the HD 7870, but was not more than 10% faster overall. In fact the HD 7950 was the dog of the year, losing to the HD 7870 in Crysis 2 and losing to every Nvidia mid-range card in most stock benches (low clock speed will do that). Tahiti in general left a lot of performance potential untapped. The 7970 had double the core count of the HD 7850 but nowhere close to double the performance; I don't remember if it was even 50% faster at launch. Tahiti needed more ROPs, among other things, and it looks like the same is true of Fiji: the thing has too many cores it cannot feed. People wanted another Hawaii (with a better cooler), but we got another Tahiti. Given that Tahiti started 28nm and this is at the ass end of it, I doubt its relative performance will hold up as well. One ray of hope is that Tahiti (and really the whole initial GCN lineup) saw a huge performance boost from drivers later on, more than every GPU that followed.


----------



## raghu78

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Agreed.
> 
> Especially when you throw in some other picky issues, like only using HDMI 1.4a and not 2.0 ... 104C temps on the memory on what many were saying was going to be a "full cover" water block (maybe that is why AMD made it so you CAN'T overclock the memory, or at least part of the reason why).
> 
> Don't get me wrong, it's a good card, about on part with the same priced GTX 980Ti, but it was SUPPOSED to be better ... and "A Titan Killer", and it fails.
> 
> Couple that with the still horrible mess that is FreeSync, and I see no good reason for anyone from nVidia to "switch teams" and get a Fury X over a GTX 980Ti.


The R9 Fury X has some major design problems, as the performance scaling over an R9 390X just isn't there in a lot of games. AMD needed to improve the front end to ensure good, consistent scaling over the R9 290X / R9 390X across all games. Instead we see cases like Dragon Age Inquisition where the Fury X scales 10-15% for 45% more shaders. That's miserable scaling.

DAI

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives-10.html
http://www.techpowerup.com/reviews/AMD/R9_Fury_X/17.html
http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/6/#diagramm-dragon-age-inquisition-2560-1440

There are more cases but right now I am not going to bother with searching for all of them.
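The "miserable scaling" point can be put in numbers as a scaling-efficiency ratio: how much of the extra hardware shows up as actual frames. The 45% and 10-15% figures are the rough ones quoted above, not precise measurements:

```python
# Scaling-efficiency check for the Dragon Age Inquisition example:
# fraction of the theoretical shader increase realized as performance.
# The 45% and 10-15% inputs are the rough figures from the post.

def scaling_efficiency(observed_gain, resource_gain):
    """Fraction of a hardware increase that shows up as performance."""
    return observed_gain / resource_gain

for gain in (0.10, 0.15):
    eff = scaling_efficiency(gain, 0.45)
    print(f"{gain:.0%} speedup from 45% more shaders -> {eff:.0%} efficiency")
```

That works out to only about 22-33% of the added shaders being realized, which is what makes the front-end/ROP bottleneck theory plausible.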


----------



## hamzta09

Did Nvidia not state the Fury X is the worlds fastest Graphics Card? Or was that the Dual GPU one? Im confused now.


----------



## zealord

So price drop in 2 months?


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> Did Nvidia not state the Fury X is the worlds fastest Graphics Card? Or was that the Dual GPU one? Im confused now.


Why would nvidia quote anything about fury X performance?


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> Why would nvidia quote anything about fury X performance?


I meant AMD.


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> I meant AMD.


World's fastest graphics card? It may have been.

Considering what the dual-Fiji card is going to be, they did not lie.


----------



## iatacs19

The overall footprint is actually a lot bigger than the 980Ti/Titan when you add the radiator into the mix.


----------



## th3illusiveman

Quote:


> Originally Posted by *lajgnd*
> 
> In response to your damage control spin
> 
> -It's the same price as competition and lower performance. My Titan X never gets loud enough for me to hear, the thing idles silently and when I game I am distracted by game sound. I don't stick my head up to my case to focus on card noise. It's not loud enough to be noticeable.
> 
> -It "sometimes" exceeds the 980Ti at 4K. But let's not forget that for pretty much every game released in the past year, they're both pretty close to unplayable at 4K, so who cares if it pulls ahead by a couple percentage points when they're both still crap...
> 
> -As far as SLI is concerned. AMD can barely produce drivers competent enough for single GPU (and when they do they're LATE). Crossfire? Nope, not even going to bother.
> 
> -Runs cool? Yeah, it runs so cool it needs watercooling to operate at stock.
> 
> -NVidia drivers aren't subjective. Game Ready drivers ALWAYS come out before release of a major game, and have way less problems than AMD. Sure there are problems, but to even make the argument that "nvidia has problems too!!!" is absurd and grasping for straws. This is a fact, not an opinion.
> 
> -Believe it or not, for 1080p even my Titan X isn't sufficient for EVERYTHING. Something like GTAV doesn't max 1080p at 60FPS. Witcher 3 has occasional drops (not frequent though), and can't even manage full 1080p 60 fps with Hairworks. How about those that want more than 60 FPS? 1080P resolution not mattering is a BS lame excuse.
> 
> - Card is a total failure, AMD is a failure. End of story really. Sad but true.


I don't have any reason to do damage control for AMD. You can try justifying your $1000 GPU all you want, but I have no horses in this race. You also cannot discount its 4K performance based on a weak argument like that. So if one card isn't sufficient, get two, but wait, then the Fury X gets even faster....

- AMD single-GPU drivers are fine, and both companies could improve their dual-card drivers.

- Yes, it runs cool; this is a fact.

- AMD cards only have issues when games use Nvidia's proprietary GameWorks (surprising no one), and even then Nvidia GPUs perform subpar as well.

- The card is not a total failure, and your comments are pretty sad to be honest. Don't get so worked up over graphics cards, man. Go outside or something.


----------



## Kane2207

Quote:


> Originally Posted by *Casey Ryback*
> 
> Worlds fastest graphics card, it may have been.
> 
> Considering the dual fiji is going to be, they did not lie.


I think if we've learned anything today, it's to wait until the card is actually released rather than rely on AMDs word???


----------



## Alatar

Quote:


> Originally Posted by *hamzta09*
> 
> Did Nvidia not state the Fury X is the worlds fastest Graphics Card? Or was that the Dual GPU one? Im confused now.


Quote:


> Originally Posted by *hamzta09*
> 
> I meant AMD.


They were probably being purposefully deceptive with those statements. They talked about world's fastest graphics performance and at first they didn't say which card they were referring to but the slide at the end of the presentation clearly labeled the dual card as the fastest card in the world.


----------



## raghu78

Quote:


> Originally Posted by *zealord*
> 
> So price drop in 2 months?


Actually, the way it's going, AMD would rather not sell than sell below cost. That 596 mm² die with 4 GB of HBM and that AIO cooler is not going to be cheap to manufacture.


----------



## Superplush

Shouldn't they just throw in a few W10 / DX12 benchmarks for kicks? It's not like it's a million miles away, and haven't AMD been focusing more on W10 performance too?


----------



## Casey Ryback

Quote:


> Originally Posted by *iatacs19*
> 
> The overall footprint is actually a lot bigger than the 980Ti/Titan when you add the radiator into the mix.


But people love AIOs because they look cool, even though they only match a cheaper air cooler and can possibly leak or fail completely.

Honestly, they are probably getting sales because of the AIO, lols.


----------



## CasualCat

Quote:


> Originally Posted by *iatacs19*
> 
> The overall footprint is actually a lot bigger than the 980Ti/Titan when you add the radiator into the mix.


Really can't count the radiator in the footprint, maybe a bit of the tubing, but the radiator just goes where you'd have a 120 anyhow. I'd add the build quality of the card/housing itself seems top notch.
Quote:


> Originally Posted by *Kane2207*
> 
> I think if we've learned anything today, it's to wait until the card is actually released rather than rely on AMDs word???


Think it is less AMD itself and more the devout fanboys. AMD did its usual marketing (nothing really over the top imho) and fans ran with it as the second coming and savior of AMD's market share.

Were I to fault AMD here it'd be time to market and price. It still appears to be a solid card otherwise, but getting it out before the 980Ti and/or having it priced better would put them in a stronger position. Alatar might be right that the real gem/strong card may end up being the Pro.


----------



## Bartouille

I bet this card would have had the same performance with GDDR5... I'm disappointed.


----------



## rcfc89

https://youtu.be/xhVo7yPjQvE


----------



## Rickles

Quote:


> Originally Posted by *p4inkill3r*
> 
> Hottest take I've ever seen.
> 
> The GPU trades blows with Nvidia's flagship, yet somehow it is *A SUPREME DISAPPOINTMENT*.
> 
> Step away from the keyboard.


I think you are forgetting where AMD has found themselves, they really needed to pull a rabbit out of the hat with this card. It's highly unlikely they'll be gaining any market share with this lineup and their future is now hinged on Zen.

If VRMs really are running at 105c I'd bet they are also going to be seeing a lot of these cards coming back as RMAs.

And can we all be honest and look at what the fury nano is touted to do? It's supposed to be based on high performance/watt (2x perf/watt over the 290x) which should place it between the GTX 970 and 960. Granted this is where they have potential to gain the most market share, but it isn't going to be a high end card.

SFF is about the only thing going for AMD with these new cards, and even then most cases can accommodate the longer Nvidia cards.


----------



## tpi2007

Here is my opinion on this: assuming they indeed fixed the pump noise, I think this card will do moderately well on the novelty factor: HBM, high-quality materials, cool and quiet operation. Power consumption at this performance level shouldn't matter much to enthusiasts; in that regard AMD managed to keep the card domesticated. But the HDMI 1.4a, the inconsistent performance (meaning more driver optimization is needed), the generally below-980 Ti performance (by up to 10%), and the 4 GB limit (let's see about that in further testing) make it a tough sell other than for those who really want one of the first HBM-equipped cards. And even then, it's hard to get excited about HBM, since it doesn't seem to be doing anything extraordinary for performance. It was also an odd decision to keep the ROP count at 64; I wonder if that has implications for the performance scaling against the 390X.

All in all, apart from people wanting one of the first HBM cards and a quiet flagship, I'd wait for the Fury. Should still beat the 980 (well, it has to, otherwise it's going to be competing against the 390X), the 4 GB of VRAM shouldn't be a concern at that cheaper price point and performance and hopefully the WHQL drivers that will be released alongside Windows 10, but also available for 7 and 8.1, will make it perform better.

It's nice to see AMD keeping up without the long delay that took them to answer the original Titan, but I still think that their lineup is not overall competitive. The Fury (non X) should help, but otherwise:

1. R7 370: the rumours originally pointed to a 1280-core part with a $135 price tag. Even though it's a GCN 1.0 part, at that price it would have been a very compelling product that would steal the show from the 750 Ti, power consumption notwithstanding. But when we learned it is in fact a 1024-core part with a $149 price tag, I was very disappointed. We had the Radeon HD 7850 1 GB selling for around $160 in October 2012. The 1 GB size didn't make much difference in testing back then, but now the factory overclock and newer games certainly will, so 2 GB is the way to go. Still, it's very hard to get excited for what is otherwise the same card all over again, two years and eight months later, for $149.

2. R9 380: this would have done moderately well if AMD had priced the 4 GB version at $199. That, together with the core and VRAM clock increases at the same TDP, would make for a compelling offer over the 960, being overall even faster than the 285 already was. As is, $199 for the same 2 GB of VRAM as the GTX 960, which has more overclocking headroom and a lower TDP to boot, meh. Not to mention they could have locked up the $250 segment with full Tonga, which, again, continues only in the Retina iMac as the R9 M295X.

3. R9 390. Sort of ok. The clockspeed bumps made it perform on par or slightly above the 290X and occupy its price point, while using a little less power. There isn't much appeal here. People looking for Hawaii would do better to get a 290X now, it has more headroom and costs the same. It should cost a bit less.

4. R9 390X. Not really ok. Way too overpriced for a card that doesn't at least have Tonga's feature set and uses 100w more than the 980 to achieve slightly slower or the same results.


----------



## Exilon

Quote:


> Originally Posted by *rt123*
> 
> And apparently when Fury X additional voltage control to let the voltage go above stock, we at Overclock.net will run the core at 500Mhz.
> Dat 1:1 Core to Memory ratio is where its at.


Increasing voltage on GCN? You're going to pay for that.
Quote:


> Originally Posted by *Alatar*
> 
> For me the whole generation just makes me want to wait for 16nm with the big pascal tape out rumors and VR being a 2016 product (at least close to it).


I don't expect early FF GPUs to be that much faster than the late 28nm GPUs. 28nm GPUs are hitting the reticle limit for TSMC, but FF GPUs will be hitting a cost limit first. If 16nm cost/transistor doesn't take a steep dive in 2016, then big die GPUs aren't going to happen. The only advantage a 8B transistor 16nm GPU will have over a 8B transistor 28nm GPU will be die size and power. The 16nm GPU will cost more, and seeing how Intel's finfets went, the 28nm GPU might even overclock better due to faster operation at high voltage ranges.


----------



## Tivan

Quote:


> Originally Posted by *Alatar*
> 
> They were probably being purposefully deceptive with those statements. They talked about world's fastest graphics performance and at first they didn't say which card they were referring to but the slide at the end of the presentation clearly labeled the dual card as the fastest card in the world.


Imagine if the single-Fiji card were the world's fastest GPU. That'd mean they can't make a dual-Fiji card, because that'd be faster c;


----------



## Casey Ryback

Quote:


> Originally Posted by *Kane2207*
> 
> I think if we've learned anything today, it's to wait until the card is actually released rather than rely on AMDs word???


I don't follow............. I don't really see where they have lied.

You have to market a product in a positive manner or you don't sell it, and possibly get fired.


----------



## Alatar

No reason to drop price if they're going to sell all the cards they can make. There were rumors of low availability and extremely small shipments to retailers.
Quote:


> Originally Posted by *Kane2207*
> 
> I think if we've learned anything today, it's to wait until the card is actually released rather than rely on AMDs word???


Never going to happen, lol. Every time the hype train starts, people hop on it without question, and even if veterans don't, there are plenty of new guys who do.

The interesting thing will be seeing whether a week of E3 marketing with only AMD material targeting the general public is worth the enthusiast community backlash.


----------



## Wishmaker

I said this a while back, when someone told me my intuition about Fury was wrong. I based my assessment on AMD's history: they innovate, and the competition just does it better. Same here: they came out with HBM, and it barely delivers what they promised. Drop the price to $500, AMD, and you will not have the 980 Ti stealing your market share.

You can't escape your own history, AMD. No matter how far technology advances, you are still two fries short of a happy meal. Sad, but within my expectations. I will buy Pascal, as I am in no rush to get Fury.


----------



## 47 Knucklehead

HAHAHA

Watching the Linus Tech Tips video and AMD sent them a Fury X that was DEAD ON ARRIVAL.

That doesn't bode well for quality control.


----------



## Junkboy

Shame, Pascal here I come.

Also, I really miss ATI.


----------



## Blackops_2

Quote:


> Originally Posted by *HARDOCP*
> AMD's GPU program for the first time has truly reminded us of its CPU program


I have to disagree with this statement; it's absurd. They didn't go backwards in performance, so the statement isn't applicable at all.

The card was overhyped, but this is why we wait. I think that, given some time with drivers, we're going to see more performance. I still consider it a good job by AMD; they're just late to the party yet again. If they had gotten this card out before the Ti, they would've actually grabbed some sales. I did expect it to land between the 980 Ti and Titan X.

They're close, but here we are yet again in a "Hawaii vs GK110" situation. OCing clearly goes to the green team so far, and I'm not sure better cooling is going to suddenly change that.

They need to drop the price to $550-600 for the X and $450-500 for the Fury, and it might sell. I'd really prefer $500 for the Fury X and $450 for the Fury.


----------



## zealord

Quote:


> Originally Posted by *raghu78*
> 
> actually the way its going AMD would rather not sell than sell below cost. That 596 sq mm die with 4 GB HBM and that AIO cooler is not going to be cheap to manufacture.


Well, that is AMD's problem now. What do they think they can do? Release a product with a halo name that is worse than a 980 Ti, has less VRAM, and costs the same. It just doesn't work like that, especially not for AMD.

AMD gambled with the Fury X and they failed. I really wanted them to succeed, because I currently use a 290X, am quite used to how things work with AMD cards, and was looking forward to buying a FreeSync monitor rather than a G-Sync monitor.

I won't buy either the 980 Ti or the Fury X, but if I had to choose right now I'd go with the 980 Ti, and that says a lot about how bad the Fury X is at the current price of $649/€699.


----------



## iatacs19

Quote:


> Originally Posted by *CasualCat*
> 
> Really can't count the radiator in the footprint, maybe a bit of the tubing, but the radiator just goes where you'd have a 120 anyhow


It's definitely part of the card and should be included in the overall size because the Fury X cannot function without the radiator.


----------



## raghu78

Quote:


> Originally Posted by *Bartouille*
> 
> I bet this card would have had the same performance with GDDR5... I'm disappointed.


But it would have drawn a hell of a lot more power.

Anyway, this launch is one of the most disappointing by AMD in recent times. Not as bad as the R9 285, but still bad nonetheless. AMD is down, and there for Nvidia to finish off. I think over the next 6-12 months Nvidia will grow its market share further. A near-monopoly in the GPU market is almost here.


----------



## th3illusiveman

Quote:


> Originally Posted by *Bartouille*
> 
> I bet this card would have had the same performance with GDDR5... I'm disappointed.


It would have used a LOT more power too. HBM was chosen to save power, and it does that very well.


----------



## Yor_

980 Ti incoming...


----------



## Tojara

Quote:


> Originally Posted by *47 Knucklehead*
> 
> HAHAHA
> 
> Watching the Linus Tech Tips video and AMD sent them a Fury X that was DEAD ON ARRIVAL.
> 
> That doesn't bode well for quality control.


Yep, any card should reasonably withstand a drop onto solid concrete.


----------



## rcfc89

Quote:


> Originally Posted by *CasualCat*
> 
> Really can't count the radiator in the footprint, maybe a bit of the tubing, but the radiator just goes where you'd have a 120 anyhow
> Think it is less AMD itself and more the devout fanboys. AMD did its usual marketing (nothing really over the top imho) and fans ran with it as the second coming and savior of AMD's market share.


I disagree. Unless you plan on running this card off an existing radiator, you have to factor in exactly where you're going to mount that radiator with its limited-length tubing, unless you're replacing that as well.


----------



## Alatar

Die size might have also been even bigger with GDDR5 controllers.

So even if they were willing to sacrifice the power consumption advantage HBM might have been the only option on 28nm.


----------



## Achromatis

Well, I'll still buy AMD. Actually, what I want is to just not buy Nvidia or support that GameWorks crap which keeps ruining games, and that leaves me with AMD.

But I think I'll wait for second-generation HBM.


----------



## rt123

Quote:


> Originally Posted by *Exilon*
> 
> Increasing voltage on GCN? You're going to pay for that.


On Intel/Nvidia that happens for free I guess.


----------



## Str8Klownin

Quote:


> Originally Posted by *Alatar*
> 
> I don't get why everyone thinks the nano will be priced really well. I see it as a niche card that gets the best binned PRO GPUs or something. I'd expect to see it priced higher or on par with the normal Fury.


This to me is the biggest concern with the Nano, and after they see these reviews they have an opportunity to do something big with this little guy. If it does perform like they say, the pricing will be the trickiest part, possibly undercutting one of their own cards. Doubtful, but they have time to get this right.


----------



## Tivan

Quote:


> Originally Posted by *Rickles*
> 
> It's highly unlikely they'll be gaining any market share with this lineup


See you in half a year.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Tojara*
> 
> Yep, any card should reasonably withstand a drop onto solid concrete.


That's why there is this new invention out there ... it's called a box with foam. I hear people ship even more delicate items than video cards in them all the time without breakage.

WooT!

AMD has a winner here ... the reviews are in and their stock price is up $0.01.


----------



## p4inkill3r

Quote:


> Originally Posted by *Rickles*
> 
> I think you are forgetting where AMD has found themselves, they really needed to pull a rabbit out of the hat with this card. It's highly unlikely they'll be gaining any market share with this lineup and their future is now hinged on Zen.
> 
> If VRMs really are running at 105c I'd bet they are also going to be seeing a lot of these cards coming back as RMAs.
> 
> And can we all be honest and look at what the fury nano is touted to do? It's supposed to be based on high performance/watt (2x perf/watt over the 290x) which should place it between the GTX 970 and 960. Granted this is where they have potential to gain the most market share, but it isn't going to be a high end card.
> 
> SFF is about the only thing going for AMD with these new cards, and even then most cases can accommodate the longer Nvidia cards.


I've forgotten nothing about AMD's situation, as a glance at my portfolio can attest, but the hyperbolic statements and green-flag waving over the fact that the 980 Ti is winning (at the cost of the Titan X's relevancy) are, while not surprising, mind-numbingly predictable.


----------



## th3illusiveman

overclocking results from G3D


















No gain from overclocking at lower resolutions, while it does gain a few FPS at higher resolutions. I think this has something to do with their DX11 driver overhead.


----------



## azanimefan

This card just shows how tone-deaf AMD is.

They don't seem to get that they have a "bad reputation", which means that in order to sell a product they need to

1) have the superior product
2) have the cheaper price

If those two things don't come together, their product launches will be DOA. The whole 300 lineup, plus the Fury lineup, is a DOA launch. It's too expensive for the market, and the performance isn't there to justify it.

Congrats AMD, you're even more tone-deaf than I thought possible. This pretty much guarantees that Zen will give us Sandy Bridge performance on an 8c/16t part which they'll then sell for $1000, while their 4c/8t APU goes for $350. I'm calling it now; that's just how out of touch AMD is right now.

What I don't get is the delay on the 3xx series / Fury cards. They said back around Christmas they'd hold off till July so they could have a full product lineup launch, not just a rebrand launch. Yet it's July and we got a rebrand launch anyway. Had this launch happened in December as planned, they would have had the fastest card on the market and forced Nvidia to hurry the Titan launch. Instead they held on until the Titan and 980 Ti came out, lamely waited a little longer, then released a card that can't match the 980 Ti at a higher price point.

Color me baffled.


----------



## headd

Quote:


> Originally Posted by *raghu78*
> 
> Its pretty much game over for AMD. The reviews have proven that Fury X cannot compete with 980 Ti and AMD is going to bleed further marketshare over the next 12-15 months. I doubt now if AMD can even keep their marketshare at 20% . Maxwell sweeps AMD across the board in a show of unbeatable performance and efficiency. Its sad that the AMD of today cannot compete with Nvidia even with a similar die size whereas earlier they used to compete with smaller die sizes.
> 
> Whats damning is there seems to be some major design issue as R9 Fury X is just not able to scale performance over R9 390X in so many cases/ games. There are so many instances of 10-15% improvement over R9 390X. With 45% more shaders and the same clocks thats miserable scaling. AMD really need a clean sheet design as Fury X has proven that the current architecture is not scalable. Very disappointing overall


I am sure that the bad scaling is because of the 64 ROPs... if the Fury X had 128 ROPs it would be much faster.
It's a hard ROP bottleneck, and maybe tessellation isn't much faster vs Hawaii... that's why it scales so badly.
I really don't know why the Fury X only has 64 ROPs...
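The ROP argument is easy to see from the theoretical peak pixel fill rate (ROPs × core clock). The spec numbers below are the commonly cited ones (64 ROPs / 1050 MHz for the Fury X, 64 / 1000 for the 290X, 96 ROPs at a ~1075 MHz boost for a reference 980 Ti); treat them as rough:

```python
# Theoretical peak pixel fill rate = ROPs * core clock.
# Spec inputs are the commonly cited figures, used here as rough numbers.

def pixel_fill_gpixps(rops, clock_mhz):
    """Theoretical peak fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

cards = {
    "R9 290X":    (64, 1000),
    "Fury X":     (64, 1050),
    "GTX 980 Ti": (96, 1075),
}
for name, (rops, mhz) in cards.items():
    print(f"{name}: {pixel_fill_gpixps(rops, mhz):.1f} GP/s")
```

The Fury X ends up with only about 5% more peak fill rate than the 290X despite 45% more shaders, which is exactly the kind of front-end ceiling being described.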


----------



## zealord

Quote:


> Originally Posted by *th3illusiveman*
> 
> overclocking results from G3D
> 
> No gain from overclocking at lower resolutions, while it does gain a few FPS at higher resolutions. I think this has something to do with their DX11 driver overhead.


oh boy. oh boy









Furydozer why whyyyyyyyyyyyy


----------



## criminal

http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=R9FURYX


----------



## Ganf

Quote:


> Originally Posted by *raghu78*
> 
> But it would have drawn a hell of a lot more power.
> 
> 
> Anyway, this launch is one of the most disappointing by AMD in recent times. Not as bad as the R9 285, but still bad nonetheless. AMD is down and there to be finished off by Nvidia. I think over the next 6-12 months Nvidia will grow market share further. Another monopoly in the GPU market is almost ready.


It's depressing to think that Zen won't pull their chestnuts out of the fire on the processor side either. Everyone keeps harping on the idea of Samsung buying AMD; I'd rather IBM bought them and jumped back into the consumer market. IBM knows how to get raw power out of hardware architectures, and AMD can simply provide them with the licensing to do whatever they want. Samsung is too infatuated with the mobile side of things and will push AMD to focus on that, whereas IBM knows it doesn't have a chance in hell in mobile with its power-hungry architectures and will be going for Intel and Nvidia's throats. Especially Intel's, since they'll be able to offer two architectures for servers.


----------



## GTR Mclaren

Same price as the 980 Ti but 5-10% slower... Jesus AMD, are you going to return to the 4870 / Phenom II X4 days someday??


----------



## TheMentalist

I feel bad for AMD


----------



## Shogon

Quote:


> Originally Posted by *zealord*
> 
> So price drop in 2 months?


Hell, I give it a month or so. Maybe when this hits 7970 / 290 prices I'll see how it does in folding.
Quote:


> HBM, High Bandwidth Memory, is a technology that needed to happen. This is a step forward for video cards in the continual evolution of the hardware. That said, we feel AMD has perhaps implemented it in the wrong place, in the wrong product stack.
> 
> HBM is limited right now in its first iteration to just 4GB of VRAM on a single GPU. We think the decision to constrain your flagship high-end video card, at a $649 price point, with only 4GB of VRAM today is a shortsighted, confusing and ultimately bottlenecking choice.
> 
> The new AMD Fiji GPU and Fury X video card looks awesome on paper, but has underwhelmed and disappointed us when it comes to real world gameplay. The AMD Radeon R9 Fury X feels like a proof of concept for HBM technology.


[H] laying it on thick.

Pain and sorrow consume the internet, and much joy over recent purchases conveyed. The wait is over, and so is AMD.


----------



## mouacyk

Is this card going to excel at anything -- something that might take advantage of that 512GB/s bandwidth? Otherwise, HBM is quite premature and NVidia is making a better decision to wait until 2016 and HBM2 where they should have a sufficient GPU to complement it. So much of that bandwidth, which is the innovation here, has gone to waste in all these benchmarks. Not sure why AMD doesn't design a tech demo or something to highlight it.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *th3illusiveman*
> 
> overclocking results from G3D
> 
> No gain from overclocking at lower resolutions, while it does gain a few FPS at higher resolutions. I think this has something to do with their DX11 driver overhead.


Yeah, early OC'ing numbers are showing only about 10% OC headroom.

I guess that is to be expected for what is basically a pre-overclocked, pre-water-cooled card.

I'm sure people will be able to squeeze out maybe 15% down the road, but so far this doesn't seem to be the "Overclockers' Dream" like some have said (not to mention the "Titan Killer" that AMD promised).


----------



## Tivan

Quote:


> Originally Posted by *azanimefan*
> 
> this card just shows how tone deaf AMD is.
> 
> they don't seem to get they have a "bad reputation"; which means in order to sell a product they need to
> 
> 1) have the superior product
> 2) have the cheaper price


I think to fix a bad reputation, you need positive reviews and good PR. Which they've been getting with this launch, I'd say.

You can have a product as good as it gets and as cheap as it gets, but it doesn't fix a bad rep. 290 was their shot at that, fixing their image is what they're onto now, because marketing the 290 as a budget alternative to the 970 clearly failed.

I mean sure, you might not like smoke and mirrors, but that's what having a good rep hinges on. See Nvidia.


----------



## th3illusiveman

Quote:


> Originally Posted by *47 Knucklehead*
> 
> This.
> 
> I can't even honestly recommend this to my friends, family, and customers who are not fanboys in either camp. Compared to the GTX 980 Ti ... there just isn't enough of a reason to buy the Fury X over the 980 Ti when you consider all the other issues (FreeSync, no DVI port, only HDMI 1.4a) ... all for the same price.


Like you would ever recommend any AMD card


----------



## Kriant

I am...honestly disappointed. Guess I will be moving "out" from the "red camp" during my next upgrade


----------



## dieanotherday

Those stupid industrial spies.

AMD probably had it nailed down a while back; Nvidia then tuned the Titan X and 980 Ti to Fury X levels with lower power consumption.

AMD found out Nvidia knew and tried to find ways to get back, but failed because they don't have the money.

These companies are evil


----------



## Casey Ryback

Quote:


> Originally Posted by *zealord*
> 
> oh boy. oh boy
> 
> Furydozer why whyyyyyyyyyyyy


It's nothing like Bulldozer, come on, let's not start losing our marbles over this.

OCN is all doom and gloom right now, it's a bit ridiculous.

The cards are OK; they are competing with the 980 Ti on a fair few fronts, just overpriced.

Bulldozer was no better performance-wise than its predecessors, so that was a lot different.


----------



## funky882

This is beyond disappointing... I love AMD but I was expecting something that would be a game changer, not something that would trade blows with Maxwell.

You know, I really wanted to like AMD, the last two cards I've had from them kicked ass, and honestly I was expecting to make an incredibly irresponsible purchase the day of the Fury X release. However, I'm tired of their ability to both raise my hopes beyond belief, then crush them to a fine depressing dust. I feel like GPUs used to be so much more exciting.

AMD, our relationship is over... Hopefully Pascal will remove the wooden stake AMD drove into MY COLD LIFELESS SOUL.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *th3illusiveman*
> 
> Like you would ever recommend any AMD card


To certain people, yes I do.

I bet you think that I don't own any AMD/ATI cards currently, too? Well, you are wrong.

I use the best tool for the job. But lower performance, and lesser features for the same price ... no, I won't do it.


----------



## KarathKasun

Still waiting for Nano specs.

If it's a lower-clocked/lower-volt Fury X it will be a hit for the upgrade crowd. 980 performance/power usage with a tiny board that will fit in any case.


----------



## MapRef41N93W

Quote:


> Originally Posted by *texni*
> 
> Can someone explain to me how Nvidia cards can compete despite having fewer "cores"?
> You nailed it.
> 
> Exactly what made me choose a 980 ti


Because you can't compare core counts directly across architectures... The 980 had 800 fewer cores than the 780 Ti but was faster because of the Maxwell architecture. NVIDIA was able to squeeze 192 Kepler cores' worth of performance out of 128 Maxwell cores. GCN is a totally different architecture from Maxwell.


----------



## BigMack70

Quote:


> Originally Posted by *azanimefan*
> 
> this card just shows how tone deaf AMD is.
> 
> they don't seem to get they have a "bad reputation"; which means in order to sell a product they need to
> 
> 1) have the superior product
> 2) have the cheaper price
> 
> if those two things don't come together then their product launches will be DOA. The whole 300 lineup, plus the Fury lineup is a DOA launch. It's too expensive for the market, and the performance isn't there to justify it.
> 
> Congratz AMD, you're even more tone deaf than I thought was possible. This pretty much guarantees that Zen will give us Sandy Bridge performance on an 8c/16t which they'll then sell for $1000, while their 4c/8t APU goes for $350. I'm calling it now. That's just how out of touch AMD is right now.


This. And the worst part of all of it is that we are the ones who get screwed as Intel and Nvidia laugh all the way to the bank with little/no competition.


----------



## Casey Ryback

Quote:


> Originally Posted by *Tivan*
> 
> 290 was their shot at that, fixing their image is what they're onto now, because marketing the 290 as a budget alternative to the 970 clearly failed.


At current prices the R9 290 is a better deal than the 970 for sure. At all resolutions.

Then again, they are being discounted AFAIK.


----------



## ZealotKi11er

Personally, I'd much prefer the R9 290X approach: a GPU with horrible stock cooling but a cheaper price. Most people want both, but you can't have both. I think people who want to get a Fury should wait for the air-cooled version. It will be $100 cheaper, which is good enough to compete with the GTX 980 Ti in price/performance. Also, CLCs cost money; the CLC for these GPUs costs $100. The air-cooled Fury had better have 4096 SPs.


----------



## Rickles

Quote:


> Originally Posted by *p4inkill3r*
> 
> I've forgotten nothing about AMD's situation, as a glance at my portfolio can attest, but the hyperbolic statements and green-flag waving at the fact the 980Ti is winning (at he cost of Titan X's relevancy) is, while not surprising, mind-numbingly predictable.


I am sure you do realize that Fiji falling behind the 980 Ti (and in some games the 780 Ti) could very well mean the end of AMD's GPU division, but stating that others have no grounds to be disappointed is a stretch. AMD is resting on Zen and their non-existent server market share to pull them out of the gutter, while simultaneously having to jump several generations of Intel processors to do anything significant in that market.

Disappointment in AMD at this point is in my opinion perfectly acceptable and anything else is questionable.


----------



## hamzta09

Dont know if its been posted.


----------



## Casey Ryback

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Personally, I'd much prefer the R9 290X approach: a GPU with horrible stock cooling but a cheaper price. Most people want both, but you can't have both. I think people who want to get a Fury should wait for the air-cooled version. It will be $100 cheaper, which is good enough to compete with the GTX 980 Ti in price/performance. Also, CLCs cost money; the CLC for these GPUs costs $100. The air-cooled Fury had better have 4096 SPs.


Good points; people are going crazy and chanting voodoo curses at AMD when only their rip-off flagship GPU has been released.

That's like judging Nvidia by the Titan X. I wouldn't suggest anybody buy one of those either, even before the 980 Ti was released.

For all we know, the Fury and Nano could be far better products for the masses.


----------



## KarathKasun

Quote:


> Originally Posted by *Rickles*
> 
> I am sure you do realize that Fiji falling behind the 980 Ti (and in some games the 780 Ti) could very well mean the end of AMD's GPU division, but stating that others have no grounds to be disappointed is a stretch. AMD is resting on Zen and their non existent server market share to pull them out of the gutter while simultaneously having to jump several generations of Intel processors to be doing anything significant in that market.
> 
> Disappointment in AMD at this point is in my opinion perfectly acceptable and anything else is questionable.


I'm sure the Fury falling behind in PCars has nothing to do with the card/architecture and more to do with some other things that NV has patents for.


----------



## p4inkill3r

Quote:


> Originally Posted by *Rickles*
> 
> I am sure you do realize that Fiji falling behind the 980 Ti (and in some games the 780 Ti) could very well mean the end of AMD's GPU division, but stating that others have no grounds to be disappointed is a stretch. AMD is resting on Zen and their non existent server market share to pull them out of the gutter while simultaneously having to jump several generations of Intel processors to be doing anything significant in that market.
> 
> Disappointment in AMD at this point is in my opinion perfectly acceptable and anything else is questionable.


Disappointment is fine, hyperbole is not.


----------



## drBlahMan

AMD will be alright. They remind me of Nintendo... they do just enough to stay in the game, which allows them to experiment with technology. People are going to keep buying AMD (_even some of the individuals who are pissed & claim that they are walking away from AMD_).


----------



## Alatar

Quote:


> Originally Posted by *KarathKasun*
> 
> Im sure the Fury falling behind in PCars has nothing to do with the card/architecture and more to do with some other things that NV has patents for.


Project CARS is a combination of CPU overhead and AMD's cards for some reason hitting their power limits (at stock) extremely quickly leading to throttled clocks.


----------



## Casey Ryback

Quote:


> Originally Posted by *Rickles*
> 
> Disappointment in AMD at this point is in my opinion perfectly acceptable and anything else is questionable.


It's a niche product that doesn't suit many buyers, that's what I've realised.

They should've waited and brought out the whole lineup, and got everyone excited with reasonable, though not excessively low, prices.


----------



## ozlay

seems to have some driver issues


----------



## szeged

So it's going toe to toe with the reference 980 Ti. All the rumored performance leaks were about right, I guess. Oh well, I was hoping for a Titan match or beater, but hey, this isn't too bad considering the drivers used for the reviews are probably poop mcgoop.

Maybe we will see it drop in price to $599, and then it'll really shine vs the Ti.


----------



## flash2021

someone come get me when there are Win10 / DX12 benches. In about a month (for win10 rite?), we'll see the Fury X's true colors...whether good or bad (and I'm assuming AMD will have good DX12 drivers ready for day1)


----------



## Casey Ryback

Quote:


> Originally Posted by *drBlahMan*
> 
> AMD will be alright. They remind me of Nintendo... People are going to keep buying AMD (_even some of the individuals who are pissed & claim that they are walking away from AMD_).


Exactly......did you see the hitler video?


----------



## Cakewalk_S

This thread is BLOWING up today...wow...I hit F5 and bam, 4 new pages...


----------



## texni

noise test


----------



## Alatar

Quote:


> Originally Posted by *szeged*
> 
> Maybe we will see it drop in price to $599, and then it'll really shine vs the Ti.


Not without custom models from AIBs it won't.


----------



## Rickles

Quote:


> Originally Posted by *KarathKasun*
> 
> Still waiting for Nano specs.
> 
> If it's a lower-clocked/lower-volt Fury X it will be a hit for the upgrade crowd. 980 performance/power usage with a tiny board that will fit in any case.


2x the performance per watt of the 290X at 175W should put it somewhere between the 960 and 970, which is the market segment that seems to sell a lot of cards. If they can price it aggressively enough, the Nano could be their saving grace.

If it's over $350 then I think it's also a bust.


----------



## szeged

Quote:


> Originally Posted by *Alatar*
> 
> Not without custom models from AIBs it won't.


Well, that only opens it up to getting compared to custom models of the Ti then; Strix vs Strix will be an interesting one to see. Matrix and Lightning also.

Still waiting on that KitGuru review of the card... hey, wait a second...


----------



## michaelius

Quote:


> Originally Posted by *Rickles*
> 
> 2x the performance per watt of the 290X at 175W should put it somewhere between the 960 and 970, which is the market segment that seems to sell a lot of cards. If they can price it aggressively enough, the Nano could be their saving grace.
> 
> If it's over $350 then I think it's also a bust.


AMD's slide says "up to 2x perf/W"; we saw today how much that's worth.


----------



## Alatar

Quote:


> Originally Posted by *szeged*
> 
> Well, that only opens it up to getting compared to custom models of the Ti then; Strix vs Strix will be an interesting one to see. Matrix and Lightning also.
> 
> Still waiting on that KitGuru review of the card... hey, wait a second...


That's the point, there will be no custom Fury Xs because AMD is taking the Titan route with this card.


----------



## Exilon

Quote:


> Originally Posted by *Rickles*
> 
> 2x the performance per watt of the 290X at 175W should put it somewhere between the 960 and 970, which is the market segment that seems to sell a lot of cards. If they can price it aggressively enough, the Nano could be their saving grace.
> 
> If it's over $350 then I think it's also a bust.


600mm^2 die + HBM ... it doesn't matter if they sell a lot of Nanos at $350 if they make barely any money from it.


----------



## drBlahMan

Quote:


> Originally Posted by *Casey Ryback*
> 
> Exactly......did you see the hitler video?


Absolute classic







...I needed that laugh...That video was right on time


----------



## dieanotherday

You know what would have been amazing

SINGLE SLOT CARD


----------



## MojoW

Those prices








I'm seeing prices between the 750 and 850 euro.
They really need to drop the price if they want to sell.
Let's see what happens in a few months as this does not bode well for AMD.


----------



## airisom2

Wow, I thought this card was $550 (that's what it should cost, imo). I retract my statement about having mixed feelings about this card. At $650, this card is a flop.

If the 980 Ti hadn't been released, the Fury X would have had a much better reception. We would have been left speculating how close the Fury X's performance would get to the 980 Ti, instead of the 980 Ti showing you why not to buy a Fury X.


----------



## Alatar

Quote:


> Originally Posted by *dieanotherday*
> 
> You know what would have been amazing
> 
> SINGLE SLOT CARD


Could be done with AIOs. All that's needed is a unit that has the pump next to the rad instead of the common on-top-of-the-waterblock design.


----------



## xSociety

Quote:


> Originally Posted by *Achromatis*
> 
> Well I'll still buy AMD. Actually what I want is to just not buy Nvidia and support that Gameworks crap which keeps ruining games, and that leaves me with AMD
> 
> But I think I'll wait for second generation HBM.


Oh give me a break. The developers are ruining games, not Nvidia Gameworks.


----------



## dieanotherday

Omg pump noise

This is getting ridiculous


----------



## szeged

Quote:


> Originally Posted by *Alatar*
> 
> That's the point, there will be no custom Fury Xs because AMD is taking the Titan route with this card.


Oh, I didn't know that. I haven't been keeping up with the Fury X much lately. Well, that kind of sucks.

Maybe the Fury non-X will be the real player?


----------



## Newbie2009

Not impressed with the card, but might get one to play with if I see a glimmer it might be a good overclocker with volts. I fear the worst though. 1250 core with +100mV being the average, I would guess.


----------



## GamerDork

AMD failboat ahoy.


----------



## FallenFaux

Well, still no compelling reason to upgrade from 290x Crossfire, I guess I'm waiting for 14nm.

Is anyone else disappointed no one benched on Win10?


----------



## hamzta09

550W @ 4K in Tomb Raider? hory herr.


----------



## forthedisplay

Overall, pretty damn disappointing. 980 TI is still the card to go for.

2016 seems to be the year when cards will improve enough for me to justify upgrading. The R9 290 is enough for me at this point; these >700€ cards don't bring enough of an improvement @ 1440p for me to care right now, at least in the games I'm playing, and compared to the pricing of these things it was a real bargain. A new node and enough VRAM with HBM (as well as the compute power to leverage the bandwidth) might make a difference, along with 4K displays becoming more commonplace and cheaper in larger sizes.


----------



## Newbie2009

Quote:


> Originally Posted by *FallenFaux*
> 
> Well, still no compelling reason to upgrade from 290x Crossfire, I guess I'm waiting for 14nm.
> 
> Is anyone else disappointed no one benched on Win10?


Yeah, considering I have my free upgrade incoming, I would have liked to see some charts.


----------



## provost

Quote:


> Originally Posted by *azanimefan*
> 
> *this card just shows how tone deaf AMD is.*


There does seem to be a disconnect, and it may be because their attention is always divided between the CPU and GPU divisions.


----------



## Casey Ryback

Quote:


> Originally Posted by *szeged*
> 
> Still waiting on that KitGuru review of the card... hey, wait a second...


Oh damn, all the reviews out must be skewed in AMD's favor due to them being in fear of AMD's stance and only using 100% pro-AMD websites.


----------



## wholeeo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Personally, I'd much prefer the R9 290X approach: a GPU with horrible stock cooling but a cheaper price. Most people want both, but you can't have both. I think people who want to get a Fury should wait for the air-cooled version. It will be $100 cheaper, which is good enough to compete with the GTX 980 Ti in price/performance. Also, CLCs cost money; the CLC for these GPUs costs $100. The air-cooled Fury had better have 4096 SPs.


Not sure if you've seen the LTT review, but it seemed to hint that the air-cooled Fury may be a cut-down version.


----------



## bigjdubb

About as good as expected I guess. I was hoping for a little bit more but the entire lineup seems to pretty much keep pace with nvidia, just using a bit more juice. It would have been nice if it landed somewhere between 980ti and Titan, I'm sure it will eventually with driver updates and what not. The one thing that makes me keep wanting to switch from green to red is Freesync, I hate that G-Sync costs an additional $250-$300 per monitor.


----------



## ZealotKi11er

I am personally going to see how this card performs in the hands of OCN members.


----------



## 47 Knucklehead

I really feel bad for AMD ... and anyone who waited for this card.

They hyped the hell out of this card, there was no way it could live up to the hype, and now everyone sees it. It's basically tied for second place with no advantage over its competition ... not in power, not in price, nothing. OK, it is a little less noisy than a stock 980 Ti, but compare it to a 980 Ti Hybrid and it doesn't even beat that card, and the overclocking basically sucks next to the 980 Ti Hybrid ... a stinking 1150MHz up from 1050MHz. The 980 Ti Hybrid goes up to 1550MHz from a stock 1102MHz.

Oh well, I can't bear to watch the rest of the thread for a while. Too depressing.


----------



## zealord

Well, now we know why Nvidia released the 980 Ti last month. If the Titan X were the only competitor to the Fury X, then the Fury X at $649 would look like a solid card, but with the 980 Ti at $649 as well, the Fury X just isn't ... well, interesting.

I think the rumored $849 prices were actually true a couple of months ago, but with the release of the 980 Ti at $649, AMD realized they had to go as low as they can afford to with the Fury X.

I just don't see a single reason to buy a Fury X right now, and that makes me sad. I wonder how AMD plans for the next 12-15 months with this card? At $649 no one will buy it, and if they slash the price to $549 their profits will decrease


----------



## Lass3

I'm mostly disappointed by the bad AIO solution, which is *FAR TOO NOISY!*



And this is why I don't like cheap AIO solutions in general









Maybe it was the only way, since PCB is so small?


----------



## tconroy135

Quote:


> Originally Posted by *Casey Ryback*
> 
> Oh damn, all the reviews out must be skewed in AMD's favor due to them being in fear of AMD's stance and only using 100% pro-AMD websites.


Don't you find it odd that the card comes in under 980 Ti performance, but most of the reviews rate the card highly...


----------



## Recognition

I waited since November for this and I am highly disappointed. I cannot stand to wait anymore, because my 5870 is running on its last legs.

In your guys' opinion, will DX 12 and drivers make this card that much better than the 980 ti? Because if so, I will wait it out to see. If not, I will purchase a 980ti tomorrow.


----------



## GamerDork

Nvidia should drop their prices to $550 for their reference cards just to put AMD out of business. I think all the hype and waiting shows how pathetic they are and how they'll never compete. AMD lived off being the budget option, and now they aren't even doing that well.

It sucks they didn't outdo Nvidia... nothing is going to lower overall prices again because there's literally no competition from AMD again. I'm extremely disappointed with their final product and all the results.


----------



## Casey Ryback

Quote:


> Originally Posted by *GamerDork*
> 
> AMD failboat ahoy.


Where are you headed, captain? Can we stop off at the island of clickbait and have ourselves some good old bagging of AMD round a campfire with our mate at KG?


----------



## Cyclonic

Gigabyte 980 TI G1 for 710 euro
Fury X 789 euro


----------



## fantasyalive

AMD should have just kept their mouths shut and released these as the 390X and 390, pushing the rest of their stack down. Then they could have had very competitive products at their respective price points.


----------



## Captivate

I was really hoping for AMD to pull through here, but looking at all these reviews it seems that nvidia won this round, yet again. It's a good thing for my wallet, though. I'll just stick with my sli 780s until 16nm generation comes out. 28nm is oooooold and boring (although fury x/hbm made it a little bit more exciting).


----------



## Casey Ryback

Quote:


> Originally Posted by *GamerDork*
> 
> Nvidia should drop their prices to $550 for their reference cards just to put AMD out of business. I think all the hype and waiting shows how pathetic they are and how they'll never compete. AMD lived off being the budget option, and now they aren't even doing that well.
> 
> It sucks they didn't outdo Nvidia... nothing is going to lower overall prices again because there's literally no competition from AMD again.


Thing is, this isn't their budget offering; it's their gouge card, a Titan X in furious clothing.

Noobs will see it in stores and go "cool man, watercooled Fury!" Full price, no problem.


----------



## zealord

Quote:


> Originally Posted by *Recognition*
> 
> I waited since November for this and I am highly disappointed. I cannot stand to wait anymore, because my 5870 is running on its last legs.
> 
> In your guys' opinion, will DX 12 and drivers make this card that much better than the 980 ti? Because if so, I will wait it out to see. If not, I will purchase a 980ti tomorrow.


DX12 is no magical driver that makes everything better. DX12 adoption will be a slow process of games shipping with DX12 features, and at best we get a 20% performance gain if we're lucky.

Also, the PS4 has sold too many units and is the most important next-gen console. It won't support DX12 AFAIK, so I wouldn't put too much hope into DX12.

Publishers/developers don't care about game performance as much as some people would like to think.


----------



## Z-Kev

I really was hoping for this card, but AMD's offering is not good enough. I will be buying the EVGA 980 Ti Hybrid, which can be overclocked fantastically while still staying around 50 degrees.

http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks

The Fury X is just too limited and seems pointless at its current price. If it were £450 the value might sway me, but not at £540.


----------



## provost

Quote:


> Originally Posted by *zealord*
> 
> Well, now we know why Nvidia released the 980 Ti last month. If the Titan X were the only competitor to the Fury X, then the Fury X at $649 would look like a solid card, but with the 980 Ti at $649 as well, the Fury X just isn't ... well, interesting.
> 
> I think the rumored $849 prices were actually true a couple of months ago, but with the release of the 980 Ti at $649, AMD realized they had to go as low as they can afford to with the Fury X.
> 
> I just don't see a single reason to buy a Fury X right now, and that makes me sad. I wonder how AMD plans for the year to come with this card? At $649 no one will buy it, and if they slash the price to $549 their profits will decrease


AMD had the right idea, but unfortunately for AMD, Nvidia beat them to it. Good for 980 Ti buyers and consumers, as this brought Titan X performance at a lower price.
AMD will continue to face this challenge with two 800-pound gorillas, Intel and Nvidia, in their respective markets.


----------



## BigMack70

Quote:


> Originally Posted by *Recognition*
> 
> In your guys' opinion, will DX 12 and drivers make this card that much better than the 980 ti?


No.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *zealord*
> 
> I just don't see a single reason to buy a Fury X right now, and that makes me sad. I wonder how AMD plans for the next 12-15 months with this card? At $649 no one will buy it, and if they slash the price to $549 their profits will decrease


The only plans I can see are the dual Fury ... which Nvidia could easily counter with a dual 980 Ti card, so I don't see a major advantage there.

The Nano could be interesting and may be their saving grace, though it has no HDMI 2.0 support (assuming AMD doesn't learn from their mistake here on the Fury X and puts in something other than HDMI 1.4a). But honestly, given how hot the HBM and the GPU have shown to be, I wonder how far underclocked that card will have to be to actually work on one fan.


----------



## PureBlackFire

Quote:


> Originally Posted by *sugalumps*
> 
> Ye most likely; just like the 7950 and the 290, they were the best price-to-performance cards of both gens on both sides.


Just a reminder: the 7950 was not great at its launch price of $450. It was barely faster than its much leaner little sibling (the 7870) and way slower than Nvidia's $500 GTX 680 and $400 GTX 670. The R9 290 was the tits right out of the gate, dumping on the equivalently priced Nvidia GPU (GTX 770). inb4"lolzminingcrazejackdapriceup".

Hardwarecanucks video review:


----------



## Casey Ryback

Quote:


> Originally Posted by *tconroy135*
> 
> Don't you find it odd that the card comes in under 980 Ti performance, but most of the reviews rate the card highly...


I want what you're smoking









All the reviews state the pros and cons of the card. Some even go pretty far with how disappointing they are.

You serious right now or trolling?

They rate the card on its merits; it has high performance and very low temperatures. The 980 Ti rates a lot higher.............


----------



## Noufel

Why didn't AMD hire Nvidia's driver team to help them?


----------



## BigMack70

Quote:


> Originally Posted by *PureBlackFire*
> 
> Just a reminder: the 7950 was not great at its launch price of $450. It was barely faster than its much leaner little sibling (the 7870) and way slower than Nvidia's $500 GTX 680 and $400 GTX 670.


Just a reminder that the 7950 launched well before the 680 and 670.

When it launched, its only real competition was the 7970 and if you were manually overclocking, it was basically just as good as the 7970 since both could typically hit ~1200 MHz and IIRC the 7950 was only like 5% slower clock for clock than the 7970.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Casey Ryback*
> 
> Noobs will see it in stores and go cool man watercooled fury! full price no problems.


There aren't going to be too many noobs who walk into a store and drop $650 on a video card and NOT read the online reviews first. Sure, AMD can catch a few suckers that way, but because of the price point, they ain't gonna catch many.

Besides, they can just as easily see the EVGA GTX 980Ti Hybrid that is water cooled and has 50% more memory to reel in the suckers who don't read reviews.


----------



## Z-Kev

Quote:


> Originally Posted by *47 Knucklehead*
> 
> There aren't going to be too many noobs who walk into a store and drop $650 on a video card and NOT read the online reviews first. Sure, AMD can catch a few suckers that way, but because of the price point, they ain't gonna catch many.
> 
> Besides, they can just as easily see the EVGA GTX 980Ti Hybrid that is water cooled and has 50% more memory to reel in the suckers who don't read reviews.


http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks


----------



## decimator

Quote:


> Originally Posted by *SpeedyVT*
> 
> You mean McDonalds Flury X.


Excuse me, that's *Mc*Flurry X









And I guess this throws any prospect of a 6GB full GM200 chip from nVidia right out the window...Hell, you might even see the price of the 980 Ti slowly increase if AMD doesn't drop the price of the Fury X soon.

Best case scenario is better drivers and DX12 being the savior for Fiji, but no one should be buying a Fury X now and counting on that to happen...


----------



## GamerDork

Quote:


> Originally Posted by *Casey Ryback*
> 
> I want what you're smoking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All the reviews state the pros and cons of the card. Some even go pretty far with how disappointing they are.
> 
> You serious right now or trolling?
> 
> They rate the card on its merits: it has high performance and very low temperatures. The 980ti rates a lot higher.............


Of course they're trying to give AMD glowing results. AMD already denied a whole bunch of places test cards because of negative review possibilities.


----------



## sugalumps

Quote:


> Originally Posted by *texni*
> 
> noise test


So certain non-reference 980/980 Ti's are better in the noise department as well?!

Who was that one guy who argued for pages and pages that AIOs/CLCs were the future and air cooling was holding the industry back?


----------



## CasualCat

Quote:


> Originally Posted by *Casey Ryback*
> 
> Thing is this isn't their budget offering, it's their gouge card, it's a titan X in furious clothing.
> 
> Noobs will see it in stores and go cool man watercooled fury! full price no problems.


Quote:


> Originally Posted by *47 Knucklehead*
> 
> There aren't going to be too many noobs who walk into a store and drop $650 on a video card and NOT read the online reviews first. Sure, AMD can catch a few suckers that way, but because of the price point, they ain't gonna catch many.
> 
> Besides, they can just as easily see the EVGA GTX 980Ti Hybrid that is water cooled and has 50% more memory to reel in the suckers who don't read reviews.


Brick and mortars still carry video cards?


----------



## Casey Ryback

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Besides, they can just as easily see the EVGA GTX 980Ti Hybrid that is water cooled and has 50% more memory to reel in the suckers who don't read reviews.


True, but I think the HBM sticker on the box is slightly shinier than the 6GB GDDR5.

It should hold their attention long enough for them to find themselves swiping their card.

Once they've got a new gpu it's all...........







...........anyway.


----------



## zealord

Quote:


> Originally Posted by *Casey Ryback*
> 
> True, but I think the HBM sticker on the box is slightly shinier than the 6GB GDDR5.
> 
> It should hold their attention long enough for them to find themselves swiping their card.
> 
> Once they've got a new gpu it's all...........
> 
> 
> 
> 
> 
> 
> 
> ...........anyway.


From what I can see in the reviews I'd rather have 6GB GDDR5 than 4GB HBM. I expected the Fury X to rock at higher resolutions, but it doesn't really.


----------



## Casey Ryback

Quote:


> Originally Posted by *GamerDork*
> 
> Of course they're trying to give AMD glowing results. AMD already denied a whole bunch of places test cards because of negative review possibilities.


So you're denying that it has high performance and low temperatures?

AMD only denied samples to a very small number of rubbish websites.

It got rated lower than any Nvidia card I've seen released for some time.

How, again, are the reviews favouring it?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Casey Ryback*
> 
> True, but I think the HBM sticker on the box is slightly shinier than the 6GB GDDR5.
> 
> It should hold their attention long enough for them to find themselves swiping their card.
> 
> Once they've got a new gpu it's all...........
> 
> 
> 
> 
> 
> 
> 
> ...........anyway.


No one buying a GPU in a brick and mortar store on a whim has any idea what HBM is or why they should care. Go to the Tom's Hardware forums (the de facto hub for the casual/uninformed PC builder) and you'll find a bunch of people who think it's just faster GDDR5.


----------



## Casey Ryback

Quote:


> Originally Posted by *zealord*
> 
> From what I can see in the reviews I'd rather have 6GB GDDR5 than 4GB HBM. I expected the Fury X to rock at higher resolutions, but it doesn't really.


I was talking about fools though, random fools that don't know jack.


----------



## keikei

How much faster is the Fury X than the R9 290X?


----------



## Z-Kev

Quote:


> Originally Posted by *Casey Ryback*
> 
> So you're denying that it has high performance and low temperatures?
> 
> AMD denied a very small amount of rubbish websites.
> 
> It got rated under any nvidia card I've seen released for some time.
> 
> How again are the reviews favouring it?


not quite, the evga 980ti hybrid runs at under 50 degrees even when overclocked


----------



## szeged

the noise test sounds like sharpening an axe on my bench grinder when it's set to low.

Yeah, it's obviously not loud, but the tone it's putting out is just annoying as all hell. Seriously, is that the pump making that awful grinding squeal?


----------



## zealord

Quote:


> Originally Posted by *keikei*
> 
> How much faster is the Fury X than the R9 290X?


40-50% at 4K
40% at 1080p.


----------



## criminal

Quote:


> Originally Posted by *szeged*
> 
> the noise test sounds like sharpening an axe on my bench grinder when it's set to low.
> 
> Yeah, it's obviously not loud, but the tone it's putting out is just annoying as all hell. Seriously, is that the pump making that awful grinding squeal?


Yeah, it is the pump. Supposedly all review samples have that issue. The retail versions are supposed not to have it, though. That would drive me nuts!


----------



## dieanotherday

Quote:


> Originally Posted by *Lass3*
> 
> I'm mostly disappointed by the bad AIO solution, which is *FAR TOO NOISY!*
> 
> 
> 
> And this is why I don't like cheap AIO solutions in general
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe it was the only way, since PCB is so small?


It's the pump noise that's ridiculous


----------



## BoredErica

Quote:


> Originally Posted by *sugalumps*
> 
> So certain non-reference 980/980 Ti's are better in the noise department as well?!
> 
> Who was that one guy who argued for pages and pages that AIOs/CLCs were the future and air cooling was holding the industry back?


Big Stroonz.


----------



## maltamonk

Quote:


> Originally Posted by *tconroy135*
> 
> Don't you find it odd that the card comes in at under 980ti performance, but most of the reviews highly rate the card...


Not in the slightest. I would be much more concerned if they claimed doom and gloom, like a lot of posters here do, when realistically it's comparable. The 980ti was rated highly. This card is comparable to the 980ti, thus also highly rated. Wouldn't really make sense otherwise. But ofc sense does fly out the window in most of these conversations here.


----------



## SuprUsrStan

I don't know if this was posted but I had quite a chuckle when I saw this.


----------



## Casey Ryback

Quote:


> Originally Posted by *Z-Kev*
> 
> not quite, the evga 980ti hybrid runs at under 50 degrees even when overclocked


I don't see how that's related to the questions I asked, that you quoted.

The Fury is similar: both have an AIO, the 980ti performs slightly better across resolutions, and it probably got higher marks when reviewed, as all 980ti's did lol.

Has no bearing on review sites being in AMD's pocket.


----------



## criminal

Quote:


> Originally Posted by *Syan48306*
> 
> I don't know if this was posted but *I had quite a chuckle* when I saw this.


Me too.


----------



## Smanci

Quote:


> Originally Posted by *criminal*
> 
> Yeah, it is the pump. Supposedly all review samples have that issue. The retail versions are supposed not to have it, though. That would drive me nuts!


That squeal and other sounds are a Cooler Master Seidon AIO trademark.


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> 20W whilst running crysis 3.
> 
> 408W vs 388W. please call the power company and inform them it could affect your neighborhood.
> 
> sorry.....all the trolls have rubbed off on me.


Now you're trolling too. 550W in Tomb Raider 4K? TINY TOM LOGAN states it in his Video.


----------



## szeged

Quote:


> Originally Posted by *criminal*
> 
> Yeah, it is the pump. Supposedly all review samples have that issue. The retail versions are supposed not to have it, though. That would drive me nuts!


i seriously hope no one that buys one from retail has to put up with that noise, id get a refund for it as soon as i turned it on and it started doing that.


----------



## criminal

Quote:


> Originally Posted by *Smanci*
> 
> That squeal and other sounds are a Cooler Master Seidon AIO trademark.


Really? Well that blows. Not that I am interested anymore anyway.

Now I want to see what Fury is all about. A Lightning Fury card would be really nice.








Quote:


> Originally Posted by *szeged*
> 
> i seriously hope no one that buys one from retail has to put up with that noise, id get a refund for it as soon as i turned it on and it started doing that.


I am almost hoping the first few will have this issue. Then I could buy one off Amazon to play with and then have a reason to get a refund!







I kid, I kid, but I do see that happening.


----------



## sugalumps

Quote:


> Originally Posted by *szeged*
> 
> i seriously hope no one that buys one from retail has to put up with that noise, id get a refund for it as soon as i turned it on and it started doing that.


The pump noise happens on most AIOs; it's just that most people don't hear it because of the insanely bad fans (Corsair fans) that come with them. As soon as I changed my fans out for Noctuas and turned them down low, you could hear the pump. I went air cooling and will never go back unless it's a full custom loop.

CLCs/AIOs are such a niche product: you get worse noise-to-temperature performance at a higher cost than a big air cooler.


----------



## Z-Kev

Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't see how that's related to the questions I asked, that you quoted.
> 
> The Fury is similar: both have an AIO, the 980ti performs slightly better across resolutions, and it probably got higher marks when reviewed, as all 980ti's did lol.
> 
> Has no bearing on review sites being in AMD's pocket.


sorry, I thought you were saying it ran cooler than anything nvidia has released.......................


----------



## G woodlogger

Fast thread - can't we call a truce like in the First World War and eat?


----------



## hollowtek

People that waited must be... Furyous.


----------



## Apexii22

I think we should wait until DX12. AMD said DX12 is a game changer for their products. Fury X might pull ahead of Titan X when DX12 is available, who knows?

Hopefully, it is time we see AMD on top for once.


----------



## Casey Ryback

Quote:


> Originally Posted by *maltamonk*
> 
> Not in the slightest. I would be much more concerned if they claimed doom and gloom, like a lot of posters here do, when realistically it's comparable. The 980ti was rated highly. This card is comparable to the 980ti, thus also highly rated. Wouldn't really make sense otherwise. But ofc sense does fly out the window in most of these conversations here.


Welcome to OCN, where everything is a conspiracy or a failure; it's too negative atm.

Raise that shield, you're going to need it, friend.


----------



## szeged

Quote:


> Originally Posted by *sugalumps*
> 
> The pump noise happens on most AIOs; it's just that most people don't hear it because of the insanely bad fans (Corsair fans) that come with them. As soon as I changed my fans out for Noctuas and turned them down low, you could hear the pump. I went air cooling and will never go back unless it's a full custom loop.
> 
> CLCs/AIOs are such a niche product: you get worse noise-to-temperature performance at a higher cost than a big air cooler.


ive only used one AIO, a h100i, i took off the garbage can stock fans it came with and replaced them with some gentle typhoons, pump was nearly silent (had to put my head next to it to hear it). are other AIOs really that bad?


----------



## provost

Quote:


> Originally Posted by *maltamonk*
> 
> Not in the slightest. I would be much more concerned if they claimed doom and gloom, like a lot of posters here do, when realistically it's comparable. The 980ti was rated highly. This card is comparable to the 980ti, thus also highly rated. Wouldn't really make sense otherwise. But ofc sense does fly out the window in most of these conversations here.


I understand what you are saying, but if this card is targeted at the "enthusiast"/high end, comparable is not good enough, especially if the comp is late to the party.

Maybe Serandur was on to something when he started expressing his disappointment at AMD "holding back"... not sure if accurate, but if I were AMD, I would have gone all out, balls to the wall.

That's how you get people's attention with a flagship enthusiast card, especially considering the context of the competition being Nvidia here.... lol


----------



## PureBlackFire

Quote:


> Originally Posted by *BigMack70*
> 
> Just a reminder that the 7950 launched well before the 680 and 670.
> 
> When it launched, its only real competition was the 7970 and if you were manually overclocking, it was basically just as good as the 7970 since both could typically hit ~1200 MHz and IIRC the 7950 was only like 5% slower clock for clock than the 7970.


a month and 10 days before the GTX 680 and 3 months before the 670. kind of like how the Titan X and 980ti had the same lead on the Fury X, but the later the launch, the slower the card in this case. and the 7950 didn't OC to 1200MHz on average. if you can remember back then, it was the 7970 that OC'd much better, and the 7950's that used either a fully custom pcb or a 7970 pcb were the ones that OC'd well, and those were pretty few. the 7950 became a sweet deal 7-12 months after its launch, after they started coming out with better cards and Nvidia's 670 had forced a couple price drops.


----------



## MapRef41N93W

Quote:


> Originally Posted by *szeged*
> 
> i seriously hope no one that buys one from retail has to put up with that noise, id get a refund for it as soon as i turned it on and it started doing that.


I never used to hear the pump noise till I built my custom loop. Now when I turn on the old rig that AIO pump is gyrating. Such a night and day difference when you have a 30w D5 pump that is literally dead silent vs those weak and loud Asetek pumps.


----------



## ZealotKi11er

In other news, the PC gaming scene is terribad right now, so there's no point in buying a GPU.


----------



## sugalumps

Quote:


> Originally Posted by *szeged*
> 
> ive only used one AIO, a h100i, i took off the garbage can stock fans it came with and replaced them with some gentle typhoons, pump was nearly silent (had to put my head next to it to hear it). are other AIOs really that bad?


It was the h100i that was the one I had the noise problem with, two of them I tried ahah.

Oh well, I must have just gotten real unlucky then.


----------



## SuprUsrStan

TTL says the whole setup was silent...

That's a stark difference from what others are reporting. Perhaps the production version is good?


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> Now you're trolling too. 550W in Tomb Raider 4K? TINY TOM LOGAN states it in his Video.


No trolling, anandtech results.

You really believe a system with a 275W TDP card and a stated power limit of 375W would be drawing 550W during a game...........

come on.............please.


----------



## SpeedyVT

I still think we know little about its genuine performance with just a release driver. For all we know, the bandwidth may not be patched yet. Its size vs the Titan X is great; not sure how it compares to the 980 Ti.


----------



## keikei

Quote:


> Originally Posted by *zealord*
> 
> 40-50% at 4K
> 40% at 1080p.


Aint too shabby.


----------



## Smanci

Quote:


> Originally Posted by *criminal*
> 
> Really? Well that blows. Not that I am interested anymore anyway.


https://www.youtube.com/watch?v=TjSClT2w79o


----------



## tconroy135

Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't see how that's related to the questions I asked, that you quoted.
> 
> The Fury is similar: both have an AIO, the 980ti performs slightly better across resolutions, and it probably got higher marks when reviewed, as all 980ti's did lol.
> 
> Has no bearing on review sites being in AMD's pocket.


I wouldn't say review sites are in AMD's pocket; it's more like review sites are in both AMD's and NVIDIA's pockets. Rarely can you go to the conclusion section of a review and get any real information. It is unfortunate that you have to dig through benchmarks, overclocking, etc. to find even a glimpse of the truth about a product.

That being said, AMD fanboys are being a little bit absurd calling the Fury X a good card because it is "comparable" to the 980ti. If the 980 Ti came with a factory water cooler it would obliterate the Fury X. And while you can argue that, since they are the same price, the Fury coming stock with a water cooler doesn't mean anything; the truth is it doesn't overclock well, and that fact makes the Fury X not really even comparable to the 980 Ti.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *CasualCat*
> 
> Brick and mortars still carry video cards?


Sure. Just look at all the photos from Best Buy when they were selling the 300-series (aka the 200-series rebrands) before they were SUPPOSED to.


















Keep hunting AMD ... keep hunting.


----------



## Mygaffer

Wow, looks like a really nice card. Small, quiet, runs really cool for a monster sized chip, and performance is neck and neck with the 980ti.

If I hadn't already purchased my GTX 980 when it came out, I would certainly have bought a Fury X.


----------



## Forceman

Quote:


> Originally Posted by *Apexii22*
> 
> I think we should wait until DX12. AMD said DX12 is a game changer for their products. Fury X might pull ahead of Titan X when DX12 is available, who knows?
> 
> Hopefully, it is time we see AMD on top for once.


The problem is that is always the AMD refrain. Wait for TruAudio, wait for Mantle, wait for Freesync, wait for the Omega drivers, wait for Crossfire Freesync, wait for Fury, wait for DX12. At some point you just need to start judging what is here and now.


----------



## PureBlackFire

Quote:


> Originally Posted by *ZealotKi11er*
> 
> In other news, PC gaming scene is terribad right now so no point on buying a GPU.


true. every time there's a console launch you may as well take a two-to-three-year break from pc gaming. literally every cycle since 2001.


----------



## SuprUsrStan

I can't keep up with this thread


----------



## szeged

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I never used to hear the pump noise till I built my custom loop. Now when I turn on the old rig that AIO pump is gyrating. Such a night and day difference when you have a 30w D5 pump that is literally dead silent vs those weak and loud Asetek pumps.


idk if i got a really good sample or what (or the pump is dead and i never checked lol) but my h100i pump is pretty quiet.

but youre right, my d5 is still quieter.


----------



## Phaster89

Quote:


> Originally Posted by *Apexii22*
> 
> I think we should wait until DX12. AMD said DX12 is a game changer for their products. Fury X might pull ahead of Titan X when DX12 is available, who knows?
> 
> Hopefully, it is time we see AMD on top for once.


well from this http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/6 i don't see a bright future for amd


----------



## BoredErica

Quote:


> Originally Posted by *szeged*
> 
> i seriously hope no one that buys one from retail has to put up with that noise, id get a refund for it as soon as i turned it on and it started doing that.


I think the video could have exaggerated the problem. Ryan Shrout said it wasn't really that bad.


----------



## HeadlessKnight

Quote:


> Originally Posted by *47 Knucklehead*
> 
> The 980Ti Hybrid goes up to 1550MHz from a stock 1102MHz.


Oh my god, this is getting old. No, it does not. Considering you have a GTX 980 and owned 780s before, you should know that ever since Kepler, Nvidia cards boost way above their marketed clocks; the days where Nvidia cards ran at their marketed clocks died with Fermi. The GTX 980 Ti boosts way above 1102 MHz at stock, 1200+ MHz easily, and highly clocked models easily boost into the 1300s range. I doubt you don't know this simple fact; you lied intentionally because of your AMD hate. You should be ashamed.


----------



## BigMack70

Quote:


> Originally Posted by *PureBlackFire*
> 
> a month and 10 days before the GTX 680 and 3 months before the 670. kind of like how the Titan X and 980ti had the same lead on the Fury X, but the later the launch, the slower the card in this case. and the 7950 didn't OC to 1200MHz on average. if you can remember back then, it was the 7970 that OC'd much better, and the 7950's that used either a fully custom pcb or a 7970 pcb were the ones that OC'd well, and those were pretty few. the 7950 became a sweet deal 7-12 months after its launch, after they started coming out with better cards and Nvidia's 670 had forced a couple price drops.


Fair enough, though I would debate the idea that the slower card came first in either case... the 7970 was always a better card overall than the 680, unless you had some weird complex where you liked running your 7970 at 925 MHz. I'd say that in both cases, the better card released first.


----------



## Blameless

I'm in near complete agreement with HardOCP's conclusions, yet Fury isn't really a disappointment in my eyes. It's not falling far short of what I expected. It has its upsides and downsides. Some things are still unknown. I wouldn't recommend it over a 980Ti at this point, but something as small as a $50 price advantage or demonstrable proof of solid OCing after voltage control is unlocked could change that.

Maybe my expectations were simply reasonable ones...
Quote:


> Originally Posted by *lacrossewacker*
> 
> God help the air cooled version released later....


The air cooled version should be ~100 dollars less and only slightly slower, making it the most appealing of the Fury SKUs for many.
Quote:


> Originally Posted by *raghu78*
> 
> With 45% more shaders and the same clocks thats miserable scaling.


Shaders aren't everything.
Quote:


> Originally Posted by *airisom2*
> 
> And about the 104C VRMs, I'm pretty sure that card wasn't screwed together well .
> 
> ...
> 
> As G3D didn't remove the backplate, focus on the 8-pin connector area. On Hardware.fr's thermal image shot, they're hot, and on G3D's, they're far from it. There is no reason for VRMs that are actively cooled by an AIO unit with a copper tube going over them to get 100C+ on load unless there is little to no contact.


I hope that 104C is an isolated case of a defective or badly (re)assembled sample, but I'd definitely like more tests to confirm.
Quote:


> Originally Posted by *Forceman*
> 
> Well, if that little cool pipe isn't making good contact with the VRMs that could explain the high temps in a way that wouldn't affect the air cooled cards. There is no airflow at all in there, so bad contact with the water pipe would be a sort-of-disaster. Hopefully the larger heatsink/cold-plate of the air cooled cards would keep the VRMs cooler.


Yeah, I made the point earlier that it could be a broken solder/TIM joint, or loose screws.
Quote:


> Originally Posted by *Alatar*
> 
> I don't get why everyone thinks the nano will be priced really well. I see it as a niche card that gets the best binned PRO GPUs or something. I'd expect to see it priced higher or on par with the normal Fury.


I doubt it will get any special binning or cost as much as the Fury. Personally, I think it's just going to have more SMs disabled and be clocked significantly lower.
Quote:


> Originally Posted by *Alatar*
> 
> The overall expectations due to the big die size and HBM advantage


The idea that HBM in and of itself would automatically translate into a huge advantage was a profoundly flawed assumption that has been repeatedly perpetuated. HBM exists to lower total board power and provide better memory bandwidth; it doesn't magically do anything else...yet people insisted on regarding it as a performance game changer.
Quote:


> Originally Posted by *raghu78*
> 
> actually the way its going AMD would rather not sell than sell below cost. That 596 sq mm die with 4 GB HBM and that AIO cooler is not going to be cheap to manufacture.


Doesn't cost anywhere near MSRP to manufacture. Even half MSRP would be surprising.

Margin is probably pretty good at $650, and sacrificing some of that to maintain or grow market share may be wise...unless they can resolve some nagging issues with a driver and soon.
Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't follow.............I don't really see where they have lied.


They haven't.
Quote:


> Originally Posted by *texni*
> 
> can someone explain to me how nvidia cards can compete despite having less "cores"?


Higher clock speeds, 50% more ROPs, other miscellaneous architectural differences, mature drivers...


----------



## keikei

Why no dual release to include the Fury Pro?


----------



## FallenFaux

Quote:


> Originally Posted by *Noufel*
> 
> Why didnt AMD hired nvidia driver team to help them


Because they didn't want to get 10 minutes into a benchmark and crash to desktop.


----------



## zealord

Quote:


> Originally Posted by *ZealotKi11er*
> 
> In other news, PC gaming scene is terribad right now so no point on buying a GPU.


Sadly this is a good post








Quote:


> Originally Posted by *keikei*
> 
> Aint too shabby.


Actually, very shabby. The 290X overclocks better, and I compared both reference designs. The Fury X is only like 30-35% faster than a 290X that comes with an aftermarket cooler and a slight overclock and can hold its clock.


----------



## XxOsurfer3xX

I don't really know what to do anymore. Fury is definitely an improvement over 980, but I can't help but feel they are trying to milk us for everything we've got. I was completely set on buying fury or Ti, but not anymore. I may wait for 14 nm....


----------



## Casey Ryback

Quote:


> Originally Posted by *tconroy135*
> 
> I wouldn't say review sites are in AMD's pocket; it's more like review sites are in both AMD's and NVIDIA's pockets. Rarely can you go to the conclusion section of a review and get any real information. It is unfortunate that you have to dig through benchmarks, overclocking, etc. to find even a glimpse of the truth about a product.
> 
> That being said, AMD fanboys are being a little bit absurd calling the Fury X a good card because it is "comparable" to the 980ti. If the 980 Ti came with a factory water cooler it would obliterate the Fury X. And while you can argue that, since they are the same price, the Fury coming stock with a water cooler doesn't mean anything; the truth is it doesn't overclock well, and that fact makes the Fury X not really even comparable to the 980 Ti.


Have you even read the Fury reviews? Some are very harsh in the conclusion......

I think you've just got this idea in your head....which may or may not be true, and are assuming this is the way it is for Fury reviews.

Why are almost all sites linear in their results, if each one is in one or the other's pockets?

The Fury X is not a great card, but it's not a terrible card either.

I dislike the whole package....AIO etc. No way am I buying one.

Like you say, the OC potential probably won't be there, regardless of unlocked voltage.

Hence why I don't like the overall package.

The air cooled Fury and the Fury Nano could still be promising, and it's all about price.

I'd prefer a bang-for-buck 4GB Fury over an expensive 980ti, as would many others.


----------



## sugalumps

Quote:


> Originally Posted by *PureBlackFire*
> 
> true. every time there's a console launch you may as well take a two-to-three-year break from pc gaming. literally every cycle since 2001.


True, but the fps and res are worth it alone imo.

Going between Bloodborne at 1080p 30fps and Dark Souls 2: Scholar at 1440p 60fps, the difference is huge imo. I will tolerate 30fps if the game is worth it (Bloodborne), but the experience is nowhere near as good as it could have been. I really hate how sluggish 30fps feels; I am not being a snob, it honestly detracts from the game at times. My wife cares not about things like that, and even she pointed it out while watching me play Bloodborne: "why is it so choppy?", because she is used to watching me play pc games.


----------



## CasualCat

Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> I don't really know what to do anymore. Fury is definitely an improvement over 980, but I can't help but feel they are trying to milk us for everything we've got. I was completely set on buying fury or Ti, but not anymore. I may wait for 14 nm....


Honestly if you were ever considering the FIJI/GM200 you never should have bought the 980 unless you intended to do an incremental upgrade from the start, but it sounds like you didn't.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Forceman*
> 
> The problem is that is always the AMD refrain. Wait for TruAudio, wait for Mantle, wait for Freesync, wait for the Omega drivers, wait for Crossfire Freesync, wait for Fury, wait for DX12. At some point you just need to start judging what is here and now.


EXACTLY!









+Rep


----------



## Kuivamaa

Quote:


> Originally Posted by *SKYMTL*
> 
> Tell me again that DX11 is pointless when DX11 features BETTER performance in BF4 than Mantle....we experienced that and so did TechReport.


It was certainly the case with previous cards. I don't know why it doesn't apply to the Fury X; whether HBM is at fault, or whether you got a false positive from the SP campaign. In any case, it's something that needs to be tested, and from a quick look at the reviews only TR did it.


----------



## SuprUsrStan

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *47 Knucklehead*
> 
> They 980Ti Hybrid goes up to 1550MHz up from stock 1102MHz.
> 
> 
> 
> Oh my god, this is getting old. No, it is not. Considering you have a GTX 980 and owned 780s before, you should know that ever since Kepler, Nvidia cards boost way above their marketed clocks; the days where Nvidia cards ran at their marketed clocks died with Fermi. The GTX 980 Ti boosts way above 1102 MHz at stock, easily 1200+ MHz, and highly clocked models easily boost into the 1300s. I doubt you don't know this easy fact, and I suspect you lied intentionally because of your AMD hate. You should be ashamed.
Click to expand...

I can overclock my 980Ti up to 1520Mhz on water.

On air Anandtech got up to 1477.


----------



## Gilles3000

Quote:


> Originally Posted by *Darkwizzie*
> 
> I think the video could have exaggerated the problem. Ryan Shrout said it wasn't really that bad.


Some samples seem to have had worse whine than others.

TTL claimed his was nearly silent and didn't mention anything about a whine.
PCPer, like you said, claimed it wasn't terrible but needs some work.
Hardware Canucks had absolutely horrible whine.


----------



## PureBlackFire

Quote:


> Originally Posted by *BigMack70*
> 
> Fair enough, though I would debate the idea that the slower card came first in either case... the 7970 was always a better card overall than the 680, unless you had some weird complex where you liked running your 7970 at 925 MHz. I'd say that in both cases, the better card released first.


oh no, I wasn't calling the 680 better by any stretch. I was a 7950 owner early on, watching the card get destroyed in stock benches by nvidia cards with a 300MHz clock speed advantage. I was just saying that the 7950 wasn't a great card at launch. even when overclocked, it hit a wall much lower than the 7970; the 7970 easily kept a 100MHz clock speed ceiling over the 7950 until better 7950s came out that could clock as well. to nvidia's credit, their later releases were more efficient and at least faster at stock. AMD's late Fury X release comes up short on too many fronts.


----------



## GamerDork

Why are people defending the card, saying it's 'not that bad'? Nobody wanted a wannabe 980 Ti... we expected something to take the lead and show us what this new technology and the wait were for. AMD did nothing to help us as consumers. If anything this will keep 980 Ti prices higher, or even increase them, because there's no reason not to at this point.


----------



## maltamonk

Quote:


> Originally Posted by *tconroy135*
> 
> I wouldn't say review sites are in AMD's pocket; it's more like review sites are in both AMD and NVIDIA's pockets. Rarely can you go to the conclusion section of a review and get any real information. It is unfortunate that you have to dig through benchmarks, overclocking, etc. to find even a glimpse of the truth about a product.
> 
> That being said, AMD fanboys are being a little bit absurd calling the Fury X a good card because it is "comparable" to the 980 Ti. If the 980 Ti came with a factory water cooler it would obliterate the Fury X. And while you can argue that since they are the same price the Fury coming stock with a water cooler doesn't mean anything, the truth is it doesn't overclock well, and that fact makes the Fury X not really even comparable to the 980 Ti.


How in the world is that absurd? You're honestly saying that they are not comparable cards? Where do you draw that line? Somehow I don't think calling the two comparable makes me an amd fanboy. That seems a bit absurd.


----------



## ZealotKi11er

One thing that is more apparent with AMD GPUs is that as performance goes up, DX11 overhead shows up more often. I really want to see this card in DX12 vs the GTX 980 Ti, since the CPU will not be a factor.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Oh my god this is getting old. No. It is not.


Have you seen the overclocking reviews of the 980 Ti Hybrid? No, I doubt you have. So please, don't tell me I am wrong.

So what's your next move ... whine that some people get to 1550, some get to 1500, some only get to 1480? No matter, that is a hell of a lot better than going from 1050 to 1150 ... a whopping 10% from the Fury X.

13-15% > 10%

And no, it's not "the new math".


----------



## P.J




----------



## BigMack70

Quote:


> Originally Posted by *Syan48306*
> 
> I can overclock my 980Ti up to 1520Mhz on water.


I think the point of contention was the stated "stock" clock, which was quoted misleadingly so as to make the overclock sound larger than it actually is. For example, my Titan X cards are currently running at 1460 MHz on the core, which is a whopping 36% overclock compared to the "stock" clock of 1075 MHz. However, in reality a Titan X runs at about ~1175 MHz in a fully stock setup, so it's actually only a 24% overclock.
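The arithmetic here generalizes: an overclock percentage depends entirely on which baseline you measure against. A minimal sketch of the calculation (the 1075/1175/1460 MHz figures are the Titan X numbers from this post, used purely as an illustration, not measurements of any particular card):

```python
# Compare an overclock measured against the marketed "stock" clock
# vs. the clock the card actually sustains out of the box.

def oc_percent(achieved_mhz: float, baseline_mhz: float) -> float:
    """Overclock expressed as a percentage gain over a baseline clock."""
    return (achieved_mhz / baseline_mhz - 1.0) * 100.0

marketed_boost = 1075  # MHz, advertised boost clock
typical_boost = 1175   # MHz, what the card actually runs at stock
overclocked = 1460     # MHz, achieved overclock

print(f"vs marketed clock:   {oc_percent(overclocked, marketed_boost):.0f}%")  # 36%
print(f"vs real boost clock: {oc_percent(overclocked, typical_boost):.0f}%")   # 24%
```

Same achieved clock, two very different headline numbers, which is exactly the point being made about the 980 Ti Hybrid figures.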


----------



## Slaughterem

Quote:


> Originally Posted by *Ganf*
> 
> It's depressing to think that Zen won't pull their chestnuts out of the fire on the processor side also. Everyone keeps harping on the idea of Samsung buying AMD, I'd rather IBM bought them and jumped back into the consumer market. IBM knows how to get raw power out of hardware architectures, and AMD can simply provide them with the licensing to do whatever they want. Samsung is too infatuated with the mobile side of things and will push AMD to focus on that, whereas IBM knows it doesn't have a chance in hell in mobile with it's power hungry architectures and will be going for Intel and Nvidia's throats. Especially Intel's, since they'll be able to offer two architecture's for servers.


It won't be any of those you mention. It will be Qualcomm http://techfrag.com/2015/06/23/qualcomm-may-acquire-amd/


----------



## tconroy135

Quote:


> Originally Posted by *maltamonk*
> 
> How in the world is that absurd? You're honestly saying that they are not comparable cards? Where do you draw that line? Somehow I don't think calling the two comparable makes me an amd fanboy. That seems a bit absurd.


The Fury X draws more power, runs hotter, and has much lower overclocking ability, while being beaten in almost every benchmark by the 980 Ti. And before you argue about hotter, I mean every area of the card other than the core.

The Fury X is basically a factory OC card that AMD is selling as stock because they want to compete with nVidia.


----------



## XxOsurfer3xX

Quote:


> Originally Posted by *CasualCat*
> 
> Honestly if you were ever considering the FIJI/GM200 you never should have bought the 980 unless you intended to do an incremental upgrade from the start, but it sounds like you didn't.


I got a new 4K monitor and my 680 SLI was a horrible experience, so I bought the 980 for a decent experience. But it's not enough, and I don't want to go SLI.


----------



## harney

Anybody know where to buy a 980 Ti G1 in the EU for a fair price?


----------



## Casey Ryback

Quote:


> Originally Posted by *GamerDork*
> 
> Why are people defending the card saying it's 'not that bad'.


probably because....it's not that bad?


----------



## gamervivek

The saving grace for AMD would be if they could optimize their drivers for a GPU setup quite different from Hawaii, and have that done by the time they get an 8GB version out (yes, that too). Not really a 980 Ti's equal.


----------



## sugalumps

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *P.J*






Look how low maxwell is on that list.

Maxwell


----------



## zealord

Quote:


> Originally Posted by *harney*
> 
> Anybody know where to buy a 980ti G1 in the EU for a fair price


Edit : my bad. you said G1.


----------



## MerkageTurk

After this fiasco, AMD have turned ATI into the same rubbish as their CPUs.

Sorry, but benchmarks speak for themselves.

They are using the same tactics and same foundation as their CPUs.

Samsung, please buy them. This is really not helping the consumer, and now nVidia can raise their prices.


----------



## CasualCat

Quote:


> Originally Posted by *XxOsurfer3xX*
> 
> I got a new 4K monitor and my 680 SLI was a horrible experience, so I bought the 980 for a decent experience. But its not enough and I don't want to go SLI.


Well, I don't blame you for not wanting to rely on SLI; it does sound like your situation/setup would require an upgrade to get the most out of your monitor, though.

Still, I don't see that as Nvidia/AMD milking people.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> The problem is that is always the AMD refrain.


Not even AMD. Just a certain segment of AMD fans.
Quote:


> Originally Posted by *GamerDork*
> 
> Why are people defending the card saying it's 'not that bad'.


Because it's not that bad.
Quote:


> Originally Posted by *GamerDork*
> 
> we expected something to take the lead and show us what this new technology and wait was for.


Expecting the Fury to be dramatically better than the 980Ti was never a reasonable expectation.
Quote:


> Originally Posted by *GamerDork*
> 
> It's a pre-overclocked piece of crap outpaced by Nvidia... as always, so I shouldn't be surprised I guess.


Sounds like your "always" doesn't stretch back more than a generation.


----------



## Ashura

Quote:


> Originally Posted by *Blameless*
> 
> The air cooled version should be ~100 dollars less and only slightly slower, making it the most appealing of the Fury SKUs for many.


Yes, I hope it overclocks well too. A fully custom watercooled Fury might be an option.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *sugalumps*
> 
> 
> Look how low maxwell is on that list.
> 
> Maxwell


Yup, 66W for a Fury X vs 14W for a 980Ti.


----------



## szeged

its definitely not a bad card.

its also not the uber amazing titan killing diamond encrusted solid platinum card hand crafted by god that everyone was trying to make it out to be.

also...

"an overclockers dream"

lol.


----------



## GamerDork

Quote:


> Originally Posted by *Blameless*
> 
> Not even AMD. Just a certain segment of AMD fans.
> Because it's not that bad.
> Expecting the Fury to be dramatically better than the 980Ti was never a reasonable expectation.


Haha, okay. Once again AMD is the 'lowered expectations' crowd. There is ZERO reason to buy this epic pile.


----------



## sugalumps

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Yup, 66W for a Fury X vs 14W for a 980Ti.


Oh man, tempted to sell my 980 and spare 780 and grab the Zotac AMP Ti; that's the cheapest of the aftermarket Tis here in the UK. Looks really nice as well!

Such a sucka, nvidia must love me









Though after all this my 980 is still looking like a great card; I could always look out for a cheap second-hand 980 for SLI.


----------



## criminal

Quote:


> Originally Posted by *sugalumps*
> 
> 
> Look how low maxwell is on that list.
> 
> Maxwell


I will admit, I love my 980 right now. And Fury X turning out like this saves me from looking to upgrade.

Quote:


> Originally Posted by *MerkageTurk*
> 
> After this fiasco AMD have turned ATI into their rubbish CPUS
> 
> Sorry but benchmarks speak for themselves
> 
> They are using the same tactics and same foundation as their CPUS
> 
> Samsung please buy them, this is really not helping the consumer and now nVidia can bring up the prices


I really wish AMD hadn't bought ATI. ATI really did compete better.


----------



## texni

Quote:


> Originally Posted by *harney*
> 
> Anybody know where to buy a 980ti G1 in the EU for a fair price


You can find the Inno3D 980 Ti, which is just as good or even better, for a "decent" EU price here:

http://www.ldlc.com/fiche/PB00189154.html

reviews:
http://www.computerbase.de/2015-06/geforce-gtx-980-ti-custom-test-partnerkarten-vergleich/5/

http://nl.hardware.info/reviews/6124/inno3d-geforce-gtx-980-ti-airboss-x3-ultra-review?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=hardwareinfo


----------



## 47 Knucklehead

Quote:


> Originally Posted by *szeged*
> 
> its definitely not a bad card.
> 
> its also not the uber amazing titan killing diamond encrusted solid platinum card hand crafted by god that everyone was trying to make it out to be.
> 
> also...
> 
> "an overclockers dream"
> 
> lol.


This.

It's a good card ... it has issues for non-gaming reasons ... but it's a good card.

But it isn't the enthusiast's wet dream it was hyped as. It is basically AMD's answer to the GTX 980 Ti.

To be honest, AMD needed this card, otherwise they would have been TOTALLY hosed ... since the rest of their 300-series line is just a 200-series rebrand.

As HardOCP said, it's just a test platform for HBM Gen 1 memory.


----------



## xundeadgenesisx

I think this card is going to be a worthy successor to my 290X.


----------



## HeadlessKnight

Quote:


> Originally Posted by *BigMack70*
> 
> I think the point of contention was the stated "stock" clock which was stated misleadingly so as to make the overclock sound larger than it actually is. For example, my Titan X cards are currently running at 1460 MHz on the core, which is a whopping 36% overclock compared to the "stock" clock of 1075. However, in reality, a Titan X runs at about ~1175 MHz in a fully stock setup, so it's actually only a 24% overclock.


Well said. Respect to this guy. Some people just can't use their brains. It doesn't matter whether it is 1550 or 1600 MHz, but he intentionally said it overclocks to 1550 MHz from 1100 MHz just to put AMD in an even worse light than it already is. That's a 41% OC, while measured from the real clock it is only 20% or 25%.


----------



## szeged

i do applaud amd for taking the plunge on hbm 1 instead of holding it back to use against pascal and just going with gddr5 for fury


----------



## Nightbird

Time to wait for 14nm! It's great that both sides managed to squeeze out so much from the current process, but the double node shrink is where the 4K single GPU support is going to be at.


----------



## zealord

Quote:


> Originally Posted by *szeged*
> 
> i do applaud amd for taking the plunge on hbm 1 instead of pushing it back to use it vs pascal and just going with gddr5 for fury


The problem with HBM1 on this card seems to be that the consumer is suffering because:

1) While it does offer higher memory bandwidth, it doesn't really offer higher performance in games. Maybe only slightly.

2) It's more expensive than GDDR5, which makes the whole card more expensive. AMD would never have released the Fury X at $649 if it had 8GB of GDDR5.

3) It comes out probably way later than AMD originally intended. A GDDR5 Fury X probably would have been out by March 2015.

4) We only get 4GB. Even if it is HBM, it is still only 4GB.

5) No custom cards. No option for cheap water-cooling solutions for people who dislike the Fury X cooling solution. (As of now; this may change in the future.)


----------



## BigMack70

Quote:


> Originally Posted by *GamerDork*
> 
> there isn't any reason to buy it


If I were to build a 1080p or 1440p ITX gaming rig, it would almost certainly get a Fury X inside it...


----------



## Casey Ryback

Quote:


> Originally Posted by *GamerDork*
> 
> Nobody wanted a card priced equally to already available Nvidia options that can't even out compete them.. that's redundant and foolish.


Well, of course; that's why people are wanting price drops, and/or they are looking at the air-cooled Fury and the Fury Nano.

The way you depicted the card, it was garbage you wouldn't even put in your system if someone gave you one.

People just love exaggerating how bad something is for kicks.


----------



## keikei

This is from one of the reviews and i'm inclined to agree based on comparable performance.

Quote:


> I think a more appropriate price point for the Fury X would be *$599.*


----------



## LancerVI

Quote:


> Originally Posted by *Casey Ryback*
> 
> You just shouldn't have expected them to take 'the crown'; that's a myth created by your imagination (and the hype train).
> 
> The performance isn't the worst thing; the price is, though.
> 
> AMD need to dominate the tiers up to the 980 Ti/Titan X with value offerings.
> 
> You (and others) talk like the card is a pile of horse turd, yet plug it into your system and I'm sure you could play your games at good frame rates.


I understand where you're coming from, and you're right to a degree; but considering the time it took to get this thing to market, I can't help but feel somewhat disappointed. I was really rooting for AMD here. A strong AMD is good for EVERYONE. I love my R9 290s; they're awesome. However, the Fury X doesn't seem to have taken advantage of the development time spent making it. Don't get me wrong; it looks like a solid card that will work great for 90% of people out there, just not me.

Also, this piecemeal delivery of products is killing them. They needed to show Fury X2, nano and plain Fury and let those be reviewed as well. Gives us more options to consider, even if they're not on store shelves.

People are tired of waiting. AMD has had plenty of time. Based on that, I believe they fell short.


----------



## Blameless

Quote:


> Originally Posted by *tconroy135*
> 
> That being said AMD fan boys are being a little bit absurd calling the Fury X a good card because it is "comparable" to the 980ti.


Failing to see the absurdity. If the card performs comparably for a comparable price, then it's a good card, provided the card it's being compared to is a good card.

The 980 Ti is a very good card. The Fury X, as it is now, is slightly less good.
Quote:


> Originally Posted by *tconroy135*
> 
> If the 980 TI came with a factory water cooler it would obliterate the Fury X.


Indeed it would, but it doesn't...not unless you pay considerably more.
Quote:


> Originally Posted by *tconroy135*
> 
> And before you argue about hotter, I mean every area of the card other than the core.


Still uncertain whether the temps shown are typical, or the result of a damaged review sample.

Hard to know how many times that particular sample had changed hands, how many times it was dismantled, or what could have been done to it.

Something as simple as damaging a thermal pad, or not tightening all the screws down all the way can dramatically affect VRM temps.


----------



## Serandur

Quote:


> Originally Posted by *criminal*
> 
> I really wish AMD hadn't bought ATI. ATI really did compete better.


Strong and possibly hype-induced opinion warning (and yes, there's hyperbole too):

ATi were a proud GPU company giving Nvidia a run for their money especially during the few years before AMD bought them out. AMD are a failing parasitic leech of a company sucking ATi dry. They're the EA of hardware companies having bought out ATi during their golden years, then subsequently running them into the ground.

I always thought the ATi purchase was a mistake, but didn't feel so strongly about this until relatively recently when the magnitude of AMD's GPU R&D cuts and the results were known.


----------



## GorillaSceptre

Ah well, a bit disappointing, but i'm glad i waited; now i can get a 980 Ti instead of a TX.

There are just more pros with Nvidia's current lineup: you don't have to worry about HDMI 2.0, no uncertainty involving 4GB of ram, better overclocking, custom AIBs...

980 Ti it is


----------



## MerkageTurk

AMD should have just put in normal GDDR and let nVidia do the trial and error of releasing HBM first.


----------



## harney

Quote:


> Originally Posted by *sugalumps*
> 
> Oh man tempted to sell my 980 and spare 780 and grab the zotac amp ti, that's the cheapest of the aftermarket ti's here in the UK. Looks really nice aswell!
> 
> Such a sucka, nvidia must love me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though after all this my 980 is still looking like a great card, I could always look out for a cheap second hand 980 for sli.


Yep, tempted too, £529.99 at OC... but I'd prefer the MSI 6G or GA G1.


----------



## CasualCat

Quote:


> Originally Posted by *keikei*
> 
> This is from one of the reviews and i'm inclined to agree based on comparable performance.


I think $550 would be better (if possible), as it'd put pressure on the 980, where it is a clear win performance-wise, and make people question whether the performance gain from the 980 Ti is worth $100. A $50 difference when spending $650 isn't a lot.


----------



## Rei86

Money saved!


----------



## Xuper

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Sure. Just look at all the photos from Best Buy when they were selling the 300-series (aka the 200-series rebrands) before they were SUPPOSED to.
> 
> Keep hunting AMD ... keep hunting.


You know that's a fake slide?


----------



## BigMack70

Quote:


> Originally Posted by *CasualCat*
> 
> I think $550 would be better (if possible) as it'd put pressure on the 980 where it is a clear win performance wise and make people question if the performance gain from the 980Ti is worth $100. $50 difference though when spending $650 isn't a lot.


I think this is why the air cooled Fury Pro is going to be the better option. The AIO cooler on the Fury X likely was what kept the cost up at the $650 mark and hopefully we'll see similar performance for $100 less when we get the air cooled version.


----------



## Liranan

You come to this thread hoping to get some information and it's just drama queenery, crying from the usual suspects, and the usual putting down of the competition just because.

At the end of the day it comes down to what you want. If you want performance now, you get a 980 Ti or an overpriced Titan (which is often bested by the Ti); if you want performance that doesn't decrease because nVidia drops support for your hardware, and hardware that not only maintains longevity but keeps improving over time, you get the Fury.
Quote:


> Originally Posted by *MerkageTurk*
> 
> Amd should of just put normal gddr and let nVidia release their hbm first trial and error


When the 4870 was first released it was equal to the GTX 260, but thanks to nVidia caring about their old hardware the way they care about Kepler now that Maxwell is out, the 4870 just kept increasing in performance and in the end was almost equal to a GTX 280.

AMD were the first with GDDR5 and now they're the first with HBM. I commend them for it.


----------



## tconroy135

Quote:


> Originally Posted by *Blameless*
> 
> Failing to see the absurdity. If the card performs comparably for a comparable price, then it's a good card if the card it's being compared to a is a good card.
> 
> The 980Ti is a very good card. The Fury X, as it is now, is slightly less good.
> Indeed it would, but it doesn't...not unless you pay considerably more.
> Still uncertain whether the temps shown are typical, or the result of a damaged review sample.
> 
> Hard to know how many times that particular sample had changed hands, how many times it was dismantled, or what could have been done to it.
> 
> Something as simple as damaging a thermal pad, or not tightening all the screws down all the way can dramatically affect VRM temps.


All of this may be true, but I think my overall point is that with the current GPU atmosphere, where AMD has come under a lot of criticism, at the same price point as the 980 Ti, in my opinion, the Fury X underwhelms a bit. Whereas if the Fury X was even 10% faster than the 980Ti/Titan X it would have been a huge success.

What consumer, size issues aside, would reasonably buy a Fury X over a 980Ti?


----------



## sugalumps

Quote:


> Originally Posted by *Liranan*
> 
> You come to this thread hoping to get some information and it's just drama queenery, crying of the usual suspects and the usual putting down of the competition just because.
> 
> At the end of the day it comes down to what you want. If you want performance now, you get a 980 Ti or an overpriced Titan (which is often bested by the Ti); if you want performance that doesn't decrease because nVidia drops support for your hardware, and hardware *that not only maintains longevity* but keeps improving over time, you get the Fury.


How is it going to do that when their rebranded bottom line has double the VRAM?


----------



## wholeeo

Quote:


> Originally Posted by *MerkageTurk*
> 
> Amd should of just put normal gddr and let nVidia release their hbm first trial and error


With its big die, how much power would it have consumed if they had gone with GDDR5, though?


----------



## provost

Quote:


> Originally Posted by *Liranan*
> 
> You come to this thread hoping to get some information and it's just drama queenery, crying of the usual suspects and the usual putting down of the competition just because.
> 
> At the end of the day it comes down to what you want. You want performance now, you get a 980 Ti or an overpriced Titan (which is often bested by the Ti); *if you want performance that doesn't decrease because nVidia drops support for your hardware and hardware that not only maintains longevity but keeps improving over time you get the Fury*.


If I can get this in writing from AMD with the optimization period clearly stated, I am prepared to pay $850 for this card.


----------



## KarathKasun

Quote:


> Originally Posted by *criminal*
> 
> I will admit, I love my 980 right now. And Fury X turning out like this saves me from looking to upgrade.
> I really wish AMD hadn't bought ATI. ATI really did compete better.


LOL, you must not remember ATI's last solo card, the HD 2900 XT. It launched into the fray against these cards:
8800 GTS 320/640
8800 GTX 768
8800 ULTRA

Guess where it ended up landing performance-wise?

8800 GTS 320 < HD 2900 XT < 8800 GTS 640

Literally the bottom of Nvidia's high-end lineup, even though it was the first 512-bit memory bus card, with epic bandwidth for the time.

Fury is nowhere near as bad as all of you are trying to make it sound. The current lineup looks something like this (without overclocking):

GTX 980 < Fury X/GTX 980 Ti < Titan X

Something fishy is going on with current overclocks, and once it gets sorted you may see another ~100MHz or so. If that ends up being the case, it comes out to ~20% of OC headroom.
With the board temps in the TPU review, I would assume something went awry when they re-seated the cooler, or something isn't quite right with the VRM cooling solution on the review samples.


----------



## MerkageTurk

Yep, seems the 390X is the better option at half the cost.


----------



## p4inkill3r

Quote:


> Originally Posted by *provost*
> 
> If I can get this in writing from AMD with the optimization period clearly stated, I am prepared to pay $850 for this card.


Ask Mr. Kepler, he'll issue a doctor's note.


----------



## DampMonkey

Quote:


> Originally Posted by *provost*
> 
> If I can get this in writing from AMD with the optimization period clearly stated, I am prepared to pay $850 for this card.


Not that it's a 1:1 comparison, but the 290/290Xs have aged very well compared to the 780/Titan/780 Ti. Most attribute this to GCN. If you need an example, check out benchmarks for new games like GTA V or The Witcher 3; the 290X pretty much wipes the floor with them, whereas a year or two ago it would have been neck and neck.


----------



## Liranan

Quote:


> Originally Posted by *provost*
> 
> If I can get this in writing from AMD with the optimization period clearly stated, I am prepared to pay $850 for this card.


You need AMD to put into writing that nVidia have abandoned optimising drivers for Kepler, resulting in the 290X equalling and sometimes beating a Titan or 780Ti, when at launch the 290X was clearly inferior?


----------



## provost

Quote:


> Originally Posted by *Liranan*
> 
> You need AMD to put into writing that nVidia have abandoned optimising drivers for Kepler, resulting in the 290X equalling and sometimes beating a Titan or 780Ti, when at launch the 290X was clearly inferior?


Yeah, I like my contracts to be black and white; I already signed a gray contract with Nvidia, which did not turn out too well for me..


----------



## bigkahuna360

Well, this is disappointing. I was going to give my friend the extra $100 so he could buy the Fury X, but if they're going to OC like trash and not even put up good competition against my 980 Ti, then why bother?


----------



## decimator

Quote:


> Originally Posted by *zealord*
> 
> AMD would've never released the Fury X at 649$ if it had 8GB GDDR5.


It's not that simple. An 8GB GDDR5 configuration would need a much wider memory interface, taking up more die space for the memory controllers and requiring more power. There's no way the Fury X would have 4096 SPs if it had 8GB of GDDR5 instead of 4GB of HBM. I don't have any issue with the rest of your post.
Quote:


> Originally Posted by *MerkageTurk*
> 
> Amd should of just put normal gddr and let nVidia release their hbm first trial and error


This doesn't really make much sense, seeing as how AMD were the ones who developed HBM alongside Hynix. nVidia was developing their own version of new memory, but JEDEC adopted HBM as the standard in October 2013. What AMD gets out of this is a full year to play around with HBM1 while nVidia has to wait until Pascal for HBM2. Even if the Fury X is disappointing performance-wise, AMD is gaining valuable experience with HBM and should go into 2016 with much better insight into how to use it than nVidia.


----------



## harney

Quote:


> Originally Posted by *provost*
> 
> Yeah, I like my contracts to be black and white, I already signed a gray contract with Nvidia, which did not turn out too well for me..


Was that the nvidia gray contract where the value of 4 turned out to be 3.5? I fell for that one too.


----------



## Bartouille

Quote:


> Originally Posted by *BigMack70*
> 
> I think this is why the air cooled Fury Pro is going to be the better option. The AIO cooler on the Fury X likely was what kept the cost up at the $650 mark and hopefully we'll see similar performance for $100 less when we get the air cooled version.


They should simply release an air-cooled version of the Fury X for $550. IMO the Fury X is already slow enough as-is. The Fury Pro is going to have around 3500 stream processors, so it will be around 10% slower than the Fury X and getting close to the 980 in terms of performance.

Prices should be like this:
Fury X water-cooled: $650 (like atm)
Fury X air-cooled: $550
Fury Pro: $450
Fury Nano: ??? (tbh, judging by these performance numbers it will probably be 290X-level in terms of performance)


----------



## joeh4384

Quote:


> Originally Posted by *tpi2007*
> 
> Here is my opinion on this: assuming that they indeed fixed the pump noise, I think this card will do moderately well based on the novelty factor: HBM, high quality materials, cool and quiet, the power consumption at this performance level shouldn't make much of a difference for enthusiasts, I'd say that in that regard AMD managed to keep the card domesticated. But the 1.4a HDMI, inconsistent performance meaning more driver optimization is needed and the generally below 980 Ti performance, up to 10%, and let's see about that 4 GB limit in further testing, make it a tough sell other than for those that really want one of the first HBM equipped cards. And even then, it's a bit hard to get excited by HBM since it doesn't seem to be doing anything extraordinary for performance. Also, odd decision to keep the ROP number at 64, I wonder if that has any implications in the performance scaling against the 390X.
> 
> All in all, apart from people wanting one of the first HBM cards and a quiet flagship, I'd wait for the Fury. Should still beat the 980 (well, it has to, otherwise it's going to be competing against the 390X), the 4 GB of VRAM shouldn't be a concern at that cheaper price point and performance and hopefully the WHQL drivers that will be released alongside Windows 10, but also available for 7 and 8.1, will make it perform better.
> 
> It's nice to see AMD keeping up without the long delay that took them to answer the original Titan, but I still think that their lineup is not overall competitive. The Fury (non X) should help, but otherwise:
> 
> 1. R7 370: the rumours originally pointed to it being a 1280 cores part with a $135 pricetag. Now, even though it's a GCN 1.0 part, at that price point it would be a very compelling product that would steal the show from the 750 Ti, power consumption notwithstanding. But then we learned that it is in fact a 1024 cores part with a $149 price tag I was very disappointed. We've had the Radeon HD 7850 1 GB selling for around $160 in October of 2012. The 1 GB size didn't make much of a difference in testing back then, but now the factory overclock and newer games certainly will, so 2 GB is the way to go, but still, it's very hard to get excited for what is otherwise the same card all over again two years and 8 months later for $149.
> 
> 2. R9 380. This would have done moderately well if AMD had priced the 4 GB version at $199. That, together with the core clock and VRAM clock increase, while keeping the same TDP, would make for a compelling offer over the 960, being overall even faster than the 285 already was. As is, $199 for the same 2 GB of VRAM as the GTX 960, and considering that the 960 has more overclock headroom and a lower TDP to boot with, meh. Not to mention that they could have locked the $250 segment with full Tonga, which again, continues only in the Retina iMac as the R9 M295X.
> 
> 3. R9 390. Sort of ok. The clockspeed bumps made it perform on par or slightly above the 290X and occupy its price point, while using a little less power. There isn't much appeal here. People looking for Hawaii would do better to get a 290X now, it has more headroom and costs the same. It should cost a bit less.
> 
> 4. R9 390X. Not really ok. Way too overpriced for a card that doesn't at least have Tonga's feature set and uses 100w more than the 980 to achieve slightly slower or the same results.


I agree. AMD would have done really well with a full Tonga for $250, and with the Nano at the same price as the 390X. If they can somehow release that card at $400-450 and price-cut the 390X, I think they will have a solid hit.


----------



## wermad

Sigh, I was expecting more, but I was prepared for this. It's not disappointing, but it's not spectacular. A Ti will be my next setup down the road (4-way).


----------



## provost

Quote:


> Originally Posted by *harney*
> 
> was that the nvidia gray contract where the value of 4 turned out to be 3.5 i fell for that one too


Nah, this one is more nuanced and revolves around the optimization period for my GK110s... Lol


----------



## BigMack70

Quote:


> Originally Posted by *Bartouille*
> 
> They should simply release an air cooled version of the Fury X for 550$. IMO Fury X is already slow enough as-is. Fury Pro is going to have like 3500 stream processor so it will be around 10% slower than Fury X and will be getting close to the 980 in terms of performance.
> 
> Prices should be like this:
> Fury X water cooled version: 650$ (like atm)
> Fury X air cooled: 550$
> Fury Pro: 450$
> Fury Nano: ??? (tbh judging by these performance numbers it will probably be 290x in terms of performance)


I agree that's what they _should_ do. Not sure if it's what they _will_ do.


----------



## zealord

Quote:


> Originally Posted by *decimator*
> 
> It's not that simple. 8GB GDDR5 would take up much more space on the die and require more power. There's no way Fury X would have 4096 SP's if it had 8GB GDDR5 instead of 4GB HBM. Don't have any issue with the rest of your post.


That's probably right. So AMD would've needed a different architecture then, or something. Hopefully they can get one out before the 2016 14/16nm HBM2 cards arrive.


----------



## Rei86

Quote:


> Originally Posted by *harney*
> 
> was that the nvidia gray contract where the value of 4 turned out to be 3.5 i fell for that one too


Your post makes no sense, since you actually still have 4.


----------



## dubldwn

Am I reading different reviews? There's no way I would take Fury X over a 980. Clear performance lead? Who doesn't overclock?


----------



## SuprUsrStan

Wasn't the Fury X touted for its 4K gaming experience? For some reason the Fury X just chokes on medium in BF4 and still can't compete at Ultra.


----------



## tconroy135

What I really wonder is what AMD's profit margin is on the Fury X vs. NVIDIA's on the 980 Ti.


----------



## revro

Finally, 8GB 390 cards for 350eur, nice, and the 390X for 460eur. Great, AMD can keep its 4GB Furies lol, and Nvidia its 3.5GB cards.

Will wait till AMD Zen, then get two 390Xs, which should also have gone down in price by then.


----------



## Blameless

Quote:


> Originally Posted by *tconroy135*
> 
> What consumer, size issues aside, would reasonably buy a Fury X over a 980Ti?


Very few.

I'd only recommend one, as it is today, if performance in one of the minority of games where the Fury X does better was the prime concern, or if the individual in question was going to make heavy use of OCL.
Quote:


> Originally Posted by *Lex Luger*
> 
> Time to switch sides unless you want to be like 3dfx Voodoo fanboys and using hardware from a company that no longer exists.


This is a puzzling statement. Once mature or third party drivers are available, the continued existence of the company that made a particular piece of hardware is largely irrelevant...not that AMD is in imminent danger of collapse. They've been doing badly for almost a decade and they have 600 million in liquid assets on hand; they can do badly for at least a few more years until they shape up or are bought out.

Also, I have a Voodoo 4 4500 in my retro gaming box; it's still the best DOS/Win 9x era 3D accelerator that exists. It's had utility going on fifteen years longer than the ATI and NVIDIA parts that I once considered replacements for it.
Quote:


> Originally Posted by *Syan48306*
> 
> For some reason the Fury X just chokes on medium in BF4 and still can't compete at Ultra.


That really looks like a fill rate or driver issue, not that this excuses anything.

Could also be feeling that 4GiB VRAM limitation.


----------



## Serandur

Quote:


> Originally Posted by *KarathKasun*
> 
> LOL, if you dont remember ATI's last solo card... HD 2900 XT. It launched into the fray against these cards...
> 8800 GTS 320/640
> 8800 GTX 768
> 8800 ULTRA
> 
> Guess where it ended up landing performance wise?
> 
> 8800 GTS 320 < HD 2900 XT < 8800 GTS 640
> 
> Literally the bottom of Nvidias high end lineup, even though it was the first 512bit memory bus card with epic bandwidth for the time.


AMD acquired ATi in 2006, the 2900XT was released in 2007, so... not truly solo. But in any case, the TeraScale microarchitecture (last "solo" one ATi probably conceived) it was based on quickly turned around starting with the 3000 series and blossomed into what eventually became the HD 4000, 5000, and 6000 series, all of which shamed Nvidia in raw efficiency (which they never took advantage of to capture the high-end segment). GCN was the first truly AMD/ATi microarchitecture and that's where the R&D cuts started kicking in and everything went wrong. There's a lot of conjecture on who was responsible for what, but... the GCN strategy is where current pricing madness and the lack of Radeon progression really kicked in.


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> Still uncertain whether the temps shown are typical, or the result of a damaged review sample.
> 
> Hard to know how many times that particular sample had changed hands, how many times it was dismantled, or what could have been done to it.
> 
> Something as simple as damaging a thermal pad, or not tightening all the screws down all the way can dramatically affect VRM temps.


Tom's sample wasn't as hot, but showed those same hotspots where heat is leaking from under the backplate.


----------



## CrazyElf

Quote:


> Originally Posted by *47 Knucklehead*
> 
> It's a good card ... it has issues for non-gaming reasons ... but it's a good card.
> 
> But it isn't the enthusiasts wet dream as it was hyped. It is basically AMD's answer to the GTX 980Ti.
> 
> To be honest, AMD needed this card, otherwise they would have been TOTALLY hosed ... since the rest of their 300-series line is just a 200-series rebrand.
> 
> As HardOCP said, it's just a test platform for HBM GEN 1 memory.


It's not terrible for sure, as in Bulldozer-level fail, but there is a huge gap between hype and reality. I guess it's just middling. I don't want to sound like an Nvidia fanboy (I'm pretty critical of both companies), but right now AMD does have a history of the hype being way short of expectations. Bulldozer may be the most blatant example, but this one too, with the "Hunting Titans" and other hype. By this logic, it should have been as good, if not better than the Titan X.

There's always the waiting for the "next" thing. The AMD K10, Bulldozer, and now Fury have all been underwhelming. The question at this point is: can AMD ever recover? There will be a "next" architecture, but it will be underwhelming relative to the competition. At some point, we have to judge based on what has been released now and on AMD's past track record, and make judgements. In some areas they have improved, it's true, but in others they have in many ways regressed relative to the competition, and their products always seem to fall short of the hype.

The problem I see is that AMD desperately needs cash right now. It's a death spiral. You are losing cash so you don't have enough money to spend on R&D, which in turn means that you don't have a better architecture than the competition, which worsens the market share and cash flow problems even more. Barring a cash injection, I don't see a turnaround. The only way that happens is if they get bought out (probably tons of patent issues there) or if someone (potentially the buyers of GF) gives them a ton of money.

Quote:


> Originally Posted by *Blameless*
> 
> I'm in near complete agreement with HardOCP's conclusions, yet Fury isn't really a disappointment in my eyes. It's not falling far short of what I expected. It has it's upsides and downsides. Some things are still unknown. I wouldn't recommend it over a 980Ti at this point, but something as small as a $50 price advantage or demonstrable proof of solid OCing after voltage control is unlocked could change that.
> 
> ...
> I hope that 104C is an isolated case of a defective or badly (re)assembled sample, but I'd definitely like more tests to confirm.
> 
> ...
> Higher clock speeds, 50% more ROPs, other miscellaneous architectural differences, mature drivers...


I'm of similar opinion.

Likewise, I'm at a loss to explain why AMD did not decide to put 96 ROPs on this GPU. The die penalty would have been modest and the performance gains could have been much better. I suppose AMD could release a new variant, ideally with 8GB of VRAM. There is a precedent for this: the 4890, which brought modest improvements (higher clocks, so it's not a perfect analogy, but bear with me; let's say they bring 96 ROPs on a Fury X v2) and doubled the VRAM as well.

The other issues:

Nvidia seems to be much better at power management right now (note the idle power consumption between the 980Ti vs the Fury X and even load, the Fury X does worse).
OC headroom of course is much larger on the 980Ti because of power consumption. An apples to apples comparison would be something like an AIO 980Ti vs a Fury X and the 980Ti will probably be in excess of 1450 MHz OC on the core.
Elsewhere, the VRM issue has been discussed. Taking off the backplate could help, but I am concerned about the longevity of the card if the VRMs are running over 100°C.
If there are no custom PCBs, this problem could persist.
The big advantage of HBM seems to be lower power consumption and yes, the Fury X is more power efficient than the 290X (quite a bit more), but it still doesn't close the gap with Maxwell. Bandwidth wise, it doesn't seem like the cards are that bandwidth bottlenecked with GDDR5 (although the Fury X does seem to close the performance gap considerably at 4K), but HBM1 comes at the cost of only 4GB of VRAM, which is problematic since it's the higher resolutions that need it the most.

With custom PCBs of the 980Ti coming out, I fear that this is a no brainer. I had been hoping originally that Nvidia might release a 980Ti "3072" but that will not happen now.

Quote:


> Originally Posted by *Serandur*
> 
> AMD acquired ATi in 2006, the 2900XT was released in 2007, so... not truly solo. But in any case, the TeraScale microarchitecture (last "solo" one ATi probably conceived) it was based on quickly turned around starting with the 3000 series and blossomed into what eventually became the HD 4000, 5000, and 6000 series, all of which shamed Nvidia in raw efficiency (which they never took advantage of to capture the high-end segment). GCN was the first truly AMD/ATi microarchitecture and that's where the R&D cuts started kicking in and everything went wrong. There's a lot of conjecture on who was responsible for what, but... the GCN strategy is what's responsible for current market pricing.


It's this that's the serious problem. It's a death spiral as I've noted above.

To be honest, I will say this much - AMD's engineers do deserve some credit for holding on as well as they have given the disparity in resources.


----------



## Bartouille

Quote:


> Originally Posted by *tconroy135*
> 
> What I really wonder is what is AMD's profit margin for the Fury X vs. NVIDIA for the 980 Ti


Same. I have a feeling that selling this card at $650 (the 980 Ti left them no choice) is already a struggle, considering all the new technology involved, plus that complex AIO cooler and high-quality fan.


----------



## aphixus

Quote:


> Originally Posted by *harney*
> 
> Anybody know where to buy a 980ti G1 in the EU for a fair price


This is the cheapest I've seen it, though they don't have them in stock just yet:

http://www.mindfactory.de/product_info.php/6144MB-Gigabyte-GeForce-GTX-980-Ti-Gaming-G1-Aktiv-PCIe-3-0-x16--Retail-_1005048.html


----------



## harney

They should have released an air-cooled Fury X alongside the water-cooled one at the same time, with the air version being £100/$150 cheaper than the 980 Ti... then we would have a real fight. This would force the 980 Ti price to drop, and then I would purchase the Ti. Win-win, well, for me anyhow.


----------



## kizwan

Quote:


> Originally Posted by *dubldwn*
> 
> Am I reading different reviews? There's no way I would take Fury X over a 980. Clear performance lead? Who doesn't overclock?


Overclocked but with stock voltage I think.
Quote:


> Originally Posted by *Syan48306*
> 
> Wasn't the Fury X touted for it's 4K gaming experience? For some reason the Fury X just chokes on medium in BF4 and still can't compete at Ultra.


It's completely different in the other review.


----------



## decimator

Quote:


> Originally Posted by *tconroy135*
> 
> What I really wonder is what is AMD's profit margin for the Fury X vs. NVIDIA for the 980 Ti


There's no comparison. nVidia must be making money hand over fist with the 980 Ti compared to the Fury X. AMD invested heavily into the development of HBM and packaged the card with a full-coverage waterblock, a high-quality Gentle Typhoon fan, and a 120mm radiator. nVidia, on the other hand, is just taking bad GM200 chips (yields for GM200 must be pretty decent compared to Fiji, seeing as 28nm is mature and they're not dealing with finicky HBM), cutting 2 SMM units, and selling them for $650.


----------



## Bartouille

Quote:


> Originally Posted by *decimator*
> 
> There's no comparison. nVidia must be making money hand over fist with the 980 Ti compared to the Fury X. AMD invested heavily into the development of HBM and packaged the card with a full-coverage waterblock with a high-quality Gentle Typhoon fan and a 120mm radiator. nVidia on the other hand is just taking bad GM200 chips (yields for GM200 must be pretty good as is seeing as how 28nm is mature and they're not dealing with finicky HBM), cutting 2 SMM units and selling them for $650.


And recycling the same old cooler from the original Titan.


----------



## Gorea

So since they're both the same price, it looks like a situation where if you play AMD-optimized games you get the Fury X, and if you play Nvidia-optimized games you get the 980 Ti.


----------



## sugalumps

Can get two 8GB 390s for about £500, or one 4GB Fury X for £550 here.


----------



## Kuivamaa

Quote:


> Originally Posted by *Liranan*
> 
> You need AMD to put into writing that nVidia have abandoned optimising drivers for Kepler, resulting in the 290X equalling and sometimes beating a Titan or 780Ti, when at launch the 290X was clearly inferior?


It wasn't inferior to the OG Titan.


----------



## Sickened1

Well, this is slightly underwhelming. Luckily I have 2-3 months before I commit to building a new rig, so I'll just sit back and see what happens to performance with some driver updates. If it goes the route of the 290X, I'll pick one up solely to avoid worrying about whatever it is that's going on in the green house over there.


----------



## i7monkey

It really sucks to be a PC gamer.

Stagnant CPU market
Bloated overpriced 28nm GPUs
DX12 on the horizon which may finally make 6/8 core CPUs relevant, except current 6/8 core CPUs are too expensive and sometimes lose out to cheaper 4 core CPUs in single threaded applications

14/16nm with HBM2 will finally give us the big step in performance that we need, but that's at least a year away, and so is the next decent 6/8 core CPU.

What's a buyer to do? Wait a year and buy Skylake-E and Pascal? The current crop really sucks now.


----------



## MapRef41N93W

Quote:


> Originally Posted by *decimator*
> 
> There's no comparison. nVidia must be making money hand over fist with the 980 Ti compared to the Fury X. AMD invested heavily into the development of HBM and packaged the card with a full-coverage waterblock with a high-quality Gentle Typhoon fan and a 120mm radiator. nVidia on the other hand is just taking bad GM200 chips (yields for GM200 must be pretty decent compared to Fiji as is seeing as how 28nm is mature and they're not dealing with finicky HBM), cutting 2 SMM units and selling them for $650.


The 980 Ti margin isn't as big as you think. We are still talking about a 601mm^2 die. The only reason we got $650 980 Ti's is because NVIDIA must have had a lot of defective Titans lying around with absolutely nothing to do with them (they can't be branded into the midrange Quadro/Tesla lines, as there is only 1 Quadro SKU and 0 Teslas for Maxwell). I'm sure there's still profit there, but it's not even close to the killing they were making off the GTX 980 from September to the June price cut.


----------



## Kuivamaa

Quote:


> Originally Posted by *dubldwn*
> 
> Am I reading different reviews? There's no way I would take Fury X over a 980. Clear performance lead? Who doesn't overclock?


That's DX11. BF4 runs like crap on DX11 with Radeons. TR's tests indicated it doesn't run well on Mantle either, but I would like to see that verified in MP, personally. But for the bench above, no surprise really.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *tconroy135*
> 
> What I really wonder is what is AMD's profit margin for the Fury X vs. NVIDIA for the 980 Ti


Good question. Honestly, I don't see how it is a whole lot different (HBM cost is the big variable), but I really can't see it being much lower even being totally optimistic. Consider that AMD is massively in debt with loans coming due soon, has had bad quarter after bad quarter, and now has this lackluster performance from their "top of the line" card: I just don't see people rushing out to buy this card in numbers large enough to gain any real market share on nVidia (maybe 3-5% tops). Couple that with the margins, and I just don't see anything in this entire graphics card launch that will put enough money in AMD's pocket to make for a good quarter and pay back the billions in debt they have accumulated.

Quote:


> Originally Posted by *Lex Luger*
> 
> Even if this card was a runaway success it wouldnt be enough to save the AMD Titanic. They can no longer compete on GPU front now just like CPU front. ITS OVER!


Gotta agree with you there. The last thing before the AMD break-up is Zen.


----------



## Tennobanzai

Is there any difference between this bridgeless CrossFire and the old CrossFire? My old 4670 didn't require a physical connector either.


----------



## iLeakStuff

I think that although the performance of the Fury X is great (it really is), people, rumors, and news sites hyped the product up so much, in anticipation of a godlike card from AMD that was supposed to sweep Nvidia off their feet and take the crown by a decent percentage over the 980 Ti. HBM was boasted to be the saviour of high resolution gaming, the key element to further increase the lead in 4K benchmarks.

None of this happened. AMD have built an incredibly efficient piece of hardware, but it is _almost as efficient_ as GM200 and _almost as fast as the 980 Ti._ They took their sweet time on the card, and I think this waiting and postponing until they could release a card that's merely similar to the 980 Ti means it met a much smaller positive crowd than it would have if it had launched earlier this year, while the expensive Titan X still roamed the streets.

Although I think the Fury X is great, I can't help having lost a little bit of my interest in Fury after watching the reviews. Right now I'm sitting here reading review conclusions and thinking: "Is this it?"
Sorry about that.

Lastly, I want to link to the Hardware.fr and Hardware.info results. I think I feel safe saying that the GTX 980 Ti is 10-12% faster from 1080p up to 1600p, while the Fury X and GTX 980 Ti are much closer at 4K, maybe down to a 0-5% difference.




Anyone else that share my view on this card?


----------



## Rickles

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Oh my god this is getting old. No. It is not. Considering you have GTX 980 and owned 780s before you should have known that since Keplers Nvidia cards boost way above their marketed clocks, the days were Nvidia cards run at the marketed clocks are long dead with Fermi. GTX 980 Ti boosts way above 1102 MHz at stock, 1200+ MHz easily, highly clocked models easily boost in the 1300s range. I doubt you don't know this easy fact and that you lied intentionally because of your AMD hate. You should be ashamed.


Quote:


> Originally Posted by *HeadlessKnight*
> 
> Well said. Respect to this guy. Some people just can't use their brains. It doesn't matter whether it is 1550 or 1600 MHz, but he intentionally said it overclocks to 1550 MHz from 1100 MHz just to put AMD in an even worse light than it already is. That's a 41% OC, while in fact, measured from the real clock, it is only 20% or 25%.


My reference 980 Ti clocks Valley-stable at 1504 MHz on air, from a stock boost of 1076 MHz. That is a 40% gain on air without flashing the BIOS.
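The whole percentage dispute comes down to which baseline clock you divide by. A quick sketch to check the numbers being thrown around (the helper name is mine, and the ~1240 MHz "real boost" baseline is just illustrative, back-derived from the 20-25% claim):

```python
# Sanity-checking the overclock percentages argued about in this thread.
# Clock figures are the ones quoted above; the helper function is my own.

def oc_gain(stock_mhz, oc_mhz):
    """Percentage gain of an overclock relative to a given baseline clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# The 980 Ti above: 1076 MHz stock boost -> 1504 MHz on air
print(round(oc_gain(1076, 1504), 1))  # 39.8 -- the "40% gain" figure

# The disputed comparison: measured from the 1100 MHz marketed clock...
print(round(oc_gain(1100, 1550), 1))  # 40.9 -- the "41% OC"

# ...versus measured from an assumed ~1240 MHz real-world boost clock
print(round(oc_gain(1240, 1550), 1))  # 25.0 -- the "only 20% or 25%"
```

So both sides are doing the arithmetic correctly; they're just disagreeing about whether the marketed clock or the actual boost clock is the honest baseline.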


----------



## decimator

Quote:


> Originally Posted by *MapRef41N93W*
> 
> The 980ti margin isn't as big as you think. We are still talking about a 601mm^2 die. The only reason we got $650 980ti's is because NVIDIA must have had a lot of defective Titan's laying around with absolutely nothing to do with them (can't brand them into the midrange Quadro/Tesla as there is only 1 Quadro SKU and 0 Tesla for Maxwell). I'm sure there's still profit there, but it's not even close to the killing they were making off the GTX 980 from September to the June price cut.


Everything is relative, and I was comparing the 980 Ti to the Fury X, so yeah, in comparison to AMD, I'd say nVidia is making a killing. Besides, 980 Ti's are basically salvage for nVidia. The alternative is tossing out the bad GM200 chips that can't be sold as Titan X's, so they might as well cut some SMM units and sell them for $650.


----------



## lajgnd

I'd really like to hear an AMD fan convince me to buy a Fury X...

Is it Price/Performance?
-Because it performs worse than the 980Ti at the same price point.

Is it Noise?
-Because it's one of the loudest cards at idle, which is where quietness really makes a difference.

Is it Driver Support?
-Lol, we already know AMD's drivers are a joke, or perhaps slightly below competent.

Is it Heat?
-The thing ain't cool. The fact that it comes with a water cooler stock and has barely any OC headroom shows it's not.

What is it then?

There's no point in this card even existing. The thing is surpassed in every single quantifiable metric.

It wouldn't surprise me if NVidia were footing the bill for some of the Fury X just to keep the feds off their backs, because this is such a lopsided victory it's incredible.

Whenever I see someone say they just ordered a Fury X, I just think... how do you even manage to afford a $650 GPU if you can't understand the unbelievably simple reasoning for why it's so bad? It literally has no redeeming qualities, and it is behind the tech curve in both time and performance. Just a day late and a dollar short. I'm not trying to troll here, just asking a legitimate question.

When you come to the game THIS late you either need to come with incredible performance or incredible price to prove you're worthy. This thing does neither. It's just a second rate knock off as far as I'm concerned.


----------



## Lex Luger

This is it. The Fury X and an 8GB Fury XTX will be the last flagships from AMD. Mark my words. Unless Zen is somehow better than Skylake Xeons, AMD is going under.


----------



## sugalumps

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *iLeakStuff*
> 
> I think that although the performance of Fury X is great (it really is), people, rumors, news sites, have hyped up the product so much in anticipation of a godlike card from AMD that was suppsed to sweep Nvidia of their feets and take the crown by a decent percentage over 980Ti.
> HBM was boasted to be the saviour of high resolution gaming, the key element to further increase lead in the 4K benchmarks.
> 
> None of this happened. AMD have built an incredible efficient piece of hardware, but they are _almost as efficient_ as GM200 and _almost as fast as 980Ti._ They took their sweet time on the card and I think this waiting and post poning until they released a card thats similar to 980Ti means it met a much smaller positive crowd than what it would be if it launched earlier this year while the expensive Titan X roamded the street.
> 
> Although I think Fury X is great, I can`t happen to have lost a little bit of my interest in Fury after watching the reviews. Right now Im sitting here watching conclusions of reviews and thinking: "Is this it?"
> Sorry about that.
> 
> At last I want to link to Hardware.fr and Hardware.info results. I think I feel safe saying that GTX 980Ti is 10-12% faster in 1080p up to 1600p. While the Fury X and GTX 980Ti are much closer in 4K, maybe down to 0-5% difference.
> 
> 
> 
> 
> Anyone else that share my view on this card?






"Almost" is not good enough, mate, especially when you are late. They themselves said they did not want to be known as the budget brand; prove that to us, AMD. Be better than your competitors.


----------



## iamhollywood5

What a flop. I can't wait for AMD to be acquired by somebody competent.

I should have just ordered my 980 Ti at launch; I wasted several weeks waiting. Oh well, it'll be here in 2 days.


----------



## Tivan

Quote:


> Originally Posted by *kizwan*
> 
> Overclocked but with stock voltage I think.
> It's completely different in the other review.


Hardware Canucks were using catalyst 15.15 (the benchmark in your post is from Hardware Canucks), while I couldn't find any info on the driver TT was using. But yeah who knows what's going on.


----------



## GamerDork

But hey! It was a great effort. Even though it's not better than the competition, it's late to the show, and there's generally no reason to ever even consider buying it. Good job AMD, gold star for your efforts.


----------



## keikei

Quote:


> Originally Posted by *iLeakStuff*
> 
> At last I want to link to Hardware.fr and Hardware.info results. _I think I feel safe saying that GTX 980Ti is 10-12% faster in 1080p up to 1600p. While the Fury X and GTX 980Ti are much closer in 4K, maybe down to 0-5% difference_.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Anyone else that share my view on this card?


This is what I'm reading as well. I don't agree with the price point, but then again, this is an ultra high-end, enthusiast, uber-level card. I want to see what happens when AMD allows memory OC and the Pro version comes out.


----------



## Scorpion49

I was really hoping for something good from this; I would have sold my 970 + G-Sync and moved over to probably the Fury Pro (don't want the CLC) and a FreeSync 1440p screen, but as it stands my overclocked 970 is pretty close to the Fury X in the games that I play anyway, so there is no real point.

I'm really wondering though if something different will show up with DX12 testing on this card. Seems like a lot of its resources over the 290/390X are going unused or not scaling.


----------



## Kriant

If it wasn't for the mere 4GB of HBM1, I would've snatched it up, but I can't see this card holding up in 2016 with that limit.


----------



## Master__Shake

Slower than a 980 Ti at the same price?

Wow.

I'd pay $549 tops.


----------



## provost

Quote:


> Originally Posted by *Lex Luger*
> 
> This is it. The Fury X and Fury XTX 8gb will be the last flagship from AMD. Mark my words. Unless Zen is somehow better than Skylake Xeons, AMD is going under.


It would be a collector's item then.

I don't understand; are you bidding the card up or down?


----------



## SKYMTL

Quote:


> Originally Posted by *Bartouille*
> 
> They should simply release an air cooled version of the Fury X for 550$. IMO Fury X is already slow enough as-is. Fury Pro is going to have like 3500 stream processor so it will be around 10% slower than Fury X and will be getting close to the 980 in terms of performance.
> 
> Prices should be like this:
> Fury X water cooled version: 650$ (like atm)
> Fury X air cooled: 550$
> Fury Pro: 450$
> Fury Nano: ??? (tbh judging by these performance numbers it will probably be 290x in terms of performance)


With all the videos of Huddy and other AMD reps stating it, I think it should be evident by now. There are NO PLANS for an air cooled Fury X at a lower price point.


----------



## Serandur

Quote:


> Originally Posted by *SKYMTL*
> 
> With all the videos of Huddy and other AMD reps stating it, I think it should be evident by now. There are NO PLANS for an air cooled Fury X at a lower price point.


Aye, the only question remaining is what, exactly, is Fury Pro's shader configuration.


----------



## infranoia

Remember when you were a kid and you kicked that one guy's legs out from underneath him and he fell and really hurt himself and then you felt really bad for him because he had a fake tooth for the rest of his life?

Way to go Nvidia, you competitive jerk.

I'm hoping the Fury X2 is a work of art, but that heat... Best to save my cash and hunker down with what I have until the next process.


----------



## iLeakStuff

Here are a couple of other 4K reviews that make me question whether I should buy a 980 Ti instead. Maybe just dish out $100 more for an EVGA Hybrid, with even faster clocks and water cooling as well. I think that would be worth it over the Fury X.

Man, this is difficult. I did not expect to see results like this today.

https://www.hardwareheaven.com/content/reviews/graphics-cards/58089/amd-radeon-r9-fury-x-review
http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/5


----------



## SKYMTL

Quote:


> Originally Posted by *Tivan*
> 
> Hardware Canucks were using catalyst 15.15 (the benchmark in your post is from Hardware Canucks), while I couldn't find any info on the driver TT was using. But yeah who knows what's going on.


Settings, levels, etc.
Quote:


> Originally Posted by *Kriant*
> 
> If it wasn't for the mere 4GB of HBM1, I would've snatched it up, but I can't see this card holding up in 2016 with that limit.


I can see it handling 2016 games better than it does today's titles. DX12's resource allocation allows for much more efficient use of cache and on-chip resources, lowering the need for massive amounts of buffering.


----------



## harney

Quote:


> Originally Posted by *armartins*
> 
> I'll take a break from trying to keep up with this thread to answer this post. You deserve some time in the fridge to learn to show some respect. A suspension would fit you just right. You're clamoring for AMD/ATI's death like it would benefit you or anyone else. And you also go out and talk about 3dfx as if it were "garbage"; you clearly did not experience the Glide3D days... Have you heard of a technology called SLI? Do some research, learn where it came from. Show some respect, you little... nevermind, you're not worthy. 3dfx was bought by whom? Do you know? Its technology contributed greatly to the rise of Nvidia. Back in the day, when Nvidia launched the RIVA TNT *months after* the Voodoo2, it couldn't be touched, in a *much worse scenario than we're seeing today*. So insulting 3dfx is like insulting your beloved Nvidia. And that speaks for itself about how "right" you are. */rant over*
> 
> On topic: Yes, disappointed with price/perf, with no HDMI 2.0, no DVI-D (I run 1440p 120Hz over DVI-D), and also with the 4GB framebuffer; my rocking 7970 is still solid exactly because it has 3GB and that was "too much" at launch. The problem is the core: it's not efficient enough to compete with Maxwell... Imagine Maxwell without the added TDP from GDDR5... That will be Pascal, and honestly the improved perf/watt AMD is claiming mostly comes from HBM, since the core is GCN 1.2 with brute force applied...










Glide was like a breath of fresh air back in the day. Oh, the joy of seeing Quake running on Glide. Oh, the sparks, the sparks!


----------



## Master__Shake

I never looked. What was the average overclock on these cards?


----------



## magicc8ball

Quote:


> Originally Posted by *Master__Shake*
> 
> I never looked. What was the average overclock on these cards?


I believe it was anywhere from 5 to 10 percent.


----------



## p4inkill3r

Quote:


> Originally Posted by *Master__Shake*
> 
> I never looked. What was the average overclock on these cards?


100MHz on stock voltage.

Voltage is locked until we get some Trixx/Afterburner updates, AFAIK.


----------



## Serandur

Quote:


> Originally Posted by *Master__Shake*
> 
> I never looked. What was the average overclock on these cards?


Well, they still don't have any voltage control options, but it's like ~1130-1170 MHz at stock voltage.
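To put numbers on those clocks, here's a quick sketch of the percent gain, assuming the Fury X's 1050 MHz stock core clock (the spec-sheet figure; the script is just illustrative arithmetic, not from any review):

```python
# Overclock headroom as a percentage over the Fury X's stock clock.
# 1050 MHz is the advertised stock core clock; the rest is plain math.
STOCK_MHZ = 1050

def oc_gain_percent(oc_mhz: float, stock_mhz: float = STOCK_MHZ) -> float:
    """Return the overclock gain as a percentage over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

for clk in (1130, 1170):
    print(f"{clk} MHz -> +{oc_gain_percent(clk):.1f}% over stock")
```

So the ~1130-1170 MHz range works out to roughly +8-11%, in line with the "5 to 10 percent" estimate above.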


----------



## mltms

It's a great card as I see it. Who plays at 2K these days when the price of 4K is down?


----------



## kizwan

Quote:


> Originally Posted by *Tivan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Overclocked but with stock voltage I think.
> It's completely different in the other review.
> 
> 
> 
> 
> 
> Hardware Canucks were using catalyst 15.15 (the benchmark in your post is from Hardware Canucks), while I couldn't find any info on the driver TT was using. But yeah who knows what's going on.
Click to expand...

The huge difference means one of them isn't doing it properly. Tom's Hardware also shows almost the same result as Hardware Canucks, but with older 14.12 drivers and unknown graphics settings.
Quote:


> Originally Posted by *armartins*
> 
> On topic: Yes, disappointed with price/perf, with no HDMI 2.0, *no DVI-D* (I run 1440p 120Hz over DVI-D), and also with the 4GB framebuffer; my rocking 7970 is still solid exactly because it has 3GB and that was "too much" at launch. The problem is the core: it's not efficient enough to compete with Maxwell... Imagine Maxwell without the added TDP from GDDR5... That will be Pascal, and honestly the improved perf/watt AMD is claiming mostly comes from HBM, since the core is GCN 1.2 with brute force applied...


I read there's a DVI adapter in the package.


----------



## szeged

Quote:


> Originally Posted by *mltms*
> 
> It's a great card as I see it. Who plays at 2K these days when the price of 4K is down?


4K is so far from being mainstream it's not even funny. 1080p is unfortunately going to be the go-to thing for a while, I bet.

Quote:


> Originally Posted by *toncij*
> 
> Ignore ignorant fools. Don't waste time on them.
> 
> AMD made Mantle, they're those responsible for DX12, Vulkan and Metal. All those technologies, including HBM are there because of AMD.


AMD made Mantle, but Mantle is no longer in use.

AMD isn't responsible for DX12; DX12 was in the works long before Mantle was even announced.


----------



## wermad

Quote:


> Originally Posted by *szeged*
> 
> 4k is so far from being the mainstream its not even funny. 1080p is unfortunately going to be the go to thing for a while i bet.


Last time I saw a market-share pie chart, 1080p was first (obviously) and, surprisingly, 4K was ahead of WQHD. That was for monitors, not TVs, IIRC.


----------



## iLeakStuff

Can someone please help me understand this:

You have the EVGA 980 Ti Hybrid, a watercooled, overclocked 980 Ti, putting out 36.4 dB during gaming (and at idle).


Then you have the Fury X, which is also water cooled, but the noise is 50.9 dB!!


The results above were both taken by the same guys, using the same equipment, so they're most likely accurate.
As someone who was interested in the Fury X, I have to conclude that the cooler they use on it is really bad compared to the 980 Ti's.
Even at idle the 980 Ti is much quieter.

The hell?! SKYMTL from Hardware Canucks, can you explain this?
The whole idea behind water cooling is less noise than air cooling. It seems I might not have any choice but to go with the Hybrid 980 Ti.
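For context on how big that gap actually is: decibels are logarithmic, so a 14.5 dB difference is far larger than it looks. A quick sketch (plain acoustics arithmetic, not taken from the review):

```python
# What a 36.4 dB vs 50.9 dB gap means in practice.
# +10 dB = 10x the sound power, and is commonly perceived
# as roughly "twice as loud".

def power_ratio(db_a: float, db_b: float) -> float:
    """Sound-power ratio between two SPL readings in dB."""
    return 10 ** ((db_b - db_a) / 10)

gap = 50.9 - 36.4                         # 14.5 dB
print(f"{gap:.1f} dB gap")
print(f"~{power_ratio(36.4, 50.9):.0f}x the sound power")
print(f"~{2 ** (gap / 10):.1f}x perceived loudness")
```

In other words, the Fury X sample emits roughly 28 times the sound power of the Hybrid and would be heard as somewhere around two to three times as loud.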


----------



## jcde7ago

Nvidia bet on dropping the 980 Ti at the right time, priced it right, and pretty much finessed their way to winning this round. I just don't see how anyone could choose a Fury X over a 980 Ti at this point, considering they're the same price.

Nvidia this round:

- Releases Titan X, sells out constantly for weeks...probably causes AMD to delay the Fury X even further as they saw what the TX was capable of.
- Drops 980Ti as expected by most people, late enough that AMD can't really delay Fury X any further, but early enough to steal a ton of sales away from those waiting for the Fury X because it's a 6GB VRAM card with similar performance to a Titan X.
- Prices the 980Ti in a good spot to fill the gap between the 970/980 and the TX.

AMD this round:

- Delayed their successor to the 200 series for what seemed like a ridiculously long time
- Needs a watercooled solution and has a TDP of 300w on their latest card
- Released HBM, which is awesome, but can't fill the void needed by games/resolutions that NEED extra frame buffer more than a faster memory architecture.
- What are they doing?

Completely disappointing for AMD... and I was really, really rooting for them to surprise everyone and come out on top, even over the big Maxwells. If I'm Nvidia, I'm smelling blood in the water now. This was NOT the competition we were hoping for from AMD as PC gamers and hardware enthusiasts.


----------



## wermad

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> 
> Then you have Fury X which is also water cooled, but the noise is 50.9dB!!
> 
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was/are interested in Fury X, I have to question that the cooler they use on the Fury X is really bad compared to 980Ti`s cooler.
> Even at idle the 980 Ti is much quieter.
> 
> The hell?! SKYMTL from Hardware Canucks, can you explain this?


Keep in mind the EVGA Hybrid keeps the turbine fan on the PCB for VRM and VRAM cooling; the AIO cools the core. The Fury has one fan on the radiator cooling all three.

EDIT:


----------



## sugalumps

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> 
> Then you have Fury X which is also water cooled, but the noise is 50.9dB!!
> 
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was/are interested in Fury X, I have to question that the cooler they use on the Fury X is really bad compared to 980Ti`s cooler.
> Even at idle the 980 Ti is much quieter.
> 
> The hell?! SKYMTL from Hardware Canucks, can you explain this?


AMD's coolers have always been the worst; even when they design a new water cooler, it's the same story.









Or maybe they have to be really loud because AMD GPUs usually run really hot. Either way, it's better to wait for the Fury and see how that fares, or go 980 Ti.


----------



## Phaethon666

Wow, this is quite depressing to be honest. It looks as if I will hold onto the R9 290 a bit longer. I don't see these things selling like hotcakes the way the previous R9 series did. Maybe the Bitcoiners will take this batch off AMD's hands, because it doesn't look as if any of us will, lol.


----------



## Orivaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> 
> Then you have Fury X which is also water cooled, but the noise is 50.9dB!!
> 
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was/are interested in Fury X, I have to question that the cooler they use on the Fury X is really bad compared to 980Ti`s cooler.
> Even at idle the 980 Ti is much quieter.
> 
> The hell?! SKYMTL from Hardware Canucks, can you explain this?


The review samples had a design flaw in the cooler. This was fixed, so no retailer cards will have the noise issue, but all the review samples had already been sent out.


----------



## magicc8ball

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> 
> Then you have Fury X which is also water cooled, but the noise is 50.9dB!!
> 
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was/are interested in Fury X, I have to question that the cooler they use on the Fury X is really bad compared to 980Ti`s cooler.
> Even at idle the 980 Ti is much quieter.
> 
> The hell?! SKYMTL from Hardware Canucks, can you explain this?
> The whole idea behind water cooling is less noise than air cooling. I might not have any other choice than going with Hybrid 980Ti it seems


They also mentioned that the pump has a loud whine, but AMD found this issue and was able to correct it before they started shipping out to retailers/etailers. In a couple of days, when people start getting the card, they should be able to confirm this.


----------



## mouacyk

Quote:


> Originally Posted by *magicc8ball*
> 
> They also mentioned that the pump has a loud whine but AMD found this issue and was able to correct it before the started shipping out to retailers/etailers. in a couple of days when people start getting the card they should be able to confirm this.


AMD will probably just tune down the RPM... another workaround. With a 50°C max, they know they have some headroom.


----------



## canislupusan

Quote:


> Originally Posted by *wermad*
> 
> Keep in mind the evga hybrid keeps the turbine fan on the pcb for the vrm and vram cooling. the aio cools the core. The fury has the one fan on the radiator cooling all three.
> 
> EDIT:
> 
> 
> Spoiler: Warning: Spoiler!


And still manages to draw more power overall...


----------



## ZealotKi11er

I think Fury X is 6 months too late. If it came out before Titan X then it would have been much different.


----------



## sugalumps

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Fury X is 6 months too late. If it came out before Titan X then it would have been much different.


Agreed, but remember when ragu said it didn't matter? That Nvidia would slowly bleed while AMD gained all its market share back thanks to HBM, blah blah.









Sorry ragu, I know your stance has changed; I just find it funny.


----------



## Orivaa

Quote:


> Originally Posted by *mouacyk*
> 
> AMD will probably just tune down the RPM... another work around. With 50C max, they know they have some headroom.


What does pump whine have to do with Fan RPM?


----------



## Phaethon666

Quote:


> Originally Posted by *sugalumps*
> 
> Agreed, but remember when ragu said it didn't matter. That nvidia would slowly bleed while amd gained all it's market share back from hbm blah blah
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry ragu, I know your stance has changed I just find it funny.


So what do you think, AMD will have 5% of the dedicated GPU market share by the end of the year?


----------



## mouacyk

Quote:


> Originally Posted by *Orivaa*
> 
> What does pump whine have to do with Fan RPM?


The pump also spins, doesn't it? C'mon, you know what I'm referring to.


----------



## yesitsmario

They should have just named this the 390X... There was no need for special branding. The 290X rebrands should have been the 380X.


----------



## Luxer

It's not worth it at the same price as the 980 Ti. If it were $500, I'd buy one.

Pascal is probably going to blow this thing out of the water next year, though.


----------



## Ganf

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was/are interested in Fury X, I have to question that the cooler they use on the Fury X is really bad compared to 980Ti`s cooler.
> Even at idle the 980Ti is muuch more quiet.
> 
> The hell?! SKYHTML from Hardwarecanucks, can you explain this?


The fan has a more aggressive idle speed than it needs. You can tell by the small gap between the minimum and maximum. It's not like we don't have custom fan profiles. I believe it was 8Pack or someone who said they tested the card with the fan turned off, cooling passively, and still couldn't get it to thermal throttle, so take that as you will.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Phaethon666*
> 
> So what do you think, AMD will have 5% market share of for dedicated GPUs by the end of the year?


Market share will not change. A bunch of AMD fanboys will get the Fury X and a bunch of Nvidia fanboys will get the GTX 980 Ti. The real market-share test is GTX 970-class GPUs. That will be next year.


----------



## iLeakStuff

Quote:


> Originally Posted by *sugalumps*
> 
> Amd's coolers have always been the worst, even when they design a new water cooler it's the same story
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or maybe it's because they have to be really loud because amd gpu's usually run really hot. Either way it's better to wait for the fury and see how that fairs or go 980ti.


But the Fury X is locked to the reference design, which means the water cooler tested here is all we get on the full Fiji.
Are they trying to push people to buy the air-cooled Fury with maybe fewer cores?
Quote:


> Originally Posted by *wermad*
> 
> Keep in mind the evga hybrid keeps the turbine fan on the pcb for the vrm and vram cooling. the aio cools the core. The fury has the one fan on the radiator cooling all three.


I thought water cooling all the components would be quieter than a card with water plus a fan. Guess I was wrong?!

Quote:


> Originally Posted by *Orivaa*
> 
> The review samples had a design flaw in the cooler. This was fixed, so no retailer cards will have the noise issue, but all the review samples had already been sent out.


Quote:


> Originally Posted by *magicc8ball*
> 
> They also mentioned that the pump has a loud whine but AMD found this issue and was able to correct it before the started shipping out to retailers/etailers. in a couple of days when people start getting the card they should be able to confirm this.


Where did you guys read this? Is the pump the culprit behind the noise, not the radiator fan? Sounds a bit weird to me. The pump is inside the case, right, while the radiator is on the outside, which should be easier to hear. Or am I thinking wrong?
Quote:


> Originally Posted by *Ganf*
> 
> Fan has a more aggressive idle speed than it needs. You can tell by the small gap in the minimum and maximum. It's not like we don't have custom fan profiles. I believe it was 8pack or someone that was saying they tested the card with the fan turned off, passively cooling, and still couldn't get it to thermal throttle, so take that as you will.


So hopefully AMD will issue firmware that's not so aggressive at idle. But what about when gaming? That's also important.


----------



## Gobigorgohome

Will there be a voltage mod for the Fury X? I am tempted to get it at this price, but if there is no possibility of adding voltage, then I do not see myself going down the route of getting the card.


----------



## edo101

Quote:


> Originally Posted by *undeadhunter*
> 
> No need to be like me, enjoy spending your cash on whatever fills your needs (even if that means paying the same or close for less right? lol ) to each it's own I guess


Considering I have a 290 right now, I don't see how I'm paying more for less.
Quote:


> Originally Posted by *Kane2207*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wow, someone really needs to take AMDs marketing department out the back like Old Yeller. Everybody (including AMD) would be the better for it...


Really, I think I'm going to apply for head of marketing. I could do this job better, and with passion too.


----------



## Sickened1

Quote:


> Originally Posted by *iLeakStuff*
> 
> Where did you guys read this? Is the pump the culprit behind the noise, not the radiator fan? Sounds a bit weird to me. The pump is inside the case, right, while the radiator is on the outside, which should be easier to hear. Or am I thinking wrong?


They stated it was tested from 18" away. That, in my mind, says it's either on an open bench or with the side panel off to get an accurate reading.


----------



## Forceman

Quote:


> Originally Posted by *Orivaa*
> 
> The review samples had a design flaw in the cooler. This was fixed, so no retailer cards will have the noise issue, but all the review samples had already been sent out.


Quote:


> Originally Posted by *magicc8ball*
> 
> They also mentioned that the pump has a loud whine but AMD found this issue and was able to correct it before the started shipping out to retailers/etailers. in a couple of days when people start getting the card they should be able to confirm this.


I don't get the whole 'review cards had it but real cards won't' angle. What's going to be different about them? Did AMD just not bother testing the noise of their cards before they sent review samples and then were like, whoops, better fix that noise? How do review samples that went out last week have a flaw that retail cards, which likely also shipped last week, don't have?


----------



## Vesku

100s of posts about how Fury X is terrible.

Get a chance to read Guru3D, Techpowerup, and PCPer reviews.

Turns out the Fury X is actually alright. It's just that if they don't allow core voltage adjustments, which is unknown atm, then it's mainly a card for SFF or quiet builds.

Funny seeing posters who have consistently said "AMD could not even come close to competing with Nvidia in the big die department" being in such a rage that a card with a brand new memory architecture isn't stomping all over Nvidia's flagship cards from day one of hitting retail.


----------



## CasualCat

Quote:


> Originally Posted by *Forceman*
> 
> I don't get the whole 'review cards had it but real cards won't' angle. What's going to be different about them? Did AMD just not bother testing the noise of their cards before they sent review samples and then were like, whoops, better fix that noise? *How do review samples that went out last week have a flaw that retail cards, which likely also shipped last week, don't have?*


I'm skeptical about that too, unless, as someone said, they're just adjusting the pump RPM. You don't just redesign a pump overnight, and since it's a Cooler Master unit, even if the overall cooling design is new to this board, the pump is likely a proven design.


----------



## Aonex

Despite all the doom and gloom, I still don't think this card is as bad as it's being made out to be on here... maybe it just needs a price adjustment to, say, $599/549? Also, what's AMD's roadmap going forward? Is there a new GPU architecture on the horizon, or are they just waiting for the node shrink and hoping GCN can be optimized some more?


----------



## magicc8ball

Quote:


> Originally Posted by *iLeakStuff*
> 
> Where did you guys read this? Is the pump the culprit behind the noise, not the radiator fan? Sounds a bit weird to me. The pump is inside the case, right, while the radiator is on the outside, which should be easier to hear. Or am I thinking wrong?


https://www.youtube.com/watch?v=rfCb6oiJ6EI&t=8m5s


----------



## sugalumps

Quote:


> Originally Posted by *Vesku*
> 
> 100s of posts about how Fury X is terrible.
> 
> Get a chance to read Guru3D, Techpowerup, and PCPer reviews.
> 
> Turns out Fury X is actually alright. Just if they don't allow core voltage adjustments, which is unknown atm, then it's mainly a card for SFF or *quiet builds.*
> 
> Funny seeing posters who have consistently said "AMD could not even come close to competing with Nvidia in the big die department" being in such a rage that a card with a brand new memory architecture isn't stomping all over Nvidia's flagship cards from day one of hitting retail.


Maxwell aftermarket cards are quieter.


----------



## Orivaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> Where did you guys read this? Is the pump the culprit behind the noise, not the radiator fan? Sounds a bit weird to me. The pump is inside the case, right, while the radiator is on the outside, which should be easier to hear. Or am I thinking wrong?


I got the info from the old Fury X thread on OcUK. Besides the official confirmation that the problem had been fixed before launch, the community manager had one of the fixed ones, and he claimed it was extremely silent.


----------



## anujsetia

Semiaccurate review:

http://semiaccurate.com/2015/06/24/amds-radeon-goes-premium-with-the-fury-x/
Quote:


> Moving to Mantle versus DirectX 11 performance we can see that our Fury X picks up almost 20 percent more performance by switching to AMD's own API. This bodes well for performance in upcoming DirectX 12 and Vulcan games.


----------



## Slomo4shO

Quote:


> Originally Posted by *jcde7ago*
> 
> - Released HBM, which is awesome, but *can't fill the void needed by games/resolutions that NEED extra frame buffer* more than a faster memory architecture.


Please do enlighten me, what games/resolutions are you referring to?


----------



## keikei

Quote:


> Originally Posted by *Aonex*
> 
> Despite all the doom and gloom, still don't think this card is as bad as it's being made out to be on here... maybe just needs a price adjustment to say, $599/549? Also, what's AMD's roadmap going forward? Is there a new GPU architecture on the horizon or are they just waiting for the node shrink and hope GCN can be optimized some more?


Fury Pro (possibly cut down version of Fiji on air), and the Fury X2 (dual Fiji).


----------



## MapRef41N93W

Quote:


> Originally Posted by *anujsetia*
> 
> Semiaccurate review:
> 
> http://semiaccurate.com/2015/06/24/amds-radeon-goes-premium-with-the-fury-x/


NVIDIA cards won't be able to use DX12/Vulkan according to this user ^


----------



## Vesku

Quote:


> Originally Posted by *sugalumps*
> 
> Maxwell aftermarket cards are quieter.


TechPowerUp has the Fury X at 32 dB under load, quieter than their measurements for the Gigabyte 980 Ti G1 and EVGA GTX 980 Ti SC+.


----------



## maarten12100

Quote:


> Originally Posted by *anujsetia*
> 
> Semiaccurate review:
> 
> http://semiaccurate.com/2015/06/24/amds-radeon-goes-premium-with-the-fury-x/


As expected. AMD has more to gain from it than Nvidia, considering their poor CPU scaling.
Quote:


> Originally Posted by *MapRef41N93W*
> 
> NVIDIA cards won't be able to use DX12/Vulkan according to this user ^


Nvidia has less to gain from it, since their drivers are already good in the CPU-overhead department under DX11.


----------



## tconroy135

Quote:


> Originally Posted by *MapRef41N93W*
> 
> NVIDIA cards won't be able to use DX12/Vulkan according to this user ^


Maybe that's why it's posted on "SemiAccurate".


----------



## Awsan

Numbers-wise, if they had just made a conventional card with 4096 cores, it would have performed better than the R9 Fury does, considering how the R9 390X performs with 2816.


----------



## Alatar

Quote:


> Originally Posted by *Awsan*
> 
> Numbers wise if they just made a normal card with 4096 cores that would've had a better performance than the r9 fury considering how the r9 390x with 2816 performs


Linear scaling doesn't really happen.
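The arithmetic behind that point: scaling shader count alone only sets a theoretical ceiling, and the real gain falls short because ROPs, bandwidth, clocks, and drivers don't scale with it. A purely illustrative sketch (the shader counts are from the posts above; nothing here is measured data):

```python
# Naive shader-count scaling: 4096 vs. 2816 suggests a ~45% ceiling
# on raw shader throughput, but delivered performance is gated by
# ROPs (still 64), memory bandwidth, clocks, and drivers.
fury_x_shaders = 4096
r9_390x_shaders = 2816

naive_gain = fury_x_shaders / r9_390x_shaders - 1
print(f"naive scaling ceiling: +{naive_gain:.0%}")
```

Review numbers put the actual gap well below that ceiling, which is exactly the non-linear scaling being described.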


----------



## jamaican voodoo

I'm still getting the Fury X. Nvidia is a shady business, and DX12 will make this card shine, mark my words.


----------



## Elmy

295X2 @ 600.00

Fury X @ 650.00

Titan X @ 1000.00

980 Ti @ 650.00

Which one is faster?

I bet 90% of the people on these forums, if you were at an event with 4 identical systems sitting in front of you and were allowed to play each one for 15 minutes, wouldn't be able to tell the difference between these 4 cards. But the fact is, one of them is the cheapest and the fastest.


----------



## maarten12100

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i'm still getting fury x nvidia is a shady business, DX12 will make this card shine mark my words


Even if it doesn't you can use it for other purposes


Spoiler: Do you have what it takes to flick the switch?






Quote:


> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.


You're comparing dual GPUs to single GPUs, something we can't really do until DX12 comes in and makes dual GPUs more viable once and for all.


----------



## sugalumps

Quote:


> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.


You would be able to tell from the micro-stutter on the dual-GPU setup (295X2).


----------



## Cyclonic

Quote:


> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.


I would notice the 295X2's frame-skipping crap







And games crashing when CF doesn't work


----------



## Phaethon666

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i'm still getting fury x nvidia is a shady business, DX12 will make this card shine mark my words


By the time there are plenty of DX12 games, Pascal will be out and AMD will still be a year away from launching their next card. I'm not saying don't buy the Fury X, just know that we will be dealing with new DX11.1 releases for a long time. Maybe buying a card that runs DX11 better would be more prudent at this point.


----------



## jamaican voodoo

Quote:


> Originally Posted by *maarten12100*
> 
> Even if it doesn't you can use it for other purposes
> 
> 
> Spoiler: Do you have what it takes to flick the switch?


I guess winter is going to be quite comfortable this year, lol.


----------



## maarten12100

Quote:


> Originally Posted by *sugalumps*
> 
> Would be able to tell from the micro stutter on the dual setup(295x2).


This isn't a 7970 CF setup in 2013








If it works, it's mostly smooth sailing. But if there's no profile, then it's probably not so smooth.
Quote:


> Originally Posted by *Cyclonic*
> 
> I would notiche the 295x frame skipping crap
> 
> 
> 
> 
> 
> 
> 
> And game crashing when CF doenst work


Frame skipping huh?
Quote:


> Originally Posted by *jamaican voodoo*
> 
> i guess winter is going to be quite comfortable this year lol


If they drop in price I might get one too but Fury Nano is way more kawaii!


----------



## Ha-Nocri

The Fury X is actually faster than the 980 Ti at 1440p and 4K in all but older games. That's drivers, people. This card is a beast, well worth the price.

HardwareCanucks


----------



## Elmy

Quote:


> Originally Posted by *sugalumps*
> 
> Would be able to tell from the micro stutter on the dual setup(295x2).


Proof?


----------



## criminal

Quote:


> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.


I still want a 295X2.


----------



## edo101

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Fury X is actually faster than 980ti @1440p and 4k in older games. Those r drivers ppl. This card is a beast well worth the price
> 
> HardwareCanucks


Nonsense, get out of here. When have better drivers ever fixed anything?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Fury X is actually faster than 980ti @1440p and 4k in older games. Those r drivers ppl. This card is a beast well worth the price
> 
> HardwareCanucks


And then you woke up from your dream


----------



## sugalumps

Quote:


> Originally Posted by *maarten12100*
> 
> This isn't a 7970 CF setup in 2013
> 
> 
> 
> 
> 
> 
> 
> 
> If it works it is smooth sailing mostly. But if there is no profile then it is probably not so smooth.
> Frame skipping huh?
> If they drop in price I might get one too but Fury Nano is way more kawaii!


I am talking about both sides; this is not a dig at AMD. With dual/triple card setups it's always noticeable. It won't bother everyone, but it is more noticeable to some. Just because some may not notice it does not mean it's not there.


----------



## rizla1

Why do people keep posting "just you wait, Maxwell is coming with HBM2, AMD will have nothing"? AMD will also be releasing 14nm cards, and cards will be rushed out ASAP by both companies. Three generations of 28nm cards; they need to move on. Unless we get an early HD 4770 or 750 Ti-style test chip, they will both have a release in the same quarter. Q3 2016 maybe, more likely Q1 2017. Back on topic: what was AMD thinking with 64 ROPs? Don't get me wrong, they are competing, but there seem to be some shortfalls with Fiji.


----------



## criminal

Quote:


> Originally Posted by *Elmy*
> 
> Proof?


He has none in 2015. AMD fixed that issue a while back.


----------



## Ganf

Quote:


> Originally Posted by *sugalumps*
> 
> Maxwell aftermarket cards are quieter.


Quieter than passive cooling?


----------



## SKYMTL

Quote:


> Originally Posted by *iLeakStuff*
> 
> Can someone please help me understand this:
> 
> You have EVGA 980Ti Hybrid which is a watercooled overclocked 980Ti output 36.4dB during gaming (and idle).
> 
> 
> Then you have Fury X which is also water cooled, but the noise is 50.9dB!!
> 
> 
> The results above are both taken by the same guys, using the same equipment. So the results are most likely very true.
> As someone who was interested in the Fury X, I have to conclude that the cooler they use on the Fury X is really bad compared to the 980 Ti's cooler.
> Even at idle the 980 Ti is much quieter.
> 
> The hell?! SKYMTL from Hardware Canucks, can you explain this?
> The whole idea behind water cooling is less noise than air cooling. I might not have any other choice than going with Hybrid 980Ti it seems


It's BIAS!!!

j/k

I explained this below the chart actually. Our dB meter is tuned for both high and low frequencies, so high-pitched pump and coil whine causes spikes. At idle the card exhibits horrible pump bearing whine, while at load there's coil noise.

If you look at our YouTube video of the card (I follow the forum's rules so I can't post it here, sorry!) and fast forward to 8:10 you will see and hear what I am talking about.
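Side note for anyone reading those numbers: the dB scale is logarithmic, so the gap quoted above (36.4 dB vs 50.9 dB) is much bigger than it looks. A rough sketch — this is just the standard 10·log10 power relation, nothing specific to any particular meter:

```python
def power_ratio(db_low, db_high):
    """Ratio of sound power implied by two SPL readings (dB = 10*log10(P/P0))."""
    return 10 ** ((db_high - db_low) / 10)

# Figures quoted above: 980 Ti Hybrid at 36.4 dB vs Fury X at 50.9 dB under load
print(f"{power_ratio(36.4, 50.9):.0f}x the sound power")  # ~28x
```

That's roughly 28 times the sound power, which is why a high-pitched whine spiking the meter matters so much.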


----------



## Themisseble

I prefer the R9 Fury X over the GTX 980 Ti. I can see that there is no performance difference and that the GTX 980 Ti overclocks great. But the problem is that I wouldn't overclock a GPU that costs $650, and I also wouldn't OC the Fury X. I would OC smaller, cheaper cards, or only for benchmarking, which actually takes a lot of time; time I could spend on something even better. I love that the Fury X uses only 220W in a gaming loop, which is less than the GTX 980 Ti.

I think that AMD made a great product. I know that some people will not agree with me, but I cannot blame them...
Reasons why I like the R9 Fury X:
- better long-term GPU support (if we remember Kepler support in the latest games)
- better for 4K (I prefer 4K with medium-high details, no MSAA)
- HBM (trying HBM)
- 8.6 TFLOPS; I know this is not as important, but I will need compute
- AMD didn't hype it as much as fans did

Reasons why I hate it:
- no HDMI 2.0 (how, how, AMD?!)

Reasons why I don't like NVIDIA:
- no reasons at all (actually one reason: NVIDIA titles cripple AMD GPUs). This is what I hate the most!! I just hope that AMD won't do the same in the future.
- I just don't want to see the GTX 980 Ti lose support after Pascal arrives

Reason why I won't buy a GTX 980 or Fury X:
- R9 Nano
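A quick sanity check on that 8.6 TFLOPS number, assuming the stock specs (4096 stream processors at 1050 MHz; peak single precision is shaders × clock × 2 FLOPs per clock for fused multiply-add):

```python
def peak_tflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak single-precision throughput: shaders * clock * FLOPs/clock (2 for FMA)."""
    return shaders * clock_mhz * 1e6 * flops_per_clock / 1e12

# Fury X: 4096 stream processors at 1050 MHz -> 8.6 TFLOPS
print(f"Fury X: {peak_tflops(4096, 1050):.1f} TFLOPS")
# 980 Ti for comparison: 2816 CUDA cores at the 1000 MHz base clock -> 5.6 TFLOPS
print(f"980 Ti: {peak_tflops(2816, 1000):.1f} TFLOPS")
```

Peak numbers only, of course; real game throughput depends on ROPs, occupancy, and memory, which is exactly why the raw-FLOPS lead doesn't show up in the benchmarks.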


----------



## sugalumps

Quote:


> Originally Posted by *Ganf*
> 
> Quieter than passive cooling?




?


----------



## maarten12100

Quote:


> Originally Posted by *Ganf*
> 
> Quieter than passive cooling?


I had a passive card that made more noticeable noise than a card with a fan. Electrical noise is the worst kind: high-pitched.








Why anyone thought a fanless HD 4850 was a good idea, in an era when cases had horrible airflow, is beyond me. It ran at 90°C and throttled to hell and back before I punched some holes in the case for extra fans.


----------



## edo101

Quote:


> Originally Posted by *sugalumps*
> 
> 
> 
> ?


Didn't someone say there was an issue, since fixed, that makes it the quietest card yet? Pretty sure it's been mentioned.


----------



## tiborrr12

For everybody tired of pump grinding noise take a look at this: http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


----------



## MapRef41N93W

Quote:


> Originally Posted by *Themisseble*
> 
> I prefer the R9 Fury X over the GTX 980 Ti. I can see that there is no performance difference and that the GTX 980 Ti overclocks great. But the problem is that I wouldn't overclock a GPU that costs $650, and I also wouldn't OC the Fury X. I would OC smaller, cheaper cards, or only for benchmarking, which actually takes a lot of time; time I could spend on something even better. I love that the Fury X uses only 220W in a gaming loop, which is less than the GTX 980 Ti.
> 
> I think that AMD made a great product. I know that some people will not agree with me, but I cannot blame them...
> Reasons why I like the R9 Fury X:
> - better long-term GPU support (if we remember Kepler support in the latest games)
> - better for 4K (I prefer 4K with medium-high details, no MSAA)
> - HBM (trying HBM)
> - 8.6 TFLOPS; I know this is not as important, but I will need compute
> - AMD didn't hype it as much as fans did
> 
> Reasons why I hate it:
> - no HDMI 2.0 (how, how, AMD?!)
> 
> Reasons why I don't like NVIDIA:
> - no reasons at all (actually one reason: NVIDIA titles cripple AMD GPUs). This is what I hate the most!! I just hope that AMD won't do the same in the future.
> - I just don't want to see the GTX 980 Ti lose support after Pascal arrives
> 
> Reason why I won't buy a GTX 980 or Fury X:
> - R9 Nano


So basically no reasons it's better since everything you said was either false (it being faster at 4K), or makes absolutely no difference/is grasping at straws.

Let me give you one and only one reason to buy a GTX 980ti that is 100x more relevant than any of your reasons.

-It's faster at every resolution


----------



## maarten12100

Quote:


> Originally Posted by *Themisseble*
> 
> I prefer the R9 Fury X over the GTX 980 Ti. I can see that there is no performance difference and that the GTX 980 Ti overclocks great. But the problem is that I wouldn't overclock a GPU that costs $650, and I also wouldn't OC the Fury X. I would OC smaller, cheaper cards, or only for benchmarking, which actually takes a lot of time; time I could spend on something even better. I love that the Fury X uses only 220W in a gaming loop, which is less than the GTX 980 Ti.
> 
> I think that AMD made a great product. I know that some people will not agree with me, but I cannot blame them...
> Reasons why I like the R9 Fury X:
> - better long-term GPU support (if we remember Kepler support in the latest games)
> - better for 4K (I prefer 4K with medium-high details, no MSAA)
> - HBM (trying HBM)
> - 8.6 TFLOPS; I know this is not as important, but I will need compute
> - AMD didn't hype it as much as fans did
> 
> Reasons why I hate it:
> - no HDMI 2.0 (how, how, AMD?!)
> 
> Reasons why I don't like NVIDIA:
> - no reasons at all (actually one reason: NVIDIA titles cripple AMD GPUs). This is what I hate the most!! I just hope that AMD won't do the same in the future.
> - I just don't want to see the GTX 980 Ti lose support after Pascal arrives
> 
> Reason why I won't buy a GTX 980 or Fury X:
> - R9 Nano


Good news: there will allegedly be an active adapter for your HDMI needs








But who knows about Fury Nano.


----------



## mouacyk

Quote:


> Originally Posted by *MapRef41N93W*
> 
> NVIDIA cards won't be able to use DX12/Vulkan according to this user ^


That's quite a stretch... it seems fair to point out Mantle advantages for the Fury X.
Quote:


> Originally Posted by *EK_tiborrr*
> 
> For everybody tired of pump grinding noise take a look at this: http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


Right in the nick of time, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *EK_tiborrr*
> 
> For everybody tired of pump grinding noise take a look at this: http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


EK to the rescue. I just hope the air-cooled Fury is the full chip so we can put that $100 toward an EK block.


----------



## tconroy135

The EK waterblock looks really nice. Even though I have been crapping on the Fury X it will be interesting to see what happens under a custom loop.


----------



## BigMack70

There are 3 reasons I can see to buy a Fury X:
1) You are curious about new tech (HBM)
2) You dislike Nvidia and/or just want to support AMD
3) You want to build a small form factor PC and will benefit from the smaller physical form factor of the card

Somehow I don't think that will do much for their market share


----------



## Themisseble

I have been looking at Tom's Hardware,

and at this one for OC benchmarks: a GTX 980 Ti at a 1400 MHz core clock barely beats the Fury X at 1200 MHz.
https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url

P.S. Some of them also OCed the HBM.


----------



## Offender_Mullet

AMD's problem isn't innovating. They've obviously done that quite a few times. Their issue is falling quickly behind after they innovate said tech, letting their competition gain ground in which they either barely recover from or don't recover at all. They should really revamp most of their card pricing a.s.a.p. Also, their marketing department and driver engineers need a complete overhaul.

Kudos to them for using HBM, but they probably should have let their board partners create Fury PCBs instead. I'm a bit surprised they didn't, actually.

On a random note: I believe (someone correct me if I'm wrong) Richard Huddy said last week @ the PC Gamer E3 show, quote: "you can overclock the hell out of these cards".


----------



## maarten12100

Quote:


> Originally Posted by *Offender_Mullet*
> 
> On a random note: I believe (someone correct me if I'm wrong) Richard Huddy said last week @ the PC Gamer E3 show, quote: "you can overclock the hell out of these cards".


Nah, it wasn't Huddy; it was some guy with hippie hair.


----------



## mouacyk

Quote:


> Originally Posted by *Themisseble*
> 
> I have been looking at tomshardware.


...
Quote:


> Originally Posted by *Offender_Mullet*
> 
> AMD's problem isn't innovating. They've obviously done that quite a few times. Their issue is falling quickly behind after they innovate said tech, letting their competition gain ground in which they either barely recover from or don't recover at all. They should really revamp most of their card pricing a.s.a.p. Also, their marketing department and driver engineers need a complete overhaul.
> 
> Kudos to them for using hbm but they probably should have let their board partners create Fury pcb's instead. I'm a bit surprised they didn't actually.
> 
> On a random note: I believe (someone correct me if I'm wrong) Richard Huddy said last week @ the PC Gamer E3 show, quote: "you can overclock the hell out of these cards".


Umm... no, you can't. This is a flagship GPU -- what have they innovated, besides throwing everything together on a 65nm (8-year-old litho) interposer?


----------



## sugalumps

Quote:


> Originally Posted by *Offender_Mullet*
> 
> AMD's problem isn't innovating. They've obviously done that quite a few times. Their issue is falling quickly behind after they innovate said tech, letting their competition gain ground in which they either barely recover from or don't recover at all. They should really revamp most of their card pricing a.s.a.p. Also, their marketing department and driver engineers need a complete overhaul.
> 
> Kudos to them for using hbm but they probably should have let their board partners create Fury pcb's instead. I'm a bit surprised they didn't actually.
> 
> On a random note: I believe (someone correct me if I'm wrong) Richard Huddy said last week @ the PC Gamer E3 show, quote: "you can overclock the hell out of these cards".


The problem is that for gaming their innovation is pointless; look at the FX CPUs. They went way ahead of the market and gave us 8 cores when the bulk of games only use 2. Then HBM instead of more VRAM, when developers are bloating the amount of VRAM being used, not its bandwidth. I respect them for trying to innovate; it's just pointless getting ahead of the game. They have spent all their R&D only to fall short of Nvidia's old memory tech anyway, while Nvidia must be making a killing off each 980 Ti, because they played it safe.


----------



## aDyerSituation

Disappointed, but at the same time not really. I see it passing the 980 Ti soon with driver improvements. To be fair, a lot of these reviews contain games like Project CARS which run horribly on AMD cards.

Still going to buy one of the furies regardless. AMD needs all the support they can get, and nvidia just doesn't deserve it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *maarten12100*
> 
> Nah it wasn't huddy it was some guy with hippy hair.


290X - Horrible stock cooler -> AMD (Not really overclocking friendly)
Fury X - Good AIO -> AMD (Overclockers Dream)

Cooling alone does little for overclocking. If cooling were all that mattered, my R9 290X would be hitting 1400 MHz, but it does not go above 1275 MHz.


----------



## Themisseble

http://us.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overklocking-results

HBM OC-ed?


----------



## provost

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i'm still getting fury x nvidia is a shady business, DX12 will make this card shine mark my words


I don't know about marking your words, but my crashing drivers and limited optimization for GK110 Titans, fortunately or unfortunately, keep reminding me why I don't want to buy Nvidia cards right now.

I am going to buy one, and if it all goes well, maybe I will go CrossFire. So, where the heck are these in stock for $650?


----------



## Ha-Nocri

Quote:


> Originally Posted by *aDyerSituation*
> 
> Disappointed, but at the same time not really. I see it passing up the 980 Ti soon with driver improvements. To be fair a lot of these reviews contain games like project cars which run horribly on AMD cards.
> 
> Still going to buy one of the furies regardless. AMD needs all the support they can get, and nvidia just doesn't deserve it.


Indeed. Plus they don't test the games that have Mantle using Mantle.








Too bad AMD isn't faster with drivers. It hurts them on release days


----------



## Rocozaur

@TheMentalist: You should also add Lab501 review up on the list









http://lab501.ro/placi-video/review-amd-radeon-r9-fury-x-furia-rosie-turbo


----------



## edo101

Quote:


> Originally Posted by *aDyerSituation*
> 
> Disappointed, but at the same time not really. I see it passing up the 980 Ti soon with driver improvements. To be fair a lot of these reviews contain games like project cars which run horribly on AMD cards.
> 
> Still going to buy one of the furies regardless. AMD needs all the support they can get, and *nvidia just doesn't deserve it.*


No they don't. Not with what they've done lately. Some people just don't care though


----------



## PontiacGTX

Another review
http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693
Quote:


> Originally Posted by *SKYMTL*
> 
> Makes you wonder what will happen with Battlefront...


If BF4 and BF3 are anything to go by, it won't be a console port.


----------



## Sickened1

Quote:


> Originally Posted by *Themisseble*
> 
> http://us.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overklocking-results
> 
> HBM OC-ed?


Ummm is this real??? Wut.


----------



## geoxile

Quote:


> Originally Posted by *mouacyk*
> 
> ...
> Umm... no you can't. This is a flagship GPU -- what have they innovated, besides throwing everything together with a 65nm (8-year old lith) interposer?


HBM, along with the particular interposer and interconnects used in conjunction with HBM, were all designed by AMD.

http://techreport.com/review/28294/amd-high-bandwidth-memory-explained
Quote:


> Making this sort of innovation happen was a broadly collaborative effort. *AMD did much of the initial heavy lifting, designing the interconnects, interposer, and the new DRAM type*. Hynix partnered with AMD to produce the DRAM, and UMC manufactured the first interposers. JEDEC, the standards body charged with blessing new memory types, gave HBM the industry's blessing, which means this memory type should be widely supported by various interested firms. HBM made its way onto Nvidia's GPU roadmap some time ago, although it's essentially a generation behind AMD's first implementation.


Hynix and UMC helped them produce it, but AMD created the tech.

As for the Fury X, AMD should seriously consider working closer with game devs, creating their own gameworks programs, exclusive features, etc. It's bad enough Nvidia is slightly ahead of them in hardware, but the real problem is that Nvidia's grip around devs is much stronger. Hopefully Lisa Su will see to it, she seems to be a lot more domineering than previous CEOs and more pragmatic.


----------



## Themisseble

Quote:


> Originally Posted by *Sickened1*
> 
> Ummm is this real??? Wut.


me also WuuT!


----------



## tconroy135

Quote:


> Originally Posted by *aDyerSituation*
> 
> Disappointed, but at the same time not really. I see it passing up the 980 Ti soon with driver improvements. To be fair a lot of these reviews contain games like project cars which run horribly on AMD cards.
> 
> Still going to buy one of the furies regardless. AMD needs all the support they can get, and nvidia just doesn't deserve it.


If this happens to be the case, why in the hell wouldn't AMD wait until they have a better driver to release the card? Sales are often very dependent on initial reviews, so this would be the worst marketing strategy I can imagine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Indeed. Plus they don't test games that have mantle in mantle
> 
> 
> 
> 
> 
> 
> 
> 
> Too bad AMD isn't faster with drivers. It hurts them on release days


Yeah. That is a big problem for AMD. The HD 7970 was like 20% faster than the GTX 580 with release drivers; now it's almost as fast as a GTX 780. For single GPU I have no problem recommending this card to people. Dual GPU? Hell no: with DX11 and dual GPUs there's just not enough CPU overhead if you are at 1080p/1440p. Maybe once DX12 starts popping up this card might be considered faster. In a way it's better for a card to get faster as it gets older, because games will be more demanding.


----------



## Sickened1

http://us.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overklocking-results

Well, if this is accurate, that 100 MHz bump in memory frequency is very impressive in terms of the score increase it produced.
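To put that bump in perspective: bandwidth scales linearly with memory clock on HBM's 4096-bit bus (stock HBM1 runs 500 MHz, double data rate), so +100 MHz is a 20% bandwidth jump. A quick sketch of the arithmetic:

```python
def bandwidth_gbs(bus_bits, clock_mhz, data_rate=2):
    """Memory bandwidth in GB/s: bus width in bytes * effective transfers/s."""
    return (bus_bits / 8) * clock_mhz * 1e6 * data_rate / 1e9

stock = bandwidth_gbs(4096, 500)  # Fury X stock: 512 GB/s
oc = bandwidth_gbs(4096, 600)     # the +100 MHz result linked above
print(f"{stock:.0f} GB/s -> {oc:.0f} GB/s (+{(oc / stock - 1) * 100:.0f}%)")
```

Whether the memory controller actually applies those clocks is a separate question, as SKYMTL notes; this is only what the numbers would mean if it did.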


----------



## Alatar

Quote:


> Originally Posted by *PontiacGTX*
> 
> Another review
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693
> if BF4 and BFIII are the same, it wont be a console port


pcghw.de disables GPU boost on Nvidia cards in all their tests so their reviews are also somewhat pointless when comparing between AMD/NV


----------



## SKYMTL

Quote:


> Originally Posted by *rizla1*
> 
> Why do people keep posting " just you wait, maxwell is coming with hbm2 amd will have nothing". AMD will also be releasing 14nm cards and cards will be rushed asap by both companies.. 3 gen 28nm cards they need to move on.. Unless we get an early hd 4770 or 750ti.. They will both have a release in the same quarter. Q3 2016 maybe. More likly q1 2017. Back on topic , what was amd thinking with 64 rops? Dont get me wrong. They are competing but seems to be some shortfalls with fiji


It's getting really tiring.

HD 7970 GHz gets beat literally weeks after launch: WAIT till you see Hawaii!

Hawaii turns out to be a power hungry, hot running card: WAIT for those custom cards!!

GTX 780 Ti overcomes custom Hawaii cards: WAIT till you see Sea Islands!!

AMD skips Sea Islands, NVIDIA moves to Maxwell: FIJI is COMING!!! TITAN X Beater!

Fiji turns out to match or lose to GM200: 14nm is right around the corner!

Honestly, people need to stop waiting. Just take the plunge already!
Quote:


> Originally Posted by *EK_tiborrr*
> 
> For everybody tired of pump grinding noise take a look at this: http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


Pump noise is less of a problem at load. I'm guessing the waterblock doesn't overcome coil whine?








Quote:


> Originally Posted by *Themisseble*
> 
> http://us.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overklocking-results
> 
> HBM OC-ed?


Yeah, no. Even Afterburner, which reports the ability to set higher HBM speeds, doesn't actually apply them.


----------



## Woundingchaney

Well, it looks like with driver maturation it will be dead even with a 980 Ti, which in itself is impressive enough. How the card OCs will determine which one is the solid choice at this price point. If anything, it's very nice to see AMD competing for the performance crown once again.

It's not so much the Fury series that I'm interested in, but what they launch to compete with Pascal. I could easily see myself going back to the red team; I would just like a little more confidence in their CrossFire drivers.


----------



## PontiacGTX

Quote:


> Originally Posted by *Alatar*
> 
> pcghw.de disabled GPU boost on Nvidia cards in all their tests so their reviews are also somewhat pointless when comparing between AMD/NV


Why, if AMD doesn't have boost...?


----------



## wermad

Quote:


> Originally Posted by *canislupusan*
> 
> [/SPOILER]
> And still manages to draw more power overall...


Fans draw very little power. I have 66 Corsair fans; at most, at full power (12V), I'm pulling under 140W, but I typically keep them all at 4.8V. The Fury and the Hybrid only have one and two fans (respectively), so the power draw has nothing to do with the cooling (even if you factor in the small amount the pump also uses on both). AMD has not been shy about the power usage of their top-tier cards. I should know, seeing how they warn you to ensure the proper power setup the 295X2 needs.
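Quick math for the skeptics, using the 140 W worst-case figure above, plus the rough fan-law approximation that power scales with about the cube of speed (and speed roughly with voltage) — ballpark only for real DC fans:

```python
fans = 66
total_w_12v = 140                      # worst case at full 12 V, per the post above
print(f"~{total_w_12v / fans:.1f} W per fan at 12 V")   # ~2.1 W each

# Rough fan-law estimate of total draw when undervolted to 4.8 V
est_4v8 = total_w_12v * (4.8 / 12) ** 3
print(f"~{est_4v8:.0f} W total at 4.8 V (ballpark only)")  # ~9 W
```

Either way, a single fan (or two) is a rounding error next to a 250W+ GPU, so the fan count explains none of the power-draw gap.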









Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Fury X is 6 months too late. If it came out before Titan X then it would have been much different.


QFT. Yeah, back in 2014 this would have been a beast; at the latest in the fall, for the anniversary of Hawaii.


----------



## Alatar

Quote:


> Originally Posted by *PontiacGTX*
> 
> why if amd doesnt have boost...


AMD does actually have boost. They just report the max boost clock while NV reports the min clock.

So essentially in pcghw tests AMD cards are at stock while they manually downclock the Nvidia cards.


----------



## tconroy135

Quote:


> Originally Posted by *SKYMTL*
> 
> It's getting really tiring.
> 
> HD 7970 GHz gets beat literally weeks after launch: WAIT till you see Hawaii!
> 
> Hawaii turns out to be a power hungry, hot running card: WAIT for those custom cards!!
> 
> GTX 780 Ti overcomes custom Hawaii cards: WAIT till you see Sea Islands!!
> 
> AMD skips Sea Islands, NVIDIA moves to Maxwell: FIJI is COMING!!! TITAN X Beater!
> 
> Fiji turns out to match or lose to GM200: 14nm is right around the corner!
> 
> Honestly, people need to stop waiting. Just take the plunge already!
> Pump noise is less of a problem at load. I'm guessing the waterblock doesn't overcome coil whine?
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, no. Even Afterburner, which reports the ability for higher HBM speeds doesn't actually apply them.


I used to buy AMD products; ^this^ is why I don't anymore.


----------



## SKYMTL

Quote:


> Originally Posted by *Alatar*
> 
> pcghw.de disabled GPU boost on Nvidia cards in all their tests so their reviews are also somewhat pointless when comparing between AMD/NV


Where's the shocked emoticon? Seriously? Link?


----------



## Themisseble

Quote:


> Originally Posted by *SKYMTL*
> 
> It's getting really tiring.
> 
> HD 7970 GHz gets beat literally weeks after launch: WAIT till you see Hawaii!
> 
> Hawaii turns out to be a power hungry, hot running card: WAIT for those custom cards!!
> 
> GTX 780 Ti overcomes custom Hawaii cards: WAIT till you see Sea Islands!!
> 
> AMD skips Sea Islands, NVIDIA moves to Maxwell: FIJI is COMING!!! TITAN X Beater!
> 
> Fiji turns out to match or lose to GM200: 14nm is right around the corner!
> 
> Honestly, people need to stop waiting. Just take the plunge already!
> Pump noise is less of a problem at load. I'm guessing the waterblock doesn't overcome coil whine?
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, no. Even Afterburner, which reports the ability for higher HBM speeds doesn't actually apply them.


Here you have both OCed:
https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url

Maybe HBM can actually OC quite well... we will see. Not expecting too much.

AMD touted fast mouse input? Wut?


----------



## Alastair

I don't think the Fury clocks well at the moment because it's on locked volts. I imagine that in order to get as much power efficiency as possible, AMD used as low a voltage as they could, plus a tiny bit more to maintain stability. I doubt AMD would say it's an overclocker's dream if their in-house testing didn't show it. So I think the fact that it's voltage locked for now is what's holding it back.


----------



## PontiacGTX

4k review
http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/#abschnitt_tests_in_3840__2160_ultra_hd_4k
Quote:


> Originally Posted by *SKYMTL*
> 
> Where's the shocked emoticon? Seriously? Link?


this


----------



## maarten12100

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X - Horrible stock cooler -> AMD (Not really overclocking friendly)
> Fury X - Good AIO -> AMD (Overclockers Dream)
> 
> Cooling alone does nothing for overclocking. If cooling did matter my R9 290X would be hitting 1400MHz but it does not go above 1275MHz.


They talked about how it had 400A worth of rails to power the board's components. I really do hope it wasn't some idiot who thinks a cool card does much better at overclocking...


----------



## zealord

Quote:


> Originally Posted by *Woundingchaney*
> 
> Well looks like with driver maturation it will be dead even with a 980ti, which in itself is impressive enough. Depending on how the card will OC will determine which card is the solid choice at the price point. If anything its very nice to see AMD competing for the performance crown once again.
> 
> Its not so much the Fury series that I am interested, but what they launch to compete with Pascal. I could easily see myself going back to the red team. I would just like a little more confidence in their Xfire drivers.


I think they have had enough time to make good drivers for the Fiji cards.

Believe me when I tell you that I have been waiting since December 2014 for a driver that fixes Metal Gear Solid V: Ground Zeroes on my 290X.
A 290X performs on par with a GTX 770 in this game.


----------



## Attomsk

I don't think you should buy a videocard based on your hopes that drivers will improve performance drastically. Seems like a poor purchase decision.


----------



## Asmodian

Quote:


> Originally Posted by *Horsemama1956*
> 
> Another one I forgot to add: why did AMD give up on Boost? Personally I hate it, but it seems to account for the difference in these reviews, and in reviews in recent years in general. In "stock" situations the Nvidia cards are getting a 100+ MHz increase in clock speeds, which is obviously going to show in benchmarks.


It is all about power usage. Power usage is throttling both Nvidia and AMD right now. Nvidia cards clock up when running non-power virus loads (like Crysis 3) and down clock when power usage goes too high (like furmark). AMD cannot boost clocks above stock without melting or blowing VRMs. At ~100 MHz over stock Fiji seems to hit a stability wall too, that doesn't leave much room for boosting.
Quote:


> Originally Posted by *zealord*
> 
> I really had a lot of hope for the Fury X. AMD gambled with HBM, but they should've stuck with GDDR5 for the 28nm generation and released this card earlier this year, before the Titan X, with 8GB of GDDR5.


That the Fury exists at all is because it uses HBM. It would be very difficult for AMD to make the Fury with GDDR5 because it is already pulling over 400W (Anandtech's Crysis 3 load power figure); the power savings from HBM are required to implement this many GCN 1.2 shaders on 28nm within a reasonable power budget.


----------



## Rickles

Quote:



> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.


I bet 100% of the people would notice when they tab out of a game to respond to a message.

hint: only one of those MUST run in fullscreen.


----------



## Assirra

Quote:


> Originally Posted by *PontiacGTX*
> 
> why if amd doesnt have boost...


Because you are artificially limiting one of them, making a head-to-head pointless.
One of the key features of Nvidia GPUs is that boost; you cannot just ignore it.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Themisseble*
> 
> *- better support for GPU ( if we remmember kepler support in latest games)...*
> *- better for 4K ( I prefer 4K with medium-high details, no msaa)*


Kepler was patched for performance in the last two drivers. How about CrossFire with FreeSync, or HDMI 2.0, on that AMD product? See, I can make silly statements too; mine are just factual.

Frankly, the Fury X performs worse than expected. The expectation, at least for myself, was that it would trade blows with the 980 Ti; it clearly doesn't. It might have a slight upper hand in a couple of games, but it gets whooped in the others.


----------



## maarten12100

Quote:


> Originally Posted by *Alatar*
> 
> AMD does actually have boost. They just report the max boost clock while NV reports the min clock.
> 
> So essentially in pcghw tests AMD cards are at stock while they manually downclock the Nvidia cards.


Still useful for clock for clock comparisons though.


----------



## ZealotKi11er

Back when the HD 7970 came out at 925 MHz and most reviews got it to 1125 MHz on stock volts, we knew it would be a good overclocker. The Fury X should have been clocked at 900 MHz, matching the GTX 980 Ti at that clock, and OCing to 1200 MHz, because GCN just does not go beyond that.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Back when the HD 7970 came out at 925 MHz and most reviews got it to 1125 MHz on stock volts, we knew it would be a good overclocker. The Fury X should have been clocked at 900 MHz, matching the GTX 980 Ti at that clock, and OC'd to 1200 MHz, because GCN just does not go higher than that.


The big question is: what about those "overclocker's dream" statements from AMD?


----------



## hamzta09

Quote:


> Originally Posted by *PontiacGTX*
> 
> Another review
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693
> if BF4 and BFIII are the same, it wont be a console port


Nah it'll just be a really long Beta.


----------



## Alatar

Quote:


> Originally Posted by *SKYMTL*
> 
> Where's the shocked emoticon? Seriously? Link?


They started doing it during the 680 release. When GK110 came out it resulted in some awesome reviews that basically said 7970 CFX = Titan SLI. I can't be bothered with digging those up because normal googling doesn't work with German articles (since I don't know German).

Here's the current stance however:
Quote:


> Since the exact clock speeds always also depend on the load and cooling situation, we give, as usual, three performance scenarios in our benchmarks:
> 
> at the base clock of ~1,000 MHz
> at the standard boost of ~1,075 MHz
> and with free-running boost (on average 1,136-1,150 MHz)


https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=fi&ie=UTF-8&u=http%3A%2F%2Fwww.pcgameshardware.de%2FNvidia-Geforce-Grafikkarte-255598%2FTests%2FGTX-980-Ti-Test-Benchmarks-1159952%2F&edit-text=&act=url

The different versions (base, locked minimum boost, and "freeboost" (actual stock)) are clearly indicated in the performance charts of the review. So basically when their chart says x MHz they actually mean it. If it's a variable frequency (as in the actual stock setting) they label it "free boost". If a pcghw.de chart says that the card has been clocked at a certain frequency it means that they don't want it to vary from that frequency so they'll lock it there.

And in the case of normal game performance reviews, or reviews of some AMD card, that means they won't be running multiple NV results per card; they'll always default to locking the cards to the reported boost frequency, in other words the minimum boost frequency.

And this of course means that in the pcghw.de Fury X review the 980Ti for example is running at exactly 1075MHz. Their normal standard option.


----------



## PontiacGTX

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Kepler was patched in the last two drivers for performance. How about your Crossfire with FreeSync or that HDMI 2.0 on that AMD product? See, I can make silly statements too; mine are just factual.
> 
> Frankly Fury X performs less than expected. Expectations, at least for myself, was trading blows with the 980 Ti - it clearly doesn't. It might have a slight hand up in a couple games, but it gets whooped in the others.


The patch was just for TW3 and pCARS... neither solved the Kepler issue.


----------



## Phaethon666

Not to get too far off topic, but has any reviewer tried putting the rad in a push-pull config yet? There doesn't seem to be an extra fan connector built into the card, so I don't know if it's even possible to sync the fans, but I am still curious.


----------



## iateab

Sad news. Was hoping to blaze a trail with HBM. Just ordered a 980 Ti.


----------



## zealord

Quote:


> Originally Posted by *Asmodian*
> 
> That the fury exists at all is because it uses HBM. It would be very difficult for AMD to make the Fury with GDDR5 because it is already pulling over 400W (Anandtech's Crysis 3 load power usage), the power savings from HBM are required to implement this many GCN 1.2 shaders on 28nm within a reasonable power budget.


Yeah, a couple of people have told me by now. It is cool to know why stuff works like that, but in the end I am just a customer who cares about the final product that is presented to me. So far the Fury X looks like a very disappointing GPU for $649.

Nvidia managed to bring out a product with a big die the same size as Fiji and 6GB of GDDR5 for $649 that performs better and overclocks better, so it is definitely not impossible to do. HBM is cool, innovation is cool, moving forward is cool, but hitting the wrong timing and rendering a GPU useless by limiting it to 4GB, while being slower than a competitor's product at the same price, is not cool for the people who plan on buying a GPU.

The only thing I can do is vote with my wallet, and I intend to do so by buying neither the Fury X nor the 980 Ti at $649. I guess I will keep my 290X a while longer.


----------



## BigMack70

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The big question is; what about those "Overclockers dream" statements from AMD?


I think they meant you'll only ever dream of getting a good overclock out of the card.


----------



## criminal

Quote:


> Originally Posted by *EK_tiborrr*
> 
> For everybody tired of pump grinding noise take a look at this: http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


Sexy.








Quote:


> Originally Posted by *zealord*
> 
> Yeah a couple of people have told me by now. It is cool to know why stuff works like that, but in the end I am just a customers who cares for the final product that is presented to me. So far it looks like the Fury X is a very disappointing GPU for 649$.
> 
> Nvidia manages to bring a product that has a big die the same size as Fiji and 6GB GDDR5 for 649$ performs better and overclocks better. So it is definitely not impossible to do so. HBM is cool, innovation is cool, moving forward is cool, but hitting the wrong timing and completely rendering a GPU useless by having it limited to 4GB and being slower than a competitors product that comes in at the same price is not cool for the people that plan on buying a GPU.
> 
> The only thing I can do is vote with my wallet and I intend on doing so by neither buying Fury X nor 980 Ti for 649$. I guess I will keep my 290X a while longer.


I wanted to give AMD money, but really not worth it over my 980. Oh well.

Now I am really curious of the performance and price of Nano. Seems like it will end up around 970 speeds, so pricing will be crucial.


----------



## Sheyster

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The big question is; what about those "Overclockers dream" statements from AMD?


Maybe it was meant as "overclockers will dream it OC's better"?

BAH - BMAC beat me to it!


----------



## PontiacGTX

Quote:


> Originally Posted by *Alatar*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> They started doing it during the 680 release. When GK110 came out it resulted in some awesome reviews that basically said 7970CFX = Titan SLI. I can't be bothered with digging those up because the normal googling doesn't work with german articles (since I don't know german).
> 
> Here's the current stance however:
> https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=fi&ie=UTF-8&u=http%3A%2F%2Fwww.pcgameshardware.de%2FNvidia-Geforce-Grafikkarte-255598%2FTests%2FGTX-980-Ti-Test-Benchmarks-1159952%2F&edit-text=&act=url
> 
> The different versions (base, locked minimum boost, and "freeboost" (actual stock)) are clearly indicated in the performance charts of the review. So basically when their chart says x MHz they actually mean it. If it's a variable frequency (as in the actual stock setting) they label it "free boost". If a pcghw.de chart says that the card has been clocked at a certain frequency it means that they don't want it to vary from that frequency so they'll lock it there.
> 
> And in the case of normal game performance reviews or reviews of some AMD card that means that they wont be running multiple NV results per card and they'll always default to the option of locking the cards to the reported boost frequency, or in other words the minimum boost frequency.
> 
> And this of course means that in the pcghw.de Fury X review the 980Ti for example is running at exactly 1075MHz. Their normal standard option.


Where is AMD's boost?


----------



## zealord

Quote:


> Originally Posted by *criminal*
> 
> Sexy.
> 
> 
> 
> 
> 
> 
> 
> 
> I wanted to give AMD money, but really not worth it over my 980. Oh well.
> 
> *Now I am really curious of the performance and price of Nano. Seems like it will end up around 970 speeds, so pricing will be crucial.*


It should be a bit faster, I think. They said 2x the efficiency of the 290X, and the Nano runs at 175 W, so I'd assume it is maybe 10-15% faster than a 290X. Perhaps around 980 levels.
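That 2x-efficiency claim can be sanity-checked with a quick back-of-envelope calculation. A sketch, assuming a ~290 W board power for the 290X (my assumption, not an official AMD figure):

```python
# Back-of-envelope estimate of Nano performance from AMD's marketing claims.
# Assumption (not an official figure): R9 290X board power of ~290 W.
R9_290X_POWER_W = 290        # assumed 290X board power
NANO_POWER_W = 175           # AMD's stated Nano board power
PERF_PER_WATT_GAIN = 2.0     # AMD's "2x the efficiency of the 290X" claim

# perf = (perf/W) * W, so the Nano's performance relative to a 290X is
# the efficiency gain scaled by the ratio of the two power budgets.
relative_perf = PERF_PER_WATT_GAIN * NANO_POWER_W / R9_290X_POWER_W
print(f"Nano ~ {relative_perf:.2f}x a 290X ({relative_perf - 1:.0%} faster)")
# -> Nano ~ 1.21x a 290X (21% faster)
```

With that assumed 290 W figure the claims imply roughly 20% over a 290X; assume a lower real-world gaming draw for the 290X and the implied gap only grows, which is why estimates range from 10-15% up to 980 levels.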


----------



## Slomo4shO

Quote:


> Originally Posted by *SKYMTL*
> 
> Just take the plunge already!


Been saying it for nearly two years now, wait for 14/16nm GPUs to upgrade...


----------



## aDyerSituation

The 390/x is really starting to seem more and more appealing


----------



## szeged

Quote:


> Originally Posted by *PontiacGTX*
> 
> where is amd boost?


amd cards boost and show their maximum under load clock

nvidia cards boost and show their minimum under load clock

why is that so hard to understand, amd cards do boost up but in a different way, so disabling nvidia boost on cards for reviews is basically handicapping it just so you can be like " look guys i was right, amd is better all along yay go me"


----------



## pengs

Quote:


> Originally Posted by *maarten12100*
> 
> They talked about how it had 400A worth of rails to power the board's components. I really do hope it wasn't some idiot that thinks a cool card does much better at overclocking...


Leakage? The Poole-Frenkel effect. The core and its power delivery are also affected by it, which causes instability. It's not as if keeping it cool does nothing for it.


----------



## fatmario

Didn't AMD mention the Fury Nano was going to be significantly faster than the R9 290X? That would put it right near the GTX 980's spot. How come some of the benches show the Fury X close to the GTX 980?


----------



## Alatar

Quote:


> Originally Posted by *aDyerSituation*
> 
> The 390/x is really starting to seem more and more appealing


Buy a 290X (8GB version if you want) while you still can and while they're still cheap.


----------



## Slomo4shO

Quote:


> Originally Posted by *aDyerSituation*
> 
> The 390/x is really starting to seem more and more appealing


You mean the 290X at liquidation pricing?


----------



## criminal

Quote:


> Originally Posted by *fatmario*
> 
> Didn't AMD mention the Fury Nano was going to be significantly faster than the R9 290X? That would put it right near the GTX 980's spot. How come some of the benches show the Fury X close to the GTX 980?


Yep. Those were my thoughts too.


----------



## kizwan

Quote:


> Originally Posted by *Asmodian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Horsemama1956*
> 
> Another I forgot to add. Why did AMD give up on Boost? Personally I hate it, but it seems like the difference in these reviews and in reviews in recent years in general. In "stock" situations the nVidia cards are gettting 100+ increase in clocks speeds which is obviously going to show in benchmarks.
> 
> 
> 
> It is all about power usage. Power usage is throttling both Nvidia and AMD right now. Nvidia cards clock up when running non-power virus loads (like Crysis 3) and down clock when power usage goes too high (like furmark). AMD cannot boost clocks above stock without melting or blowing VRMs. *At ~100 MHz over stock Fiji seems to hit a stability wall too, that doesn't leave much room for boosting.*
Click to expand...

With stock voltage, of course yes. We'll see what happen once we have voltage control.
Quote:


> Originally Posted by *Rickles*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Elmy*
> 
> 295X2 @ 600,00
> 
> Fury X @ 650.00
> 
> Titan X @ 1000.00
> 
> 980 Ti @ 650.00
> 
> Which one is faster?
> 
> I bet 90% of the people in these forums were at an event and you had 4 identical systems sitting in front of you and you were allowed to play each system for 15 minutes. You wouldn't be able to tell the difference between these 4 cards but the fact is one of them is the cheapest and the fastest.
> 
> 
> 
> I bet 100% of the people would notice when they tab out of a game to respond to a message.
> 
> hint: only one of those MUST run in fullscreen.
Click to expand...

MUST? No, not really. When I respond to a message I won't be playing the game at the same time anyway, and I can resume playing in fullscreen at any time, at least in the case of DX11 games.


----------



## aDyerSituation

Quote:


> Originally Posted by *Slomo4shO*
> 
> You mean the 290X at liquidation pricing?


No, actually. I mean the 390x. The card that smacks the 980 for $70 less


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> Tom's sample wasn't as hot, but showed those same hotspots where heat is leaking from under the backplate.


Tom's image is much more plausible than the results Guru3D was getting.

~68C that far from the VRM suggests the VRM could well be running at 90C+, which is not particularly promising.
Quote:


> Originally Posted by *lajgnd*
> 
> Is it Driver Support?
> -Lol, we already know AMD's drivers are a joke, or perhaps slightly below competent.


I'm not going to try to convince you to buy a Fury X because I don't think it's the best option for the money right now and I wouldn't get one myself.

That said, AMD's drivers are by and large fine, especially in single GPU scenarios.
Quote:


> Originally Posted by *Serandur*
> 
> Aye, the only question remaining is what, exactly, is Fury Pro's shader configuration.


Unless it's significantly crippled in this regard, I don't think it matters.

Like the 5800 series, Fury seems pretty shader heavy. As long as it's not missing any ROPs, I doubt most people would miss a few hundred shaders or a couple dozen TMUs.
Quote:


> Originally Posted by *sugalumps*
> 
> Amd's coolers have always been the worst, even when they design a new water cooler it's the same story


7900 series had a solid reference cooler...actually it was the same cooler that was on the reference 290s, but it was moving 30-40% less heat.

Prior to that, neither the 5000 nor the 6000 series coolers were that bad. The 4000 series was pretty obnoxious, though.
Quote:


> Originally Posted by *Orivaa*
> 
> What does pump whine have to do with Fan RPM?


Nothing, but it may well have something to do with pump rpm.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Market share will not change. Bunch of AMD fanboys will get the Fury X and bunch of Nvidia fanboys will get GTX 980 Ti. The market share test is GTX970 class GPUs. That will be next year.


I bought a 295 dollar non-reference 290X.

I'll see where GM200 and Fiji are in six months.
Quote:


> Originally Posted by *szeged*
> 
> amd cards boost and show their maximum under load clock
> 
> nvidia cards boost and show their minimum under load clock
> 
> why is that so hard to understand, amd cards do boost up but in a different way, so disabling nvidia boost on cards for reviews is basically handicapping it just so you can be like " look guys i was right, amd is better all along yay go me"


Yep, AMD and NVIDIA's power/thermal limits work pretty similarly in practice, but AMD throttles back while NVIDIA boosts up.

Both make it a massive pain in the ass to compare out of box experience since no two cards are going to be exactly the same, especially in different environments.


----------



## PontiacGTX

Quote:


> Originally Posted by *szeged*
> 
> amd cards boost and show their maximum under load clock
> 
> nvidia cards boost and show their minimum under load clock
> 
> why is that so hard to understand, amd cards do boost up but in a different way, so disabling nvidia boost on cards for reviews is basically handicapping it just so you can be like " look guys i was right, amd is better all along yay go me"


Well, it seems that my Pitcairn doesn't have that feature, but Tahiti, Hawaii and Fiji do.


----------



## Sheyster

Quote:


> Originally Posted by *aDyerSituation*
> 
> The 390/x is really starting to seem more and more appealing


Why?? Just get a used non-reference 290x for < $250.


----------



## szeged

Quote:


> Originally Posted by *PontiacGTX*
> 
> Well it seems that my pitcairn doesnt have that feature but Tahiti, Hawaii and Fiji do


I don't think my 7870 does it, but I know my 7970 did something like that. I only tested the 290X on water and sub-zero, so I never got to see any temp- or power-related throttling.


----------



## Slomo4shO

Quote:


> Originally Posted by *aDyerSituation*
> 
> No, actually. I mean the 390x. The card that smacks the 980 for $70 less


Or get a 290X and overclock it yourself for around $250 in a few weeks... They are the EXACT SAME CHIP...

ASUS Radeon R9 290X DirectCU II = $270 after rebate. Can slap a water block on it and still be well under the price of a R9 390X.


----------



## Alatar

Quote:


> Originally Posted by *aDyerSituation*
> 
> No, actually. I mean the 390x. The card that smacks the 980 for $70 less


This only works if you play at 4K and are comparing a reference 980 against a non ref 390X.

Take a non-ref 980 and you're right back to where you began with the 290X vs. 980 comparisons.

390X = 290X with better cooling and higher clocks. For enthusiasts it doesn't beat the 980 any more than the 290X did.


----------



## PostalTwinkie

Quote:


> Originally Posted by *PontiacGTX*
> 
> the patch was just for TW3 and pcars.. none solved the kepler issue












Nvidia has made performance improvements to Kepler in the last two driver sets, beyond just Witcher 3. Your statement only proves my point: people need to stop acting like Kepler isn't getting support. Hell, the Titan and 780 Ti still get used as comparison points for AMD products.

We just watched AMD fall flat on their face going after the 980 and 980 Ti, so let's keep the bar low for them for now. Maybe Zen will save their ass, but I doubt it.


----------



## sugalumps

Quote:


> Originally Posted by *aDyerSituation*
> 
> No, actually. I mean the 390x. The card that smacks the 980 for $70 less


If you are out to save money then the 290x is a better buy than the 390x.


----------



## Gray Fox

With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


----------



## kizwan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Themisseble*
> 
> *- better support for GPU ( if we remmember kepler support in latest games)...*
> *- better for 4K ( I prefer 4K with medium-high details, no msaa)*
> 
> 
> 
> Kepler was patched in the last two drivers for performance. How about your Crossfire with FreeSync or that HDMI 2.0 on that AMD product? See, I can make silly statements too; mine are just factual.
> 
> Frankly Fury X performs less than expected. Expectations, at least for myself, was trading blows with the 980 Ti - it clearly doesn't. It might have a slight hand up in a couple games, but it gets whooped in the others.
Click to expand...

Well, actually the 15.15 drivers have Crossfire + Freesync support.
Quote:


> Originally Posted by *maarten12100*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alatar*
> 
> AMD does actually have boost. They just report the max boost clock while NV reports the min clock.
> 
> So essentially in pcghw tests AMD cards are at stock while they manually downclock the Nvidia cards.
> 
> 
> 
> Still useful for clock for clock comparisons though.
Click to expand...

Both of you have a point there. Real perf. vs. real perf. is as important as clock vs. clock, methinks.


----------



## criminal

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


Don't count on a price drop from Nvidia now. They have no reason to.


----------



## jcde7ago

Quote:


> Originally Posted by *Slomo4shO*
> 
> Please do enlighten me, what games/resolutions are you referring to?


Yeah, nice try, but I'm not falling into the trap of going back and forth on this. HBM doesn't fill the void of a smaller frame buffer, end of story. As a hardware enthusiast and long-enough member of OCN, you can judge for yourself whether 4GB of VRAM is a worthwhile investment at $650 when there is an arguably better competing product for the same money with 6GB of GDDR5 instead of HBM. If I hadn't spent money on Titan Xs months ago, I know where my money would go today, and it's not the Fury X.

That's all I'll say about that, so feel free to quote me again playing the ignorance card if you want; it's pretty clear where you're going with this, and for me it's not going past this response.


----------



## Majin SSJ Eric

Very disappointing results for Fiji. I can't think of any reasonable excuse to recommend this card over a 980 Ti. Guess we all just have to admit that Nvidia is dominant and there's nothing AMD can do about it. By all means, if you want the best card available today, get a 980 Ti (preferably a non-reference one). Congrats to all of you who snagged a 980 Ti already. You were right...


----------



## szeged

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


i doubt we will see any ti price drops now.

maybe if the fury x was an undisputed winner in every benchmark even by just 1 fps we might have seen a drop.


----------



## Yvese

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


No chance with these numbers.


----------



## Alatar

Quote:


> Originally Posted by *PontiacGTX*
> 
> Well it seems that my pitcairn doesnt have that feature but Tahiti, Hawaii and Fiji do


Pitcairn came out before AMD hopped on the GPU boost train.

All the modern GPUs have boost capabilities. The stock clocks are just reported differently compared to Nvidia.


----------



## intelfan

Quote:


> Originally Posted by *Sheyster*
> 
> Why?? Just get a used non-reference 290x for < $250.


Where do you find them? I've seen them go for $270 and even $300.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


Hell no!

Not only will Nvidia market the 980 Ti as faster (it is), but also as having 6 GB of VRAM instead of "just 4 GB".

980 Ti pricing is staying; I can almost promise that. I would be blown away if Nvidia lowered its price.


----------



## maarten12100

Quote:


> Originally Posted by *pengs*
> 
> Leakage? The Poole-Frenkel effect. The core and its power delivery are also affected by it, which causes instability. It's not as if keeping it cool does nothing for it.


Notice how I said
Quote:


> some idiot that thinks a cool card does much better at overclocking


A cool card could do better, just not much.


----------



## Blameless

Quote:


> Originally Posted by *Alatar*
> 
> 390X = 290X with better cooling and higher clocks. For enthusiasts it doesn't beat the 980 any more than the 290X did.


I'm a value enthusiast.

The (GM204) 980 is a fine piece of hardware, but it's a complete rip-off. The GTX 970 and R9 290, 290X, or 390 are 150-250 dollars less, and nowhere near proportionally that far behind in performance.
Quote:


> Originally Posted by *maarten12100*
> 
> Notice how I said
> A cool card could do better just not much.


AMD's GPUs are quite sensitive to temperature, as far as max stable OCs go, in my experience.


----------



## PostalTwinkie

Quote:


> Originally Posted by *kizwan*
> 
> Well, actually the 15.15 drivers have Crossfire + Freesync support.


Are they out, and do they actually work? AMD saying one thing and actually delivering are clearly two different things.

EDIT:

This kind of makes me want to go buy a 980 Ti G1.......


----------



## Sheyster

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


I would not bet on that. 980 Ti OC's well and the Fury really didn't meet expectations/hopes/dreams. Maybe think about picking up a used 980 if price is a factor?


----------



## Yvese

If any price drops were to happen it might be the 980 once the Nano and Fury ( non X ) release.


----------



## CasualCat

Quote:


> Originally Posted by *Gray Fox*
> 
> With the performance numbers the Fury X is putting out, is there any chance Nvidia would be lowering the price of their 980ti's anytime in the very near future? One of my 580's died and I can't game on just one at 1440p haha. Any thoughts on a potential price drop? I'm looking to upgrade in the next few days


Seems unlikely, unless AMD drops the price of the Fury X or the air-cooled Fury has comparable performance for less money. Even then they may not feel the need to.


----------



## Kane2207

Well, given the recent news, I think I'll just pick up a second OG Titan off fleabay now. They're going for £200. I'm pretty sure I'll get playable frame rates on a single 1440p.

Time to wait for the node shrink for me.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Yvese*
> 
> If any price drops were to happen it might be the 980 once the Nano and Fury ( non X ) release.


If the Fury X is sitting around the 980 to low, low, 980 Ti performance levels; where the hell are they going to put the Fury and Nano in the line-up?
Quote:


> Originally Posted by *Kane2207*
> 
> Well, given the recent news, I think I'll just pick up a second OG Titan off fleabay now. They're going for £200. I'm pretty sure I'll get playable frame rates on a single 1440p.
> 
> Time to wait for the node shrink for me.


My single 780 Ti performs pretty solidly at 1440p, and insanely well with G-Sync on. A little more power would be nice, but you would be fine with a Titan or two.


----------



## Alatar

So going back to Fiji:

We know that the Fury X has 1/16 FP64 but do we know if Fiji as a chip actually has proper double precision capabilities?
Quote:


> Originally Posted by *Blameless*
> 
> I'm a value enthusiast.
> 
> The (GM204) 980 is a fine piece of hardware, but it's a complete rip off. The GTX 970 and R9 290, 290X, or 390 are 150-250 dollars less, and no where near proportionally that far behind in performance.


Of course, but I was talking about pure performance. If we take money into account, then the 390X actually beats the 980 by a smaller margin than the 290X did.

The 980 has always been an absolute rip-off.


----------



## Olivon

http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/


----------



## criminal

Quote:


> Originally Posted by *PostalTwinkie*
> 
> If the Fury X is sitting around the 980 to low, low, 980 Ti performance levels; where the hell are they going to put the Fury and Nano in the line-up?
> My single 780 Ti performs pretty solidly at 1440p, and insanely well with G-Sync on. A little more power would be nice, but you would be fine with a Titan or two.


That is what I keep saying. That is why I believe it to be a driver issue. Otherwise Nano will be equal to or slower than the 390x.


----------



## scorpscarx

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/


Maybe that "copper VRM block that doubles as a tube" isn't making good contact in some way; that's a pretty big letdown.


----------



## Sheyster

Quote:


> Originally Posted by *intelfan*
> 
> Where do you find them? I've seen them go for $270 and even $300.


There was one for $235 in the OCN marketplace recently, in the Video section.


----------



## sugalumps

Quote:


> Originally Posted by *criminal*
> 
> That is what I keep saying. That is why I believe it to be a driver issue. Otherwise Nano will be equal to or slower than the 390x.


Did they ever state it was going to be better than their 290X/390X? They just said more efficient, with a better performance-per-watt ratio. It could very well just be a more efficient small-form-factor 290X/390X.

I think the people waiting on it expecting it to be at the heels of a Fury are going to be disappointed.


----------



## royalkilla408

Very disappointing. Goodbye, AMD. I don't see them surviving at all. Will the performance be better with drivers? Hahaha, hope for an AMD driver? Please. AMD's driver team sucks; it takes weeks to get a game working after release. What a letdown. I wouldn't pick this card up at all against a 980 Ti. Sorry, but I don't see AMD getting back enough money from this to survive.


----------



## PostalTwinkie

Quote:


> Originally Posted by *sugalumps*
> 
> Did they ever state it was going to be better than their 290x/390x? They just said more efficient and better performance to watt ratio. It could very well just be a more efficient small form factor 290x/390x.


Kind of part of my other side of that same thinking.

Technically they didn't say it had to perform better, just that, given its size and performance, it would be great for SFF. Getting full-length 390X performance into a 6" package is actually pretty cool, and I think that will be the real gem out of this mess.


----------



## toncij

Quote:


> Originally Posted by *szeged*
> 
> 4k is so far from being the mainstream its not even funny. 1080p is unfortunately going to be the go to thing for a while i bet.
> amd made mantle but mantle is no longer in use.
> 
> amd isnt responsible for dx12, 12 was in the works long before mantle was even announced.


No, no and no, not really.
Mantle was the first such API, and it was largely AMD's and DICE's idea to implement it. Only after Mantle got going was Direct3D 12 set on Mantle's path. Vulkan (Khronos Group) is basically Mantle almost in its entirety. Apple's Metal was made the same way, based on the same ideas, directly influenced by Mantle.
Mantle is in use, but not in the way you think it is: AMD, DICE and partners continue to use the Mantle API as a development and testing platform for further advancements in the API field.


----------



## Ha-Nocri

Quote:


> Originally Posted by *sugalumps*
> 
> Did they ever state it was going to be better than their 290x/390x? They just said more efficient and better performance to watt ratio. It could very well just be a more efficient small form factor 290x/390x.
> 
> I think the people waiting on it thinking it's going to be at the heels of a fury are going to be dissapointed.


Yeah, they did say it would be a bit faster than the 290X.


----------



## criminal

Quote:


> Originally Posted by *sugalumps*
> 
> Did they ever state it was going to be better than their 290x/390x? They just said more efficient and better performance to watt ratio. It could very well just be a more efficient small form factor 290x/390x.


Some AMD rep said that the Nano would be substantially faster than the 290X. I wouldn't call anything less than 15% substantial, but that is just me.


----------



## provost

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Nvidia has made performance improvements to Kepler the last two driver sets. Beyond just Witcher 3. Your statement only proves my point, that people need to stop acting like Kepler isn't getting support. Hell, the Titan and 780 Ti still get used as comparison points for AMD products.
> 
> We just watched them fall flat on their face going after the 980 and 980 Ti, so let's keep the bar low for them for now. Maybe Zen will save their ass, but I doubt it.


Nvidia might have said they added support; in reality, the new drivers are highly unstable for a lot of people, including myself. I crash on any driver other than the latest iCafe drivers or 350.12, and those don't have the GK110 optimizations. I can't run any of the new drivers without crashing, and I've tried all kinds of novelty fixes: hardware acceleration off, PhysX to CPU, no GeForce Experience, DDU, the polling-rate fix, etc., etc. (please don't say it's overclock related, as the whole system is at dead stock clocks).

And SLI support has been terrible for games for the past year or so.


----------



## PontiacGTX

Quote:


> Originally Posted by *royalkilla408*
> 
> Very disappointing. Goodbye AMD. I don't see them surviving at all. Will the performance be better with drivers? Hahaha hope for AMD driver? Please. AMD drivers team suck. Takes weeks to get a game working after release. What a let down. I wouldn't pick this card up at all against a 980Ti. Sorry but I don't see AMD getting back enough money from this to survive.


Where does 290X CF fail? Wait, you own one, right?...


----------



## armartins

Has anyone considered how the HBM layout will work with LN2 benching? What if there are different cold-bug temps... this will definitely be an interesting dynamic, since the pot will basically cool memory and core simultaneously. Any thoughts, LN2 enthusiasts?


----------



## Blackops_2

Quote:


> Originally Posted by *PostalTwinkie*
> 
> If the Fury X is sitting around the 980 to low, low, 980 Ti performance levels; where the hell are they going to put the Fury and Nano in the line-up?


I saw it match some 980 Ti benches, but yes, I agree. They need the Fury X at $550, the Fury at $480-500, and the Nano at $400-ish, with the 390X dropped to $350 and the 390 to $300. They're not going to sell many of these things at $650. The real disappointment isn't so much the speed as the OC headroom. Let's be frank: even if it matched the 980 Ti it still wouldn't matter, because that thing apparently clocks like crazy. The Fury X sits in between the 980 and 980 Ti but can hardly move its clock speed, so you're just kind of stuck with that performance, discounting drivers.


----------



## Slomo4shO

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, nice try, but i'm not falling into this trap of going back and forth on this. HBM doesn't fill the void of a lack of additional frame buffer, end of story. You can use your amazing sense of imagination and knowledge of the various resolutions/games/framerates/setups as a hardware enthusiast and long-enough member of OCN to determine whether or not 4GB of VRAM is a worthwhile investment for $650 when there is an arguably better, competing product for the same amount of money but with 6GB of GDDR5 instead of HBM. If I didn't spend money on Titan Xs months ago, I know where my money would go today, and it's not on the Fury X.
> 
> That's all i'll say about that, so feel free to quote me again using the ignorance card if you want, it's pretty clear where you're going with this and for me it's not going past this response.


A claim without any supporting data... I suppose you also believe that the Titan X has an edge over the 980 Ti due to it having double the RAM...

Now, you can try the red herring of converting the discussion into one of value between the Fury X and the 980 Ti. I for one am not interested in such a discussion, as the benchmarks speak for themselves. I would, however, like to see the "*various resolutions/games/framerates/setups*" where 4GB of VRAM is no longer adequate, considering that the bar for VRAM on high-end GPUs was only raised by the GTX 980 Ti three weeks ago...


----------



## Alatar

Quote:


> Originally Posted by *armartins*
> 
> Has anyone considered how the HBM layout will work with LN2 Benching? What if there are different cold bug temps... this will definitely be an interesting dynamic since basically the pot will cool memory and core simultaneously. Any thoughts LN2 Enthusiasts over here?


We're just going to have to wait until someone puts the thing under LN2 and reports back on what happened.

Gonna have to ghetto-mount any pot on these though, since none of the mounting holes on current pots are going to work.


----------



## harney

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/


That is what I wanted to see... OUCH!


----------



## Kinaesthetic

Quote:


> Originally Posted by *royalkilla408*
> 
> Very disappointing. Goodbye AMD. I don't see them surviving at all. Will the performance be better with drivers? Hahaha hope for AMD driver? Please. AMD drivers team suck. Takes weeks to get a game working after release. What a let down. I wouldn't pick this card up at all against a 980Ti. Sorry but I don't see AMD getting back enough money from this to survive.


I think people expecting driver improvements like the 7970's are going to be sorely disappointed. There were a lot of optimizations to be had for the 7970 because it used the entirely new GCN architecture after AMD had been on VLIW for quite some time. That left a lot of headroom for optimizing toward the best performance. Fiji is just a slightly newer revision of that same GCN base architecture, so the vast majority of GCN optimizations have already been made. There might be some left, but it won't be on the same level; at least, common sense and history suggest as much. That's also why I laugh at people who think Kepler still has a ton of optimization left for Nvidia. That architecture is old. At this point they've probably already extracted about 98% of what it has to offer. Don't expect older architectures to perform on par with a newer architecture that still has a lot of optimization headroom.


----------



## Yvese

Quote:


> Originally Posted by *PostalTwinkie*
> 
> If the Fury X is sitting around the 980 to low, low, 980 Ti performance levels; where the hell are they going to put the Fury and Nano in the line-up?


The Fury is already confirmed for $549. Nano might be $499.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Lex Luger*
> 
> If you look at our YouTube video of the card (I follow the forum's rules so I can't post it here, sorry!) and fast forward to 8:10 you will see and hear what I am talking about.


That would drive me nuts.

AMD needs to offer basically the EVGA GTX Hydro Copper version of this card (I mean, they basically ripped off EVGA with their Hybrid design, so why not rip off EVGA a bit more?) for the same price, then make one for $100 less, and then people can use the $100 to go buy a REAL water block... one which covers the VRAM... from EKWB.


----------



## royalkilla408

Quote:


> Originally Posted by *PontiacGTX*
> 
> where the 290x cf fails?wait ,you own one right?...


What? I do. Check my system. I have 290X Crossfire. Trust me, AMD drivers suck. CF has been broken on every new game for the past year+. Stuttering, flashing, and flickering all over the place. Single-card drivers are better, but only for old games. Want to play a brand-new game? You have to wait a month for drivers (forget CF, because it will be broken no matter what AMD tells you).


----------



## rv8000

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/


I find that hard to believe when Guru3D's thermal imaging results show 50°C at the same point; sure, the ambients are different, but a 50°C delta? Sounds like they had a malfunctioning pump/fan or something of the sort. I bet that resulted in a buttload of throttling in their tests. Makes me wonder why Guru3D's benchmarks are within ±5% of the 980 Ti most of the time and other places' aren't.

I've also noticed there are certain games where the Fury is right with the 980 Ti and Titan and leagues ahead of the 290X, and other times where it's 30% behind the 980 Ti and the gap between it and the 980/290X is small. I wonder if some of these results are more directly related to poor driver overhead in certain games; that makes me a bit more hopeful for newer drivers and Windows 10 in some cases.

Regardless, looking forward to having my card come in so I can test it to my heart's content.


----------



## Alatar

Quote:


> Originally Posted by *rv8000*
> 
> I find that hard to believe when guru3D thermal imaging results show 50c at the same point, sure the ambients are different but a 50c delta???? Sounds like they had a malfunctioning pump/fan or something of the sorts. Bet that resulted in a buttload of throttling in their tests. Makes me wonder why Guru3D benchmarks are with +/-5% of the 980ti most of the time and other places aren't.
> 
> I've also noticed there are certain games where the Fury is right with the 980ti and titan, and leagues of ahead of the 290x, and other times where its 30% behind the 980ti and the gap between the 980/290x are so small. I wonder if some of these games are more directly related to the poor overhead in drivers for certain games, makes me a bit more hopeful for newer drivers and w10 in some cases.
> 
> Regardless looking forward to having my card come in where I can test it to my hearts content/


Were the guru3d thermal results with both the front plate and the back plate on?

Because both of those are just surrounded by air and insulated pretty well from the actual hot components of the card.


----------



## PontiacGTX

Quote:


> Originally Posted by *royalkilla408*
> 
> What? I do. Check my system. I have 290X crossfire. Trust me. AMD drivers suck. CF has been broken on every new game the past year+. Studdering, flashing, and flickering all over the place. Single card drivers are better but only for old games. Want to play a brand new game? You have to wait a month for drivers (forget CF because they will be broken no matter what AMD tells you).


Edit: it seems there could be a problem with some of the games you play; maybe you only play games with GameWorks in them.


----------



## Blameless

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Are they out, and do they actually work? AMD saying one thing and delivering are clearly different.


They are out. I've been using them on my 290s and my 7950 for almost a week.

I have no idea if freesync + CF actually works as I have neither a FreeSync display nor more than one AMD GPU currently in the same system.
Quote:


> Originally Posted by *criminal*
> 
> Some AMD rep said that the Nano would be substantially faster than the 290x. I wouldn't call anything less than 15% substantial. But that is just me.


I expect the Nano to be considerably faster than the 390X, _overall_.

However, faster overall doesn't mean faster in everything. There may well be isolated cases where the Nano is slower.


----------



## pengs

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/


You should add some context to these pictures
Quote:


> Under load, AMD's specified *typical consumption of 275 watts is exceeded, as is the quoted typical core temperature of 50 degrees Celsius*. *Our measurements show a worst case of 383 watts power consumption in the PCGH VGA tool at 65°C; under gaming loads the Radeon R9 Fury X stays far from these values, with a maximum power consumption of 329 watts. Undemanding games like Crysis 3 and Skyrim (not measured by us) currently draw significantly less*. Overall, efficiency increases significantly compared to the Radeon R9 290X, helped by the low temperatures produced by the liquid-cooling design.


I'ma post a thermal picture of a VRM on an overclocked Titan X running a Furmark-type benchmark and watch everyone shriek.









The PCGH VGA tool is essentially Furmark and causes an increase of about 40% (~110 W) over a normal gaming load.


The fact that the 120mm fan stays stagnant is another issue; these reviewers apparently have no idea how to use a custom fan profile, or else the fan can't be controlled.
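As a rough sanity check of that 40%/~110 W figure, here's a quick sketch using the PCGH numbers quoted above (275 W typical gaming draw, 383 W worst case in their VGA tool); the wattages are theirs, the arithmetic is mine:

```python
# PCGH figures quoted above: 275 W typical gaming draw, 383 W worst case
# in the Furmark-like PCGH VGA tool.
typical_gaming_w = 275
stress_worst_case_w = 383

delta_w = stress_worst_case_w - typical_gaming_w
delta_pct = delta_w / typical_gaming_w * 100

print(f"Stress test adds {delta_w} W (+{delta_pct:.0f}%) over typical gaming")
# -> Stress test adds 108 W (+39%) over typical gaming
```

So roughly +39%/108 W, which lines up with the "40%/110 W" figure once you round.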


----------



## Cool Mike

XFX available at Newegg! Click on the item first and you will see it's available as of this time.

Just ordered the XFX. $50 more than the Sapphire, but you do get a lifetime warranty.


----------



## sugalumps

Quote:


> Originally Posted by *royalkilla408*
> 
> What? I do. Check my system. I have 290X crossfire. Trust me. AMD drivers suck. CF has been broken on every new game the past year+. Studdering, flashing, and flickering all over the place. Single card drivers are better but only for old games. Want to play a brand new game? You have to wait a month for drivers (forget CF because they will be broken no matter what AMD tells you).


I really don't know if I could be bothered with dual GPU from either side anymore; if you play games at launch, by the time you get a supporting driver you'll have completed the game anyway. Especially considering our platform as of late, it's a struggle to get decent single-GPU support, never mind dual.

The only games where I can see it making a difference are multiplayer ones.


----------



## kizwan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Well, actually the 15.15 drivers have Crossfire + Freesync support.
> 
> 
> 
> Are they out, and do they actually work? AMD saying one thing and delivering are clearly different.
> 
> EDIT:
> 
> This kind of makes me want to go buy a 980 Ti G1.......
Click to expand...

Yes, of course they're out. They're the 390/Fury release drivers.
Quote:


> Originally Posted by *royalkilla408*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> where the 290x cf fails?wait ,you own one right?...
> 
> 
> 
> What? I do. Check my system. I have 290X crossfire. Trust me. AMD drivers suck. CF has been broken on every new game the past year+. Studdering, flashing, and flickering all over the place. Single card drivers are better but only for old games. Want to play a brand new game? You have to wait a month for drivers (forget CF because they will be broken no matter what AMD tells you).
Click to expand...

What games?


----------



## intelfan

Quote:


> Originally Posted by *Sheyster*
> 
> There was one for $235 in the OCN marketplace recently, in the Video section.


It's reference cooling unless I'm mistaken. In the Online Deals section, there is a reference Asus 290x for $270 after rebate compared to $235 used. When buying used is only 12% off compared to new, it doesn't make a lot of financial sense to buy used. Prices will drop eventually.


----------



## Slomo4shO

Quote:


> Originally Posted by *intelfan*
> 
> It's reference cooling unless I'm mistaken. In the Online Deals section, there is a reference Asus 290x for $270 after rebate compared to $235 used. When buying used is only 12% off compared to new, it doesn't make a lot of financial sense to buy used. Prices will drop eventually.


The 300s were just released last week... Give it a week or two when retailers start clearing inventory of their 200 series...


----------



## criminal

Quote:


> Originally Posted by *Cool Mike*
> 
> XFX available at Newegg! Click on item first and you will see its available as of this time.
> 
> Just ordered the XFX. $50 more than the Sapphire, but you do get a lifetime warranty


Cool deal.









You still have that 980 Ti? If so you can do an unbiased review yourself.


----------



## tiborrr12

Quote:


> Originally Posted by *harney*
> 
> that is what i wanted to see ......OUCH!


Those VRMs work up to 130°C no probs...


----------



## Gray Fox

Quote:


> Originally Posted by *criminal*
> 
> Don't count on a price drop from Nvidia now. They have no reason to.


Dang, that's what I figured. Thanks for the input though!!


----------



## 47 Knucklehead

Quote:


> Originally Posted by *fatmario*
> 
> didn't Amd mention AMD FURY NANO going to significantly faster then r9 290x? that would make it right near gtx 980 spot. how come some of the bench showing fury x close to gtx 980


Yes, 50% faster... which appears to pretty much be a lie, considering the water-cooled Fury X isn't 50% faster than a 290X (41.4 vs 35.7 fps in Crysis 3, 62 vs 55 in BF4, 95.7 vs 83.4 in GTA V, 52.4 vs 42.5 in Metro). So an underclocked, air-cooled version of the Fury will in NO WAY reach anywhere near what AMD boasted it would.

It was just more hype.
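For what it's worth, the frame rates cited in this post can be run through a quick script (a sketch; the fps values are the ones quoted above, not independently verified):

```python
# Fury X vs. 290X average fps, as cited above: game -> (Fury X, 290X).
benches = {
    "Crysis 3": (41.4, 35.7),
    "BF4":      (62.0, 55.0),
    "GTA V":    (95.7, 83.4),
    "Metro":    (52.4, 42.5),
}

for game, (fury_x, r290x) in benches.items():
    gain_pct = (fury_x / r290x - 1) * 100
    print(f"{game}: +{gain_pct:.1f}%")

# Gains range from about +13% (BF4) to +23% (Metro) -- nowhere near +50%.
```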


----------



## Falknir

I was not expecting much. I was somehow hoping AMD would at least match a stock GTX 980 Ti / Titan X with their factory OC in less memory-demanding games, but they could not. A serious lack of VRAM headroom and weak OC potential, for roughly the same price as a GTX 980 Ti. Count me very disappointed.

At least they have a more efficient design now. Hopefully they turn that into something more purchase-worthy in future iterations before they get steamrolled by NVIDIA again.


----------



## rv8000

Quote:


> Originally Posted by *Alatar*
> 
> Were the guru3d thermal results with both the front plate and the back plate on?
> 
> Because both of those are just surrounded by air and insulated pretty well from the actual hot components of the card.


PC Perspective, Tom's, and TPU are all getting between 60-65°C (their ambient seems marginally higher at best). Something doesn't seem right about that result; it's a bit ludicrous even for extreme loads such as Furmark.


----------



## Rickles

Quote:


> Originally Posted by *pengs*
> 
> You should add some context to these pictures
> Ima post a thermal picture of a VRM on a overclocked Titan X running a Furmark type benchmark and watch everyone shriek
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCGH VGA tool is essentially Furmark.
> 
> 
> The fact that the 120mm fan stays stagnant is another issue, these reviewers apparently have no idea how to use a custom fan profile, either that or the fan is not able to be controlled.


I think the difference in that thermal image is that they've taken the cover off, so you can actually see what is going on underneath it, as opposed to what is leaking around it.

If the backside of the PCB gets up to 104°C, it only makes sense that the front side does as well.


----------



## maarten12100

Quote:


> Originally Posted by *Cool Mike*
> 
> XFX available at Newegg! Click on item first and you will see its available as of this time.
> 
> Just ordered the XFX. $50 more than the Sapphire, but you do get a lifetime warranty


Limited lifetime warranty? Where do I sign up?
I can find nothing about this on XFX's website, but I would really like it. It might push me to purchase one.


----------



## Defoler

I was actually hoping for something to rival the Titan X, since AMD put out so much PR and so much trash talk about how they were going to wipe the floor with Nvidia.
In the end, once both Nvidia's and AMD's drivers mature, we'll have a GPU similar to Nvidia's, with similar pricing. That is good in the end, but I wanted more, especially since OCing seems to be very limited.
I wonder if we will see any aftermarket or better water-cooling solutions for it, since I can't put two of those in my little case.


----------



## Orivaa

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Yes, 50% faster ... which appears to pretty much be a lie ... considering the water cooled Fury X isn't 50% faster than a 290X (41.4 vs 35.7 Crysis 3, 62 vs 55 BF4, 95.7 vs 83.4 GTA5, 52.4 vs 42.5 Metro). So an underclocked, air cooled, version of the Fury will in NO WAY reach anywhere near what AMD boasted it would.
> 
> It was just more hype.


That was not what Su said. She said it was 1.5x the performance per watt, and substantially faster. She never said it was 50% faster.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Falknir*
> 
> I was not expecting much. Somehow was hoping AMD would at least match a stock GTX 980 Ti / Titan X with their factory OC in less memory demanding games, but could not. The serious lack of VRAM headroom and weak OC potential for roughly the same price as a GTX 980 Ti. Count me very disappointed.
> 
> At least they got a more efficient design now. Hopefully they turn that into something more purchase worthy in future iterations before they get steamrolled by NVIDIA again.


Steamrolled? Don't you mean ... wait for it ... Bulldozed?


----------



## Olivon

The craziest thing is seeing an OCed 980 compete with AMD's 600 mm² monster (4096 SPs, 4096-bit) at 1080p and 1440p.
GM204 is a 400 mm² chip with far fewer transistors, a 256-bit memory bus, lower manufacturing cost, and great efficiency and thermal behaviour.
I really don't understand how that's possible.


----------



## Blameless

Quote:


> Originally Posted by *pengs*
> 
> You should add some context to these pictures
> Ima post a thermal picture of a VRM on a overclocked Titan X running a Furmark type benchmark and watch everyone shriek
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCGH VGA tool is essentially Furmark.


Context is definitely important.

Gaming loads vs. dedicated stress tests are often night and day in load differences.
Quote:


> Originally Posted by *pengs*
> 
> The fact that the 120mm fan stays stagnant is another issue, these reviewers apparently have no idea how to use a custom fan profile, either that or the fan is not able to be controlled.


I'd expect most reviewers to stick to default fan and power profiles, outside of the OCing section of the reviews.
Quote:


> Originally Posted by *EK_tiborrr*
> 
> Those VRMs work up to 130°C no probs...


Until you start pushing significantly more current through them than stock clocks can draw.


----------



## velocityx

I'm so, so surprised that a company like AMD can spend millions on inventing hardware designs and at the same time let it all fail on such weak video drivers. Mantle already showed how these cards can work, and Nvidia showed how they could catch up to Mantle with DX11, so AMD kind of shot themselves in the foot with Mantle, because AMD's DX11 drivers are just letting them down. In all these graphs I see better hardware losing because the software isn't up to the task. I'm glad more sites have started talking about it: TTL at OC3D already told AMD to fix their drivers, and Guru3D mentions driver overhead as a critical reason the cards are losing to Nvidia.

Limiting the 15.15 drivers to 300-series cards only, when modded 1040 Win 10 drivers already work for all other cards and bring the same or better improvements, doesn't help AMD keep their customer base.


----------



## Cool Mike

Yes, I still have the EVGA 980 Ti. I look forward to doing a few benchmark comparisons for myself. I truly believe the Fury will improve to a solid 980 Ti level after one or two driver revs.
I will be running Heaven and 3DMark. After a week or so I will sell one of them. Hoping the Fury works out, as I have been on the red team for years.


----------



## boot318

I know 47 and Mod A are smiling







It took me a while to log on today after reading the reviews. I'm scared to read this thread, lol.


----------



## Krusher33

I'm just gonna wait till next generation of HBM.

I do have an upgrade itch though. I'm just not feeling like the price of a 980 is worth the upgrade, considering I'm still playing games just fine on the 290X.


----------



## royalkilla408

Quote:


> Originally Posted by *PontiacGTX*
> 
> .
> edit. it seems there could be a problem with some games you play,maybe you only play games with GW on it


Nope. Check all the reviews here. AMD lags big time in all the newer games from 2015. I don't think I've seen one review where GTA V or Witcher 3 performs as well as on Nvidia; actually, it's really behind. I blame a lot of it on the drivers, because AMD drivers are just horrible and late.


----------



## sugalumps

Quote:


> Originally Posted by *Olivon*
> 
> Most crazy stuff is to see that an oced 980 compete with the AMD 600 mm² monster 4096SP/4096-bit on 1080p and 1440p.
> GM204 is 400mm² chip, with way less trannies, 256-bit memory bus, low cost manufacturing, great efficiency and thermal behaviour.
> I kinda really don't understand how is it possible ???


'Cause Maxwell is a beast.


----------



## Cool Mike

In the specs on the Newegg site it says lifetime for parts and labor. I hope this is true. Someone correct me if Newegg has made a mistake by saying lifetime.


----------



## harney

Quote:


> Originally Posted by *maarten12100*
> 
> Limited lifetime warranty where do I sign up?
> I can find nothing about this on XFX website but I would really like it. Might push me to purchase one.


Limited lifetime warranty... I'm pretty sure that means a warranty for the product life of that model. Never liked that term.


----------



## tiborrr12

The VRMs are built in a so-called flip-chip, all-metallic design, hence the drain is on top of the MOSFET. This means it dumps heat directly into the heatsink, not into the circuit board through the BGA like the DrMOS on the GTX 980 Ti / TITAN X.

With conventional packaging (non-flip-chip, drain on the bottom, BGA serving as the heatsink), the internal temperature is much higher than the surface temperature: if the surface is about 50°C, the internals are closer to 80°C, whereas with flip-chip the internal temperature is very close to the external one. Both designs have their perks and drawbacks; for instance, it's way easier to cool a flip-chip design with direct cooling.

Regarding the coil whine: it becomes obvious once you start using water cooling on GPUs. The same happened with HDD noise some 12 years ago when we started CPU water cooling.







My GTX 970 has a horrible coil whine until it heats up properly....


----------



## criminal

Quote:


> Originally Posted by *Olivon*
> 
> Most crazy stuff is to see that an oced 980 compete with the AMD 600 mm² monster 4096SP/4096-bit on 1080p and 1440p.
> GM204 is 400mm² chip, with way less trannies, 256-bit memory bus, low cost manufacturing, great efficiency and thermal behaviour.
> I kinda really don't understand how is it possible ???


Drivers?
Quote:


> Originally Posted by *Cool Mike*
> 
> In the spec's on the Newegg site it says lifetime for parts and labor. I hope this is true. Someone correct me if Newegg has made a mistake by saying lifetime.


Unless something has changed, yes XFX offers lifetime warranty.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Orivaa*
> 
> That was not what Su said. She said it was 50% the performance per watt, and substantially faster. She never said it was 50% faster.


Nope.






Skip to the 35-minute mark and watch.

Fury X ... "1.5x performance per watt over the R9 290X" ... that is a lie, now proven (36:30).
Fury Nano ... "Significantly more performance than the R9 290X, at half the size and half the power ... 2x performance per watt" (37:50).

There is NO WAY that string of claims was true. PERIOD.

Given the now-known wattage and performance numbers, her claims don't add up. And "significantly more performance" shouldn't mean 10-15%.

And for reference, here are the "performance per watt" numbers ...

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/32.html

Notice the Fury X at 100% and the R9 290X at 80%/72%/75%/78% depending on resolution? Since when is that "1.5x performance per watt over the R9 290X"? It isn't. The CEO of AMD is a LIAR.
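To put numbers on that chart: TPU normalizes perf/W to the Fury X at 100%, so inverting the 290X's share gives the implied Fury X advantage. A quick sketch using the percentages quoted above (TPU's own page maps each figure to a specific resolution):

```python
# R9 290X perf/W as a share of the Fury X's, per TPU's chart (Fury X = 100%).
r290x_shares = [0.80, 0.72, 0.75, 0.78]

for share in r290x_shares:
    print(f"290X at {share:.0%} -> Fury X = {1 / share:.2f}x the 290X's perf/W")

# Implied advantage: 1.25x to ~1.39x, short of the claimed 1.5x.
```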

Quote:


> Originally Posted by *boot318*
> 
> I know 47 and Mod A are smiling
> 
> 
> 
> 
> 
> 
> 
> . It took me awhile to log on today after reading reviews. I'm scared to read this thread. lol


Who, me?


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Serandur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *criminal*
> 
> I really wish AMD hadn't bought ATI. ATI really did compete better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Strong and possibly hype-induced opinion warning (and yes, there's hyperbole too):
> 
> ATi were a proud GPU company giving Nvidia a run for their money especially during the few years before AMD bought them out. AMD are a failing parasitic leech of a company sucking ATi dry. They're the EA of hardware companies having bought out ATi during their golden years, then subsequently running them into the ground.
> 
> I always thought the ATi purchase was a mistake, but didn't feel so strongly about this until relatively recently when the magnitude of AMD's GPU R&D cuts and the results were known.
Click to expand...

ATI really did stick it to Nvidia. The X1950 XTX was undoubtedly faster than any single-GPU Nvidia card. I don't know how true it is, but I remember reading that AMD actually wanted to buy Nvidia but they couldn't come to terms.


----------



## svenge

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Steamrolled? Don't you mean ... wait for it ... Bulldozed?


Perhaps it was _Piledriven_ so badly that the Excavator never even showed up?


----------



## iLeakStuff

This graph takes the cake now that we have seen the real results today. Not as bad as the Bulldozer hype/marketing, but pretty bad in itself.
There I was, thinking HBM would be the holy grail of 4K gaming and lead to a severe Nvidia beating after watching that graph.
In reality, the Fury X maybe beat the 980 Ti in one of those games. In the rest they were equal, or the 980 Ti beat the Fury X.
Can't say I noticed much of the HBM magic over GDDR5, to be honest.

The hell, AMD?


----------



## Slomo4shO

Quote:


> Originally Posted by *harney*
> 
> Limited lifetime warranty


I would take the 3 year warranty from MSI over the lifetime warranty offered by XFX any day of the week...

Quote:


> Originally Posted by *iLeakStuff*
> 
> There I was thinking HBM would be the holy grail of 4K gaming


I find it ironic how little the "enthusiasts" on these forums understand about the role of memory in GPUs...


----------



## Ramzinho

Who needs a $1,000 GPU







when this is the future of PC gaming











Tweeted by Scott Wasson.


----------



## szeged

Quote:


> Originally Posted by *iLeakStuff*
> 
> This graph takes the cake when we have seen the real results today. Not as bad as bulldozer hype/marketing but pretty bad in itself.
> There I was thinking HBM would be the holy grail of 4K gaming and lead to severe Nvidia beating after watching that graph.
> In reality Fury X maybe beat 980Ti in 1 of those games. The rest they were equal or 980Ti beating Fury X.
> Can`t say I noticed much about the HBM magic over GDDR5 to be honest
> 
> The hell AMD?


That's what happens when you listen to pre-release manufacturer benchmarks, lol. They probably messed with the settings so it looked better for them.


----------



## PontiacGTX

Quote:


> Originally Posted by *royalkilla408*
> 
> Nope. Check all the reviews here. AMD lags big time in all the newer games from 2015. I don't think I've seen one review where GTA V or Witcher 3 performs as well as on Nvidia. Actually it's really behind. I blame a lot of it on the drivers, because AMD drivers are just horrible and late.


TW3 uses GameWorks (HairWorks)...


----------



## rv8000

Quote:


> Originally Posted by *iLeakStuff*
> 
> This graph takes the cake now that we have seen the real results today. Not as bad as the Bulldozer hype/marketing, but pretty bad in itself.
> There I was thinking HBM would be the holy grail of 4K gaming and lead to a severe Nvidia beating after watching that graph.
> In reality, Fury X maybe beat the 980 Ti in one of those games. In the rest they were equal, or the 980 Ti beat the Fury X.
> Can't say I noticed much HBM magic over GDDR5, to be honest
> 
> The hell AMD?


Don't ever believe a graph that has no indication of what settings were used and what the test setup was. That's asking to be disappointed.


----------



## Blackops_2

Quote:


> Originally Posted by *Slomo4shO*
> 
> I would take the 3 year warranty from MSI over the lifetime warranty offered by XFX any day of the week...
> I find it ironic how little the "enthusiast" understand about the role of memory in GPUs on these forums...


This. I went through 4 XFX 7970s... will never touch XFX again.


----------



## sugalumps




----------



## PontiacGTX

Quote:


> Originally Posted by *Slomo4shO*
> 
> I find it ironic how little the "enthusiast" understand about the role of memory in GPUs on these forums...


the cards aren't bandwidth-limited at 4K yet, it seems


----------



## CallsignVega

Ah, good to know Titan-X is still top dog. Although a good showing, Fury-X should be like $599.


----------



## DMatthewStewart

Quote:


> Originally Posted by *BigMack70*
> 
> This card is nothing more than AMD playing _"me too!!!!!"_ but with less VRAM, no HDMI 2.0, and a few months late to the party ....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways on a more positive note... hope you guys who buy the card enjoy it!


I thought it was weird that they didn't utilize HDMI 2.0 either. But then again, maybe they're just thinking that anyone looking for a higher refresh rate is going to use DP instead of HDMI. I mean, I haven't used HDMI in a very long time. Nor have I even considered it since getting my 144Hz monitor.

I've been looking at the 980s for a while now and I almost pulled the trigger. However, I have to read the reviews thoroughly. I'm interested in seeing how well the HBM performs. Maybe GDDR5 is getting a little outdated. But only having 4GB of RAM was a little disconcerting. But who knows. When I went to order one today they were already sold out. Probably for the better anyway. I'm going to need an EK block and I don't think that's ready to go yet... is it?









I'm so tempted to get one anyway. But I am going to try and justify my 290X Lightnings for as long as possible. After all, I had to get a giant case to house everything (I'll eventually have to get rid of that too!)


----------



## criminal

Quote:


> Originally Posted by *iLeakStuff*
> 
> This graph takes the cake now that we have seen the real results today. Not as bad as the Bulldozer hype/marketing, but pretty bad in itself.
> There I was thinking HBM would be the holy grail of 4K gaming and lead to a severe Nvidia beating after watching that graph.
> In reality, Fury X maybe beat the 980 Ti in one of those games. In the rest they were equal, or the 980 Ti beat the Fury X.
> Can't say I noticed much HBM magic over GDDR5, to be honest
> 
> The hell AMD?


They must have disabled boost on the 980Ti.


----------



## royalkilla408

Quote:


> Originally Posted by *PontiacGTX*
> 
> TW3 uses gamework(hair)...


Yes, but even with it off AMD lags behind Nvidia lol. I don't know what point you're trying to make. I own AMD graphics cards. Everyone knows AMD drivers suck; I can attest to that. There is no hidden performance by AMD, and if you think they will come out with working drivers then you're dreaming.


----------



## Slomo4shO

Quote:


> Originally Posted by *Ramzinho*
> 
> this is the future of pc gaming


AAA ports get worse with each release. The developers are trying to get away with doing as little as possible


----------



## iLeakStuff

Quote:


> Originally Posted by *szeged*
> 
> thats what happens when you listen to pre release manufacturer benchmarks lol. they probably messed with the settings so it looked better for them.


Yeah I know. Shame on me for believing them. That's the last time I trust benchies from AMD.

Quote:


> Originally Posted by *rv8000*
> 
> Don't ever believe a graph that has no indication of what settings were used and what the test setup was. That's asking to be disappointed.


Oh, there were details about settings all right. Along with "World's fastest GPU", which was posted on a ton of websites:
http://www.kitguru.net/components/graphic-cards/anton-shilov/amds-official-radeon-r9-fury-x-performance-results-worlds-fastest-gpu/


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> ATI really did stick it to Nvidia. The x1950xtx was undoubtedly faster than any single gpu Nvidia card. I don't know how true it is but I remember reading that AMD actually wanted to buy Nvidia but they couldn't come to terms.


Agreed. I loved ATI. They were the best video card maker, hands down, in my 32 years of messing with PCs. I've owned about 4-5 times as many ATI cards as nVidia cards.

I said it before and I'll say it again. I hope AMD sells off the "ATI division" to a good company like Samsung and once again they can rise from the ashes and give nVidia SERIOUS competition. I think they can, they just need someone with deeper pockets, a real marketing strategy, and not someone who is in a losing two front war between nVidia and Intel.


----------



## PontiacGTX

Quote:


> Originally Posted by *royalkilla408*
> 
> Yes, but even with it off AMD lags behind Nvidia lol. I don't know what point you're trying to make. I own AMD graphics cards. Everyone knows AMD drivers suck; I can attest to that. There is no hidden performance by AMD, and if you think they will come out with working drivers then you're dreaming.


The drivers you are using don't have the latest tessellation improvements. Why did it take so long? Give thanks to Nvidia for GameWorks, which limited optimization.


----------



## harney

Quote:


> Originally Posted by *Cool Mike*
> 
> In the spec's on the Newegg site it says lifetime for parts and labor. I hope this is true. Someone correct me if Newegg has made a mistake by saying lifetime.


Someone in retail must be on here to explain... I am sure "limited lifetime warranty" means the lifetime of the product on the market and not the person's lifetime, i.e. whether I'd be able to RMA the card if it died after 10 years... It's a stupid term; it will most likely work out to 3 to 5 years, but I could be wrong


----------



## royalkilla408

Yep. Let's hope AMD sells ATI. No way they can survive anymore; I just don't see how. After this disappointment I don't see their numbers increasing, or them earning anything significant to keep them out of the red.


----------



## keikei

Quote:


> Originally Posted by *Ramzinho*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Who needs a 1000$ GPU
> 
> 
> 
> 
> 
> 
> 
> when this is the future of pc gaming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tweeted by scott wasson


Good one sir.


----------



## Slomo4shO

Quote:


> Originally Posted by *PontiacGTX*
> 
> the cards arent BW limited at 4k yet it seems


If Hawaii wasn't limited at 320 GB/sec...


----------



## Ganf

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Nope.
> 
> Cue up to the 35 minute mark and watch.
> 
> Fury X ... "1.5x performance per watt over the R9 290X" ... that is a lie, now proven. 36:30 time.
> Fury Nano ... "Significantly more performance than the R9 290X, at half the size and half the power. ... 2x performance per watt." 37:50
> 
> There is NO WAY that string of lies is/was true. PERIOD.
> 
> Given the now known wattage and now known performance numbers, her numbers don't add up. And "significantly more performance" shouldn't mean 10-15%.
> 
> And for reference, here are the "performance per watt" numbers ...
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X/32.html
> 
> Notice the Fury X at 100% and the R9 290X at 80%/72%/75%/78% depending on resolution? Since when is that "1.5x performance per watt over the R9 290X"? It isn't. The CEO of AMD is a LIAR.
> Who, me?












That's not how the math works. Use the 290X as the baseline and then figure it again.

That chart shows that a 290X is 70-75% as efficient as a Fury X, not that efficiency has been improved by 50%. To show a 50% improvement you start with the baseline, figure the efficiency per GPixel (or however you want to measure it), figure the same thing for the Fury X, take the difference, and express that as a percentage of the baseline 290X's figure; that gives you the increase in efficiency.


----------



## PostalTwinkie

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i nominate postalTwinkle of the ocn community
> 
> nvidia representative of year. congrats man, the shady business of nvidia is paying off a lot these days, I mean just look at gamewrecks how amd cards kneel in x64 tessellation agongy. the plan work out great didn't it mean a 290x beating a 780ti which it wasn't able surpass months after launch, who am i kidding right. amd is dead atm


You might have a hard time explaining your nomination to Nvidia when I own more AMD hardware than Nvidia. But let's not allow facts to get in the way of a political campaign, it might look out of place.

I just love how AMD flagships are so bad that their users have to try and blame the software (which is optional to use, you can turn it off) of the competitor. So sad to see such a once great company in such a bad spot. I am glad I don't ever have to try and defend my main rig purchase with "to help out the underdog", I am not UNICEF - I don't need another charity on my wallet.

Oh, and speaking of software, do you blame developers for wanting to target the largest market share? Why would developers target the ~24% market holder and not the giant in the industry? The answer is they wouldn't. If they want to stay in business, they will go for the larger pool of potential customers. That aside, GameWorks doesn't do snot to AMD, and you know it. Project CARS was debunked, Batman was done by a 12 man $1 studio as a console-to-PC port, and Witcher 3 can have the hair turned off.

Now if you will excuse me, I will take my poor old and abandoned 780 Ti , that still handily whoops the snot out of AMD, and go cry in the corner with my G-Sync display and working technology. Oh the tear free and stutter free gaming experience, the hoorrooorrrrr......theeee hoooorroooorrrrrr!!!!!


----------



## Tippy

*Holy balls, a 105-page discussion in 9 hours.* OCN community scares me sometimes









On topic, I'll echo what some others have said: AMD needs to consider dropping to $599. The stock performance, OC headroom, vRAM, lack of HDMI 2.0, etc. warrant that.

nVidia is going to get some crazy good aftermarket variants, hell just look at what Gigabyte did at _stock voltage..._










vs










So with AMD we're pretty much relying on drivers and nothing else. But keep in mind nVidia may also have better drivers lined up...


----------



## royalkilla408

Quote:


> Originally Posted by *PontiacGTX*
> 
> The drivers you are using don't have the latest tessellation improvements. Why did it take so long? Give thanks to Nvidia for GameWorks, which limited optimization.


I am using the latest drivers. How do you know I don't? I never even said what drivers I am using, and you're saying I'm using the wrong ones? Of course I have the latest ones. I upgrade right away to new drivers in hopes of better performance, but AMD drivers are always a letdown. Check their driver page and see the usual excuse: "performance with Crossfire doesn't work or works somewhat, we are HARD at work with xxxxyyy game developer to improve our Crossfire performance". When is that work with the developers paying off? I don't see it anywhere. It's been months and CF hardly works in any new game. AMD needs to go and sell off ATI. Don't care about their CPU business, as they suck even more.


----------



## Ramzinho

I'll have to be the devil's advocate here. Guys, we are talking about a $650 watercooled GPU that's trading blows with the 980 Ti. This is a day-1 release. With drivers maturing, this GPU will probably land between the 980 Ti and the Titan X. And you guys are not giving AMD any credit for having a two-year-old GPU, the 290X, still standing strong against the 980 in any unbiased (non-GameWorks) title. Be more reasonable. Love Nvidia as much as you want... IT IS YOUR RIGHT. But if AMD dies, we gamers will probably be back to the 2001 era, when we had to buy a new GPU with every new game release!


----------



## Olivon

Quote:


> Originally Posted by *Ramzinho*
> 
> Who needs a 1000$ GPU
> 
> 
> 
> 
> 
> 
> 
> when this is the future of pc gaming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tweeted by scott wasson


PC gamers who don't know how to edit an .ini file don't deserve the master race label


----------



## PostalTwinkie

Quote:


> Originally Posted by *Olivon*
> 
> PC gamers who don't know how to edit an .ini file don't deserve the master race label


Developers are starting to purposefully try and prevent it.


----------



## szeged

I wonder how big the 980 Ti owners club is gonna get in the next few days.


----------



## Slomo4shO

Quote:


> Originally Posted by *Ganf*
> 
> That's not how math works. Use 290x as the baseline and then figure it again.
> 
> That chart shows that a 290x is 70-75% as efficient as a Fury X, not that efficiency has been improved by 50%. To show that efficiency has improved by 50% you start with the baseline, figure out the efficiency per GPixel or however you want to figure it, figure the same thing for the Fury X, take the difference and factor that as a percentage of the baseline 290x's figure, and you have your increase in efficiency.


1/.7 = 1.43 = 43% more efficient over the R9 290X
1/.75= 1.33 = 33% more efficient over the R9 290X

Definitely not 50% by any stretch of the imagination...
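Since the thread keeps re-deriving it, here's the baseline math from Ganf's post as a quick sketch. The inputs are read off TPU's perf-per-watt chart (normalized so Fury X = 100%); treat the exact 290X percentages as approximate:

```python
# Perf-per-watt from TPU's chart, normalized so Fury X = 1.00 (100%).
# The R9 290X sits at roughly 0.70-0.78 of that, depending on resolution.
fury_x = 1.00

for r9_290x in (0.70, 0.75, 0.78):
    # Improvement relative to the 290X baseline: (new - old) / old
    gain = (fury_x - r9_290x) / r9_290x
    # 0.70 -> ~43%, 0.75 -> ~33%, 0.78 -> ~28%
    print(f"290X at {r9_290x:.0%} of Fury X -> Fury X is {gain:.0%} more efficient")
```

For the claimed 1.5x to hold, the 290X would have to sit at about 67% (1/1.5) of the Fury X on that chart, which it doesn't at any tested resolution.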


----------



## iLeakStuff

Quote:


> Originally Posted by *royalkilla408*
> 
> Yep. Let's home AMD sells ATI. No way they can survive anymore. I just don't see how. After this disappointment I don't see their numbers increasing or earning anything significant to keep them out of the red.


As an AMD supporter, I think their entire problem is the timing of their GPU releases. Imagine Fury X releasing in February/March alongside the Titan X. That would be an entirely different situation; Fury X would sell like crazy. Instead they come dragging behind with an equally expensive GPU that is a little bit slower than the 980 Ti.
I understand why people can't help but feel a little "meh" about the situation. The Fury X just doesn't get the attention it deserves.


----------



## PontiacGTX

Quote:


> Originally Posted by *royalkilla408*
> 
> I am using the latest drivers. How do you know I don't? I never even said what drivers I am using, and you're saying I'm using the wrong ones? Of course I have the latest ones. I upgrade right away to new drivers in hopes of better performance, but AMD drivers are always a letdown. Check their driver page and see the usual excuse: "performance with Crossfire doesn't work or works somewhat, we are HARD at work with xxxxyyy game developer to improve our Crossfire performance". When is that work with the developers paying off? I don't see it anywhere. It's been months and CF hardly works in any new game. AMD needs to go and sell off ATI. Don't care about their CPU business, as they suck even more.


http://www.guru3d.com/files-details/amd-catalyst-15-15-download.html...

You are playing (some) games built around/biased towards Nvidia; those claims need to be checked if you are playing games with GameWorks.


----------



## Ramzinho

Quote:


> Originally Posted by *Olivon*
> 
> PC gamers who don't know how to edit an .ini file don't deserve the master race label


You ruined the joke.. watch the crosshairs on your forehead


----------



## Tippy

Quote:


> Originally Posted by *iLeakStuff*
> 
> As an AMD supporter, I think their entire problem is the timing of their GPU releases. Imagine Fury X releasing in February/March alongside the Titan X. That would be an entirely different situation; Fury X would sell like crazy. Instead they come dragging behind with an equally expensive GPU that is a little bit slower than the 980 Ti.
> I understand why people can't help but feel a little "meh" about the situation. The Fury X just doesn't get the attention it deserves.


I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.

If 980 Ti was $700 then Fury-X would've made perfect sense at $650.


----------



## PostalTwinkie

Quote:


> Originally Posted by *szeged*
> 
> i wonder how big the 980ti owners club is gonna get in the next few days.


Big, about the only ones that won't jump are the ones holding out for Pascal. Which is pretty hard to do to be honest. That Gigabyte G1 980 Ti is pretty over the top.
Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.
> 
> If 980 Ti was $700 then Fury-X would've made perfect sense at $650.


This is actually a pretty solid statement. I think you are right and/or on to something.


----------



## szeged

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Big, about the only ones that won't jump are the ones holding out for Pascal. Which is pretty hard to do to be honest. That Gigabyte G1 980 Ti is pretty over the top.


The only people I see not going are die-hard fan(boys) of AMD who will get the Fury X no matter what, owners of already high-end setups, people who can't afford it either way, and people waiting for 14/16nm.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.


I'm sorry, how could AMD NOT KNOW nVidia's pricing of the 980Ti? The 980Ti was released 3 weeks before the Fury X. AMD had all the time in the world to make a price change. nVidia was the one who tipped their pricing hand first, not AMD.


----------



## Slomo4shO

Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively.


The 980 Ti has been on the market for 3 weeks... There is no reason why pricing couldn't be modified prior to release. AMD is the one responding with a new product, not the other way around... Also, there is nothing aggressive about the 980 Ti being $650... In fact, if anything, both the 980 Ti and Fury X are overpriced.


----------



## iLeakStuff

Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.
> 
> If 980 Ti was $700 then Fury-X would've made perfect sense at $650.


Ya, that too. I remember reading somewhere that it took AMD by surprise and they had to adjust the price as well. I guess $650 was the lowest they could go with the card.


----------



## Kane2207

Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.
> 
> If 980 Ti was $700 then Fury-X would've made perfect sense at $650.


I dunno, rumours pointed to AMD pricing the Fury X at $750-800 before the 980 Ti became official. Couple that with AMD's desire not to be known as 'the cheaper option', and we might have to credit Nvidia with the Fury's $650 price.

Edit - wow, this thread is insane even 9 hours in. Ninja'd by 3 people in the time I typed my reply lol


----------



## 970Rules

More reviews = more highlights:

"AMD needed High Bandwidth Memory to even compete against NVIDIA and its continued use of GDDR5... so where will that leave us next year when both companies are on equal footing with the use of HBM2? If this is all AMD can manage with HBM1, and NVIDIA can continue to battle the Fury X with its GDDR5-based offerings, what will NVIDIA do to AMD when it moves over to not just HBM2, but the Pascal architecture and 16nm?"

"I think AMD could've clawed back some of that lost GPU market share, but the Fury X won't do that. At the end of the day, the best card to buy right now is still the GeForce GTX 980 Ti. A great card that beats the Fury X in most situations, with great custom cards from the likes of ZOTAC, EVGA, ASUS and everyone else."

"The effect is mixed feelings, especially thanks to a more greenish GPU- GeForce GTX 980 Ti. Nvidia GPU performs much like Fury X out of the box, and is offered in addition to juicy overclocked versions from partners. With limited opportunities for overclock increases AMD's newcomer simply can not keep up."

"Nvidia's pre-emptive launch of the GeForce GTX 980 Ti uses a more efficient core and regular GDDR5 memory to achieve benchmark performances that are, in our opinion, a little better than the latest Radeon's, perhaps helped in small part by having a larger framebuffer. Partner GTX 980 Ti's are faster still and overclock better than Fury X."


----------



## criminal

Quote:


> Originally Posted by *Slomo4shO*
> 
> The 980 Ti has been on the market for 3 weeks... There is no reason why pricing couldn't be modified prior to release. AMD is the one responding with a new product, not the other way around...


AMD probably cut the price as far as they could afford considering it is new tech. Still overpriced though.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *szeged*
> 
> The only people I see not going are die-hard fan(boys) of AMD who will get the Fury X no matter what, owners of already high-end setups, people who can't afford it either way, and people waiting for 14/16nm.


I'm not going 980 Ti. I'm sticking with my 980 until Pascal comes out. The 980 Ti is a great card, but I already have too much invested in my 980 to bother trading it in for a 980 Ti (and ESPECIALLY not a Fury X... because I have 2 G-Sync monitors).

Is it 2016 yet?


----------



## mutantmagnet

For AMD's sake I hope DirectX 12 features help it surpass the Titan X.

Even with the premium for G-Sync vs FreeSync, there isn't a compelling reason to get a FreeSync display + a Fury besides some really low temps.


----------



## PostalTwinkie

Quote:


> Originally Posted by *47 Knucklehead*
> 
> I'm sorry, how could AMD NOT KNOW nVidia's pricing of the 980Ti? The 980Ti was released 3 weeks before the Fury X. AMD had all the time in the world to make a price change. nVidia was the one who tipped their pricing hand first, not AMD.


The problem with changing the price is that they might not have much room. Looking at the Fury X, that CLC isn't cheap, and it is a big die. I just don't know if AMD had planned enough margin in there to truly counter the price of a 980 Ti.

But could you blame them? Could we honestly have expected AMD to expect Nvidia to []D[][]V[][]D slap their own Titan owners? Nope! The 980 Ti dropped low, and I think lower than AMD ever thought it would.

Just a few thoughts.


----------



## Prophet4NO1

Guess I don't have to get rid of my Titan X cards.


----------



## Ganf

Quote:


> Originally Posted by *szeged*
> 
> i wonder how big the 980ti owners club is gonna get in the next few days.


Boycotting all of the clubs. I'm going to put black tape over the Geforce logo of my 980ti's and write BFG Voodoo X on it in White-out.


----------



## rv8000

Quote:


> Originally Posted by *Tippy*
> 
> I think it's more about AMD not expecting nVidia pricing 980 Ti so aggressively. Seriously, pretty much all rumors were putting 980 Ti at $700-800 up till the last minute until nVidia revealed the actual price and surprised the hell out of everyone.
> 
> If 980 Ti was $700 then Fury-X would've made perfect sense at $650.


This is just what I was thinking a moment ago. Nvidia probably had some rough idea of Fiji's performance and, being a good few months ahead of AMD in production, took advantage of the situation and went with really aggressive pricing, making their cards look that much better. Good business move.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ganf*
> 
> Boycotting all of the clubs. I'm going to put black tape over the Geforce logo of my 980ti's and write BFG Voodoo X on it in White-out.


I wish Nvidia would use the Voodoo name, fairly certain they acquired the rights to the names when they picked apart the corpse of 3DFX.

Maybe give the flagship Pascal card the Voodoo name! Oh that would be sweet....all nostalgic.
Quote:


> Originally Posted by *p4inkill3r*
> 
> I'll be ordering one today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel that it will only get more competitive as driver maturation sets in and being within 8-10% of nvidia's hottest card is good enough for me.


I missed your comment on this earlier.....

Good luck to you. My guess is the next driver set brings improvements and maybe voltage control. Allowing for that "Overclockers dream" we haven't seen yet.


----------



## Lansow

Quote:


> Originally Posted by *p4inkill3r*
> 
> I'll be ordering one today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel that it will only get more competitive as driver maturation sets in and being within 8-10% of nvidia's hottest card is good enough for me.


I... don't understand. You're practically rewarding AMD for failing to deliver on their promises, and pricing it far beyond what it should be. MIND BLOWN. People should be snapping up Nvidia's cards like hotcakes right now to send AMD a message; do NOT lie to us.


----------



## iLeakStuff

Quote:


> Originally Posted by *Ramzinho*
> 
> I'll have to be the devil's advocate here. Guys, we are talking about a $650 watercooled GPU that's trading blows with the 980 Ti. This is a day-1 release. With drivers maturing, this GPU will probably land between the 980 Ti and the Titan X. And you guys are not giving AMD any credit for having a two-year-old GPU, the 290X, still standing strong against the 980 in any unbiased (non-GameWorks) title. Be more reasonable. Love Nvidia as much as you want... IT IS YOUR RIGHT. But if AMD dies, we gamers will probably be back to the 2001 era, when we had to buy a new GPU with every new game release!


I said earlier that I think Fury is a fantastic piece of hardware. But I personally don't feel excited, and I feel disappointed after AMD's own graph, where it almost seemed like Fury X would beat the 980 Ti by 10-15%. I'd buy 1 or 2 cards today if that were the case. But now I'm left here thinking about what to do.

We saw today that HBM didn't leapfrog GDDR5 at high resolution, not even 4K.
It's an alternative to the 980 Ti, no doubt. But I don't think the result is enough to create big sales or excitement. Maybe it's just me bummed out, but after reading replies to reviews and forums I think many others feel the same.
Kudos for making a card as efficient as Maxwell, though. Only they are 1.5 years behind Nvidia getting the first Maxwell card out, or something.
Late. grrrr


----------



## romanlegion13th

AMD needed to beat the Titan X so people would buy their cards; that would make Nvidia bring better competition.

Now we get an Nvidia whitewash, so they might hold back on the next cards since there is no competition.

Not good for PC gamers.


----------



## Lex Luger

As much as I bash AMD, I'm not delusional. Pascal flagship will probably be 800 dollars or more and I'm expecting that will be the midrange x04 size chip.


----------



## Slomo4shO

Quote:


> Originally Posted by *criminal*
> 
> AMD probably cut the price as far as they could afford considering it is new tech. Still overpriced though.


Well, AMD's attempt at a "premium" card went up in flames









This launch was as poorly executed as Hawaii: a lot of hype, a reference design with a voltage lock and limited overclocking, and piss-poor timing. The saving grace would be if the Fury Pro has the exact same specs as the Fury X on air, but that is probably wishful thinking...









Quote:


> Originally Posted by *Lansow*
> 
> People should be snapping up Nvidia's cards like hotcakes right now


I take it you haven't paid any attention to GPU market share over the last year...


----------



## AndroidVageta

Quote:


> Originally Posted by *romanlegion13th*
> 
> AMD needed to beat the Titan X so people bought there cards, that would make Nvidia make better more competition
> 
> now we get a Nvidia white wash so they mite hold back on the next cards as there is no competition
> 
> not good for PC gamers


Did they? The Titan X is a $1000 card.


----------



## rdr09

Quote:


> Originally Posted by *provost*
> 
> Nvidia might have said they added support; in reality, the new drivers are highly unstable for a lot of people, including myself. I crash on any driver other than the latest iCafe drivers or 350.12, and those don't have the GK110 optimizations. I can't run any of the new drivers without crashing. I have tried all kinds of novelty fixes: HW acceleration off, PhysX to CPU, no GeForce, DDU, polling rate fix, etc, etc, etc (please don't say it's overclock related, as the whole system is at dead stock clocks)
> 
> and SLI support has been terrible for games for the past year or so


don't dare say anything bad . . .


----------



## Lansow

Quote:


> Originally Posted by *SKYMTL*
> 
> I did look at the reviews. 4GB isn't a limiting factor.
> *Removes backplate*
> 
> *Complains about temperatures*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The backplate allows for proper dissipation through the PCB of some of that heat. Not a great situation but the "heat rises" rule is in effect here.


Quote:


> Originally Posted by *SKYMTL*
> 
> 4GB of HBM is more than enough. I have yet to encounter a situation that uses more than 4GB of memory which doesn't cause a GPU bottleneck far before the memory becomes a limiting factor.


Very interesting. You are literally the only reviewer to make the claim that 4GB of VRAM has no impact. *Either everyone else is wrong, or you are.* I've read your reviews many times in the past, which leads me to believe that the problem is YOU, not everyone else.

I am expressly calling SKYMTL out. Put up some proof, or stop spreading nonsense. People look to you for unbiased reviews, and you're failing them.


----------



## PontiacGTX

Quote:


> Originally Posted by *Lansow*
> 
> Very interesting. You are literally the only reviewer to make the claim that 4GB of VRAM has no impact. *Either everyone else is wrong, or you are.* I've read your reviews many times in the past, which leads me to believe that the problem is YOU, not everyone else.


He didn't test above 4K...


----------



## Lansow

Quote:


> Originally Posted by *PontiacGTX*
> 
> He didnt tested above 4k...


Again with the nonsense. There are games that exceed 4GB at 1440p. 4k isn't even required to bump against those limits.

Even if that weren't the case, since this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.


----------



## p4inkill3r

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I missed your comment on this earlier.....
> Good luck to you. My guess is the next driver set brings improvements and maybe voltage control. Allowing for that "Overclockers dream" we haven't seen yet.


I ordered the Sapphire model from Amazon this morning.


----------



## rv8000

Quote:


> Originally Posted by *Lansow*
> 
> Very interesting. You are literally the only reviewer to make the claim that 4GB of VRAM has no impact. *Either everyone else is wrong, or you are.* I've read your reviews many times in the past, which leads me to believe that the problem is YOU, not everyone else.


In the 6 reviews that I've read through (toms, pcper, tpu, guru3d and so on), the performance gap tends to shrink between the 980ti and Fury X as you go up in res, and the tests don't show Fury X running into any issues @ 4k, Guru3D showing fantastic frametimes across the board. I'm still finding it very hard to believe all this VRAM limitation nonsense.

His review is also very consistent throughout and a pretty easy read as far as finding information. Definitely one of the better reviewers out there, even though his opening sounded a little sour on AMD and in favor of Nvidia.


----------



## PontiacGTX

Quote:


> Originally Posted by *Lansow*
> 
> Again with the nonsense. There are games that exceed 4GB at 1440p. 4k isn't even required to bump against those limits.
> 
> Even if that weren't the case, since this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.


The frame buffer allocated by those games doesn't mean all of it is actually being used. Maybe in the future 4GB won't be enough for 4K, but for now it is. Of course, this card doesn't amount to more than a placeholder for 2016, unless all new games are DX12 games and all use DX12 explicit multi-adapter.


----------



## Kane2207

Quote:


> Originally Posted by *Lansow*
> 
> Again with the nonsense. There are games that exceed 4GB at 1440p. 4k isn't even required to bump against those limits.
> 
> Even if that weren't the case, since this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.


Cached VRAM vs VRAM in use are two very different things.

I can hit 5.5GB VRAM usage on SoM @1080p, yet the Fury X gets better average frames despite having 50% less VRAM.

But saying that, I don't think I'd buy a 4GB card now, VRAM requirements are only going one way, and that's up...


----------



## Lansow

Quote:


> Originally Posted by *Slomo4shO*
> 
> I take it you haven't payed any attention to the GPU market share over the last year...


Yes, I have been paying attention. The results are because AMD hasn't released anything competitive. This was their only chance. You can fully expect them to dip below 10% this year.


----------



## Xuper

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Nope.
> 
> Cue up to the 35 minute mark and watch.
> 
> Fury X ... "1.5x performance per watt over the R9 290X" ... that is a lie, now proven. 36:30 time.
> 
> Fury Nano ... "Significantly more performance than the R9 290X, at half the size and half the power. ... 2x performance per watt." 37:50
> 
> There is NO WAY that string of lies is/was true. PERIOD.
> 
> Given the now known wattage and now known performance numbers, her numbers don't add up. And "significantly more performance" shouldn't mean 10-15%.
> 
> And for reference, here are the "performance per watt" numbers ...
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X/32.html
> 
> Notice the Fury X at 100% and the R9 290X at 80%/72%/75%/78% depending on resolution? Since when is that "1.5x performance per watt over the R9 290X"? It isn't. The CEO of AMD is a LIAR.
> 
> Who, me?


You are getting into trouble with your own post:

At 4K, R9 290X = 70%, Fury X = 100%. Now, how much faster is that? Set the R9 290X to 100% ==> Fury X = 1.428x. So with optimized drivers it can reach 1.5x.

More performance than the 290X = sure
At half the size = sure
At half the power = we don't know; the Fury X shows on par with the 290X.
2x performance per watt = perhaps near 2x, or 1.5x ~ 1.8x with optimized drivers.

Before you insult Lisa Su, just write carefully what you're saying.
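A quick sanity check of that ratio, assuming the rounded figures from TPU's 4K performance-per-watt chart (Fury X normalized to 100%, 290X at roughly 70%; the exact percentages vary slightly by resolution):

```python
# TPU's perf/W chart normalizes the Fury X to 100%; the R9 290X sits
# around 70% of that at 4K (72% in the chart, rounded here).
fury_x_ppw = 100.0
r9_290x_ppw = 70.0

# Relative improvement of the Fury X over the 290X in perf/W:
improvement = fury_x_ppw / r9_290x_ppw
print(f"Fury X is {improvement:.2f}x the 290X in perf/W")  # ~1.43x
```

So the measured gap is ~1.43x, which is why a modest driver-side efficiency gain would be enough to reach the claimed 1.5x.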


----------



## SSJVegeta

Looks like I'll be upgrading next year instead.


----------



## Slomo4shO

Quote:


> Originally Posted by *Lansow*
> 
> this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.
> 
> I am calling SKYMTL out. Put up some proof, or stop spreading nonsense. People look to you for unbiased reviews, and you're failing them.


Seems like you are just trying to rationalize your SLI Titan X purchase... You are calling someone out based on hypothetical scenarios that may or may not materialize down the line...


----------



## p4inkill3r

Quote:


> Originally Posted by *Lansow*
> 
> I... don't understand. You're practically rewarding AMD for failing to deliver on their promises, and pricing it far beyond what it should be. MIND BLOWN. People should be snapping up Nvidia's cards like hotcakes right now to send AMD a message; do NOT lie to us.


I'm buying a high-tech product from a company I have supported for 20 years that will replace my R9 290 CFX setup when I move to mitx.


----------



## bigjdubb

All this talk about the 4gb being the problem but when I look at the benchmarks it seems as though the performance gap gets smaller as the resolution goes up. What am I not understanding?


----------



## iatacs19

Did Anandtech skip the review of the Fury X?


----------



## MKHunt

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The problem with changing the price is that they might not have much room. Looking at the Fury X, that CLC isn't cheap, and it is a big die. I just don't know if AMD had planned enough margin in there to truly counter the price of a 980 Ti.
> 
> But could you blame them? Could we have honestly expected AMD *to expect Nvidia to []D[][]V[][]D slap their own Titan owners? Nope!* The 980 Ti dropped low, and I think lower than maybe AMD ever thought.
> 
> Just a few thoughts.


Hahahahaha. You didn't pay attention to Titan -> Titan Black / 780Ti did you?


----------



## criminal

Quote:


> Originally Posted by *Lansow*
> 
> Again with the nonsense. There are games that exceed 4GB at 1440p. 4k isn't even required to bump against those limits.
> 
> Even if that weren't the case, since this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.


Another person that doesn't understand the difference between vram use and vram allocation. Yay!


----------



## 47 Knucklehead

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The problem with changing the price is that they might not have much room. Looking at the Fury X, that CLC isn't cheap, and it is a big die. I just don't know if AMD had planned enough margin in there to truly counter the price of a 980 Ti.
> 
> But could you blame them? Could we have honestly expected AMD to expect Nvidia to []D[][]V[][]D slap their own Titan owners? Nope! The 980 Ti dropped low, and I think lower than maybe AMD ever thought.
> 
> Just a few thoughts.


Oh, I agree, I think that AMD has cut their price to the bone. Paying for that HBM R&D and higher cost has to drive up the BOM (Bill of Materials). As I said a month or more ago, nVidia has their spies and most likely got a good idea of how well the Fury was going to perform, and cut the 980 Ti price to the bone knowing full well that AMD can't win a price war. Initial estimates for both cards were well into the upper $700s/lower $800s, and I honestly think that nVidia wanted to make AMD pay, and pay dearly.

The price on Titans has always been high, and time after time, their owners knew that price tag and still bought them. Heck, the Titan X is still a hell of a card, even at $999 ... because it has 12GB of memory ... triple that of the Fury X ... so if you are going to do any serious multiple 4K monitor use, that's the card to have. Still, I wouldn't own one.

Again, I think that after knowing the 980 Ti's price for 3 weeks and STILL coming out at the same price, AMD doesn't really have any room to go lower, not without taking it directly on their bottom line, which honestly, they can't afford anymore. I think they gave their developers as much time as possible to pull out some miracle with drivers or BIOS, and this is the best they could do, even after knowing the 980 Ti's numbers.


----------



## Ganf

Quote:


> Originally Posted by *iatacs19*
> 
> Did Anandtech skip the review of the Fury X?


They're busy cooking the books to put the Fury up there trading blows with the Titan X to stir up some internet drama and get dem clicks.


----------



## Alatar

Quote:


> Originally Posted by *iatacs19*
> 
> Did Anandtech skip the review of the Fury X?


https://twitter.com/RyanSmithAT/status/613661664479612929


----------



## PostalTwinkie

Quote:


> Originally Posted by *MKHunt*
> 
> Hahahahaha. You didn't pay attention to Titan -> Titan Black / 780Ti did you?


I have a 780 Ti.


----------



## Kaltenbrunner

Well, I think I'm happy enough with the benches. The only thing is they are priced the same as the 980 Ti, but I guess the benches are back and forth, plus AMD might get some newer drivers. I'll get one or the other open-box over the summer and be happy either way.

BYE BYE crossfire problems for once


----------



## GorillaSceptre

Damn, these 980Ti's are OC monsters..

Even with improved drivers, there's no way the Fury can compete with Maxwell's OC ability.

Give credit where credit is due, it's impressive how well GM200 can clock, i didn't think they would be able to get anywhere near the smaller dies.


----------



## lacrossewacker

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> Well, I think I'm happy enough with the benches. The only thing is they are priced the same as the 980 Ti, but I guess the benches are back and forth, plus AMD might get some newer drivers. I'll get one or the other open-box over the summer and be happy either way.
> 
> BYE BYE crossfire problems for once


benches are back and forth?

Maybe I'm missing it, but the 980 Ti is consistently ahead in most games at stock.

Let's not forget the 980 Ti has some large gains in OCing as well.

As for "AMD's driver team being their magic fairy dust"... ain't happening.


----------



## kizwan

Quote:


> Originally Posted by *criminal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lansow*
> 
> Again with the nonsense. There are games that exceed 4GB at 1440p. 4k isn't even required to bump against those limits.
> 
> Even if that weren't the case, since this card clearly isn't powerful enough for 4k on its own you'll need two of them for high detail levels and acceptable framerates... where the 4GB will AGAIN be a limiting factor.
> 
> 
> 
> Another person that doesn't understand the difference between vram use and vram allocation. Yay!

Many people actually. I've come across with one quite recently.


----------



## Kuivamaa

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Very disappointing results for Fiji. I cant think of any reasonable excuse to recommend this card over a 980Ti. Guess we all just have to admit that Nvidia is just dominant and theres nothing AMD can do about it. By all means, if you want the best card available today get a 980Ti (preferrably a non reference one). Congrats to all of you that snagged a 980Ti already. You were right...


After reading the reviews I just don't see it that way. Granted, I ignore 1080p altogether (it is the DSR/VSR era), but still, the perception of performance largely depends on the titles used. Throw Dying Light, Project CARS, Batman: Arkham Origins, or the latest CoD into the mix and Fury seems like garbage because it goes toe to toe with a 980. Bench Ryse, Absolution, or even Far Cry 4 (weird that Fiji does so well in an Nvidia title, but it happens I guess) like cb.de did and it edges the Ti and matches the Titan X. I understand that people expected a Titan X killer, but I really have no reason to be disappointed with this card. Just another nice option in the $650 bracket alongside the very potent 980 Ti. Same performance bracket, really. You check your games and choose accordingly. The game that will drive my next GPU purchase is Battlefront personally, with Star Citizen as an afterthought (because yeah, good luck seeing it in the next 12 months), technically the next Deus Ex too, but that one will run smoothly on anything. So I will wait it out till I see what DX12 brings. 6 gigs are a good selling point for the Ti right now, as I do not think ports will ever ask for more; it seems like the sweet spot for this generation. But I still count on HBM bandwidth to find a very good niche really soon too.


----------



## tiborrr12

GM107 and Bonaire are more important for NVIDIA and AMD than any big die when it comes to OEM sales. It's where the real money is. Sadly both chips are now ~2 years old.


----------



## Alatar

Not sure if someone else already noticed this however I find this a bit worrying:
Quote:


> driver modifications needed to properly manage 4GB of memory on a wider, but slower, memory bus are still being perfected. *AMD told me this week that the driver would have to be tuned "for each game".* This means that AMD needs to dedicate itself to this cause if it wants Fury X and the Fury family to have a nice, long, successful lifespan.


http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-

The memory management stuff they were talking about has to be done individually for each game?


----------



## SpeedyVT

Quote:


> Originally Posted by *Kuivamaa*
> 
> After reading the reviews I just don't see it that way. Granted, I ignore 1080p altogether (it is the DSR/VSR era), but still, the perception of performance largely depends on the titles used. Throw Dying Light, Project CARS, Batman: Arkham Origins, or the latest CoD into the mix and Fury seems like garbage because it goes toe to toe with a 980. Bench Ryse, Absolution, or even Far Cry 4 (weird that Fiji does so well in an Nvidia title, but it happens I guess) like cb.de did and it edges the Ti and matches the Titan X. I understand that people expected a Titan X killer, but I really have no reason to be disappointed with this card. Just another nice option in the $650 bracket alongside the very potent 980 Ti. Same performance bracket, really. You check your games and choose accordingly. The game that will drive my next GPU purchase is Battlefront personally, with Star Citizen as an afterthought (because yeah, good luck seeing it in the next 12 months), technically the next Deus Ex too, but that one will run smoothly on anything. So I will wait it out till I see what DX12 brings. 6 gigs are a good selling point for the Ti right now, as I do not think ports will ever ask for more; it seems like the sweet spot for this generation. But I still count on HBM bandwidth to find a very good niche really soon too.


Everyone is busy gawking at the games crushed by GameWorks and asking why AMD doesn't get better frames at 4K. The Fury is better than the 980 Ti only when you consider this.


----------



## Master__Shake

Quote:


> Originally Posted by *Ramzinho*
> 
> Who needs a $1000 GPU when this is the future of PC gaming
> 
> tweeted by Scott Wasson


----------



## SoloCamo

Wow. Rushed home from work and... wow. Yeah, the disappointment is real. Going to stick to my guns and skip this gen, but if I weren't, it'd be the 980 Ti at this point. Custom PCBs & 2GB more VRAM. Even if the Fury X OCs a bit more and the drivers mature a bit, it really won't justify the 4GB limit to those on higher res. And at best I feel it would match an OC'd 980 Ti, as they seem to be beasts.

I'm more disappointed now than I was when I thought replacing my Ti4200 with an FX5200 was a good idea


----------



## Ganf

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?


Welp, that could explain a few things.

I sure as hell hope that AMD has some neat tricks for streamlining this process, or has found a way to zombify their driver team and enslave them for 24/7 productivity.


----------



## Remij

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?


Eeesh, hopefully AMD's driver team will start to be on the ball with timely releases, and hopefully they improve their CrossFire support.

It's still too early to talk about DX12, however. I wonder how performance will translate in the coming year with DX12 titles.


----------



## Kuivamaa

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?


Drivers ideally have to be tuned for every game. Major releases always get this treatment anyway, so no real news.


----------



## PontiacGTX

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?


Drivers need tuning to use HBM properly. It's like the current drivers weren't designed for HBM at all.


----------



## szeged

Inb4 a 10-month wait for drivers that fix things that should have been fixed a long time ago. But wait... they have special names and banners and presentations! Calling your drivers Omega and stuff like that just screams "please ignore the fact we are so late on this" to me.


----------



## Kuivamaa

Quote:


> Originally Posted by *Master__Shake*


Yeah, this is depressing. What were they thinking?


----------



## Sycksyde

Quote:


> Originally Posted by *Slomo4shO*
> 
> Well, AMD's attempt at a "premium" card went up in flames


Interesting, things are indeed dire for AMD, even more now with this latest dud.


----------



## Ha-Nocri

Quote:


> Originally Posted by *SoloCamo*
> 
> Wow. Rushed home from work and... wow. Yea, the disappointment is real. Going to stick to my guns and skip this gen, but if I wasn't it'd be the 980ti at this point. Custom pcb's & 2gb more vram. Even if the Fury X oc's a bit more and the drivers mature a bit it really won't justify the 4gb limit to those on higher res. And at best I feel it would match a oc'ed 980ti as they seem to be beasts.
> 
> I'm more dissapointed now then I was when I thought replacing my ti4200 with a fx5200 was a good idea


4GB of VRAM won't be an issue. The Fury X is faster than the 980 Ti @ 4K, except in some newer titles. If you are on 1080p or 1440p you'll be fine.


----------



## PontiacGTX

Quote:


> Originally Posted by *szeged*
> 
> Inb4 10 month wait for drivers that fix things that should have been fixed a long time ago, but wait... They have special names and banners and presentations! Calling your driver's omega and stuff like that just screams "please ignore the fact we are so late on this" to me.


That sounds like the 8-month-old Kepler drivers? Next month that problem will be around 9 months old.


----------



## Vesku

Quote:


> Originally Posted by *Slomo4shO*
> 
> The 980 Ti has been on the market for 3 weeks... There is no reason why pricing couldn't be modified prior to release. AMD is the one responding with a new product, not the other way around... Also, there is nothing aggressive about the 980 Ti being $650... In fact, if anything, both the 980 Ti and Fury X are overpriced.


Lisa Su said that, as CEO, she's going to try to hold the line on prices across the board. Perhaps they will try to more tightly control the amount of stock in the channel instead of dropping prices as often.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> Well, AMD's attempt at a "premium" card went up in flames
> 
> This launch was as poorly executed as Hawaii. A lot of hype, a reference design with voltage lock and limited overclocking, and piss-poor timing. The saving grace would be if the Fury Pro has the exact same specs as the Fury X on air, but that is probably wishful thinking...
> 
> I take it you haven't paid any attention to the GPU market share over the last year...


I'm a glass-half-full kind of guy.

I see that graph and I'm like, "hey, that's just 76 percent of people who could be AMD's potential customers".


----------



## toothpick

Here's the issue I have with saying "It's just launched, wait a couple of (days|weeks|months) and we'll see the real results." This company is launching their flagship, no holds barred, this is the pinnacle of our engineering product. They know that it is going to be reviewed most heavily and decisions on how consumers spend $650 will rely on how it does on that day. So either they did their best and this is what they have or this is the effort you can expect from their driver team at the most critical aspect of the product lifespan. Why should I be expected to pay $650 now and hope that it gets better later?


----------



## iamhollywood5

This card needs to be priced at $599 or even $549. You'd have to be insane to pay the same price for less performance and two fewer gigabytes of essential VRAM, plus the hassle of mounting a radiator somewhere in the case.

Problem is, I'm guessing the margins on this card are already slim as it is; I'm not sure they could even break even if they charged less. Then again, they didn't spend anything on R&D for a new architecture, so who knows. AMD is a sinking ship.


----------



## LandonAaron




----------



## Myst-san

Here is one more review HardwareBG


----------



## SpeedyVT

I wouldn't buy either the 980 Ti or the Fury; $650 is a huge amount to throw at a single card. You can nearly build a computer for that price. All the games that matter don't have lousy performance issues like Batman. The other thing is that Nvidia has been slowly crippling their older cards and AMD's cards with GameWorks; that is something we should not stand for.


----------



## Master__Shake

Quote:


> Originally Posted by *toothpick*
> 
> Here's the issue I have with saying "It's just launched, wait a couple of (days|weeks|months) and we'll see the real results." This company is launching their flagship, no holds barred, this is the pinnacle of our engineering product. They know that it is going to be reviewed most heavily and decisions on how consumers spend $650 will rely on how it does on that day. So either they did their best and this is what they have or this is the effort you can expect from their driver team at the most critical aspect of the product lifespan. Why should I be expected to pay $650 now and hope that it gets better later?


I agree with this.

When Nvidia launched the Titan X there was no need for drivers to mature, and when they released the 980 Ti it was already better than the Titan X when overclocked.

Is AMD like cheese or wine?

Does it need to age to get better?

GPUs are not an investment in future maturity; they are a purchase for use NOW.


----------



## Alatar

Seems like a local site over here got a sample too:

1440p: http://muropaketti.com/artikkelit/naytonohjaimet/amd-radeon-r9-fury-x-fiji,4
4K: http://muropaketti.com/artikkelit/naytonohjaimet/amd-radeon-r9-fury-x-fiji,5


----------



## PontiacGTX

Some more 4k benchmarks http://www.digitalstorm.com/unlocked/amd-fury-x-performance-benchmarks-idnum360/


----------



## GorillaSceptre

Quote:


> Originally Posted by *Master__Shake*
> 
> i agree with this.
> 
> when nvidia launched the titanx there was no need for drivers to mature and when they released the 980ti it was already better than the titan x when over clocked.
> 
> is amd like cheese or wine?
> 
> does it need to age to get better?
> 
> gpu's are not an investment for future maturity they are a purchase for use NOW.


Agreed, and in this case, i think most of us waited long enough already.


----------



## SoloCamo

Quote:


> Originally Posted by *Master__Shake*
> 
> is amd like cheese or wine?
> 
> does it need to age to get better?
> 
> gpu's are not an investment for future maturity they are a purchase for use NOW.


While it has worked in my favor as a 290X owner, this is why I'm even more hesitant moving forward after seeing this card's launch.

Two options:

Get lesser performance now and better later = AMD

Get solid performance now and early performance abandonment later = Nvidia


----------



## Papermilk

Just checked PC Case Gear here in Australia and pretty much all 980 Tis have sold out overnight. I'm surprised how many people were waiting to see how Fury turned out.


----------



## scorpscarx

They just need to sell and compete with Nvidia; not much else matters. Applaud them for bringing the new product to market. AMD is just spread too thin; that rumor of dividing the company might prove fruitful after all.


----------



## SpeedyVT

Technically the most powerful single slot card still belongs to AMD with the 295x2. TECHNICALLY.


----------



## FallenFaux

Quote:


> Originally Posted by *SoloCamo*
> 
> While it has favored me owning a 290x, this is why I'm even more hesitant moving forward after seeing this card's launch.
> 
> Two options:
> 
> Get lesser performance now and better later = AMD
> 
> Get solid performance now and early performance abandonment later = Nvidia


I've never once regretted my choice of dropping my 780 for a 290x because we got the best of both worlds, super fast at launch and then it just kept getting better and better. However, if this is any indication of what Nvidia vs AMD will look like at 14/16nm I'll probably just go Nvidia. I'm just hoping that Nvidia can fix the hot mess that has been their drivers lately.


----------



## ebduncan

Quote:


> Originally Posted by *Master__Shake*
> 
> i agree with this.
> 
> when nvidia launched the titanx there was no need for drivers to mature and when they released the 980ti it was already better than the titan x when over clocked.
> 
> is amd like cheese or wine?
> 
> does it need to age to get better?
> 
> gpu's are not an investment for future maturity they are a purchase for use NOW.


Well, by that logic the Fury X will stomp all over the 980 Ti when Nvidia releases their next-gen stuff. Kind of how the 290X beats the 780 Ti in most things now.

All things considered, I think it's a decent launch for them. It's clearly superior to the 980, costs a bit more, but comes water cooled from the factory. Voltage unlocks will tell the story on how well they overclock. DX12 will even the playing field when it comes to CPU driver overhead. I'd fully expect to see the Fury X trading blows with the 980 Ti in the near future with better drivers, DX12, and voltage overclocking.

The one thing that perplexes me is how the Fury X has higher compute performance than the Titan X on all of AMD's slides; 8.6 teraflops is noted, where the Titan X is stated to have 6.6 teraflops. Shouldn't this translate better into the real-world benchmarks?
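For reference, those teraflop figures are just peak FP32 throughput: shader count × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick sketch, assuming the cards' published shader counts and rated clocks (4096 SPs @ 1050 MHz for Fury X, 3072 CUDA cores @ ~1075 MHz boost for Titan X):

```python
# Peak FP32 throughput = shaders * 2 FLOPs/clock (FMA) * clock speed
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

fury_x  = peak_tflops(4096, 1050)  # 4096 stream processors @ 1050 MHz
titan_x = peak_tflops(3072, 1075)  # 3072 CUDA cores @ ~1075 MHz boost

print(f"Fury X:  {fury_x:.1f} TFLOPs")   # ~8.6
print(f"Titan X: {titan_x:.1f} TFLOPs")  # ~6.6
```

Peak TFLOPs only measure raw shader ALU throughput, though; games are frequently limited by geometry, ROPs, memory access patterns, or drivers, which is why the paper advantage doesn't show up in the benchmarks.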


----------



## PontiacGTX

Quote:


> Originally Posted by *SpeedyVT*
> 
> Technically the most powerful single slot card still belongs to AMD with the 295x2. TECHNICALLY.


That's a dual-slot card.


----------



## youra6

Quote:



> Originally Posted by *SpeedyVT*
> 
> Technically the most powerful single slot card still belongs to AMD with the 295x2. TECHNICALLY.


Technically, the 295x2 isn't a single slot card.  I think you meant to say most powerful card on a single PCB.


----------



## SpeedyVT

Quote:


> Originally Posted by *PontiacGTX*
> 
> thats double slot


I meant a single PCIe slot.


----------



## Offender_Mullet

Quote:


> Originally Posted by *PontiacGTX*
> 
> Some more 4k benchmarks http://www.digitalstorm.com/unlocked/amd-fury-x-performance-benchmarks-idnum360/


Here are their Fury X Crossfire benches: https://www.youtube.com/watch?v=rUY32Mq4dlY


----------



## SpeedyVT

Quote:


> Originally Posted by *youra6*
> 
> Technically, the 295x2 isn't a single slot card. I think you meant to say most powerful card on a single PCB.


Exactly.


----------



## svenge

While Fury-X seems to be a decent enough card, its performance (slightly under or equal to the 980 Ti) and relative pricing (equal to the 980 Ti) isn't likely going to "move the needle" significantly in AMD's favor. It's certainly not a disaster, but Fury-X doesn't seem to be the savior that AMD backers were hoping for in the "GPU wars".

Also, if Fury-X requires a CLC in order to run at acceptable temperatures, the implications for any air-cooled Fury (non-X) cards may be significant in terms of a performance hit. They're still "GCN 1.2" based, so there's not much new magic that can be done with the architecture itself.


----------



## anujsetia

It is sometimes amazing how OCN reacts. I have gone through all the reviews in this thread.

It's quite clear the GTX 980 Ti is faster than Fury at resolutions below 4K.

But after checking so many reviews, I think the Fury X is at least equal in performance to the 980 Ti at 4K, if not faster.

Two games are really killing AMD in most reviews: 'Project CARS' and 'Dying Light'.

Other than that, it is a very good showing by the Fury X.

The disappointing thing is it is voltage locked as of now, so we don't know much about overclocked performance.

HBM is new tech altogether and I am sure the drivers are not optimized yet. I can easily foresee the Fury X being at least equal in performance to the Titan X three months down the line


----------



## szeged

Still waiting for that kitguru review


----------



## PontiacGTX

The R9 290X2 is going for $520 open box

http://www.overclock.net/t/1561945/newegg-open-box-powercolor-r9-290x2-deivl13-bundle-with-free-razer-ouroboros-520usd


----------



## FallenFaux

Quote:


> Originally Posted by *SpeedyVT*
> 
> Technically the most powerful single slot card still belongs to AMD with the 295x2. TECHNICALLY.


If we go by that metric then technically AMD has had the fastest GPU since 2013 and still does.


----------



## yesitsmario

Quote:


> Originally Posted by *criminal*
> 
> I still want a 295X2.


There was a deal on a 295X2 for $549 AR last time, good bang for the buck.


----------



## GorillaSceptre

Doesn't the Fury's poor OC headroom indicate the card is already pushed near its limit?

All these 4K benches are against stock 980 Tis, which makes the Fury look better than it really is, imo.


----------



## infranoia

Quote:


> Originally Posted by *SoloCamo*
> 
> While it has favored me owning a 290x, this is why I'm even more hesitant moving forward after seeing this card's launch.
> 
> Two options:
> 
> Get lesser performance now and better later = AMD
> 
> Get solid performance now and early performance abandonment later = Nvidia


This. I'm tempted to pick up a 980Ti, but going Nvidia puts you on a pretty rapid upgrade treadmill. This 290x has been an incredible value and has gone toe-to-toe with silicon that it had no right to, now especially with a CLC.

I'm not made of money, so I'm looking at the long game.


----------



## CBZ323

This card would have been a great option if Nvidia hadn't released the 980 Ti at that price.

Right now AMD needs to work on the drivers and hopefully capture some more market share.

I want proper competition in this market.


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Doesn't the Fury's poor OC headroom indicate the card is already pushed near it's limit?
> 
> All these 4k benches are against stock 980Ti's, it makes the Fury look better than it really is imo.


It means that nobody has OC'd on unlocked voltages yet.


----------



## xundeadgenesisx

Quote:


> Originally Posted by *youra6*
> 
> Quote:
> 
> Technically, the 295x2 isn't a single slot card.
> 
> I think you meant to say most powerful card on a single PCB.


Technically the Ares 3 is a 295x2 and single slot.


----------



## BinaryDemon

Speaking of dual GPU cards... is there going to be a rebranded upgraded 295x2? I don't see why they couldn't have a R9 395x2 16gb (8gb per GPU) available next week, since it would still be GDDR5 based. Or will all development focus go to the Fury X2?


----------



## KyadCK

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Doesn't the Fury's poor OC headroom indicate the card is already pushed near it's limit?
> 
> All these 4k benches are against stock 980Ti's, it makes the Fury look better than it really is imo.


For the millionth time...

It indicates that there is no voltage control yet, and Afterburner (and all the others) need to catch up. As every single review has said.


----------



## Kaltenbrunner

Well, seriously, when do people think Nvidia will have their next gen? Next spring/summer?

And when do people think AMD will have theirs? Next summer/fall?

No way am I waiting that long


----------



## ZealotKi11er

Quote:


> Originally Posted by *BinaryDemon*
> 
> Speaking of dual GPU cards... is there going to be a rebranded upgraded 295x2? I don't see why they couldn't have a R9 395x2 16gb (8gb per GPU) available next week, since it would still be GDDR5 based. Or will all development focus go to the Fury X2?


Fury X2 100% coming.


----------



## Master__Shake

Quote:


> Originally Posted by *ebduncan*
> 
> *well by that logic the Fury X will stomp all over the 980ti when Nvidia releases their next gen stuff. Kinda how the 290x beats the 780 Ti in most things now.
> *
> All things considered I think it's a decent launch for them. It's clearly superior to the 980, costs a bit more, but comes water cooled from the factory. Voltage unlocks will tell the story on how well they overclock. DX 12 will even the playing field when it comes to CPU driver overhead. I'd fully expect to see the Fury X trading blows with the 980 Ti in the near future with better drivers, DX 12, and voltage overclocking.
> 
> The one thing that complexes me is how the Fury X has larger compute performance than the Titan X on all AMD's Slides, the 8.6 teraflops is noted. Where the Titan X is stated to have 6.6 teraflops. Shouldn't this translate better into the real world benchmarks?


Exactly. By the time that happens, newer cards are already out.

I can see how that's a plus for hangers-on or second-hand card buyers, but to entice people into dropping close to seven bills on a GPU today you really have to have something worth it.

This card isn't.

AMD shines at $550; they always have. This is just a poor showing at $650.
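On the teraflops question raised in the quote above: those marketing numbers fall straight out of peak FP32 throughput = 2 FLOPs per shader per clock (one fused multiply-add) × shader count × clock speed. A quick sanity check as a sketch, using the publicly listed shader counts and approximate boost clocks (not measurements of any particular card):

```python
def peak_tflops(shaders, clock_mhz):
    # One FMA per shader per clock counts as 2 FLOPs
    return 2 * shaders * clock_mhz * 1e6 / 1e12

fury_x = peak_tflops(4096, 1050)   # Fiji: 4096 shaders at 1050 MHz -> ~8.6 TFLOPs
titan_x = peak_tflops(3072, 1075)  # GM200: 3072 shaders at ~1075 MHz boost -> ~6.6 TFLOPs
```

Peak figures like these assume every shader retires an FMA every clock; real games rarely come close, which is one reason a ~30% theoretical compute lead doesn't show up as a 30% lead in benchmarks.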


----------



## sugalumps

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Damn, these 980Ti's are OC monsters..
> 
> Even with improved drivers, there's no way the Fury can compete with Maxwell's OC ability.
> 
> Give credit where credit is due, it's impressive how well GM200 can clock, i didn't think they would be able to get anywhere near the smaller dies.


Indeed, Maxwell is the true OCer's dream.


----------



## infranoia

If the Fury X2 doesn't have HDMI 2.0 I'm giving up on this industry and going back to AGP cards. 6800GT here I come. There, there, I know, daddy missed you too.


----------



## Kane2207

Quote:


> Originally Posted by *infranoia*
> 
> This. I'm tempted to pick up a 980Ti, but going Nvidia puts you on a pretty rapid upgrade treadmill. This 290x has been an incredible value and has gone toe-to-toe with silicon that it had no right to, now especially with a CLC.
> 
> I'm not made of money, so I'm looking at the long game.


Hold tight then, if some forum members get their wish and AMD spins off ATI to Samsung.

Their product shelf life is measured in years; unfortunately, though, they're dog years


----------



## Master__Shake

Quote:


> Originally Posted by *Kane2207*
> 
> Hold tight then if some forum members get their wish and AMD spins off ATI to Samsung.
> 
> Their product shelf life is measured in years, unfortunately though they're dog years


I'd rather AMD get a cash injection from Samsung than have Samsung own them.


----------



## mltms

They manage the power very well in the Fury X; I wonder how it will be on the Fury Nano.


----------



## Kane2207

Quote:


> Originally Posted by *Master__Shake*
> 
> i'd rather amd get a cash injection from samsung rather than have samsung own them,


You don't fancy a minor revision of the same flagship released every other month then? In 3 years we'd have a Fury X 1-6, Fury Active, Fury Ace, Fury Edge, Fury Note etc and none of them would see support past 6-9 months lol


----------



## Rickles

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?


Quote:


> Originally Posted by *Rickles*
> 
> I was under the impression that the more stacks the wider the bus.
> 
> So a 4GB card would give you a 4096 bit bus and a 8GB card would give you a 8,192-bit bus, but I could be wrong as I haven't looked into it much yet.
> 
> HBM will be a great feature when it is actually needed, and written in.


Saw that one coming two weeks ago...
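On the bus-width question in the quote above: the arithmetic there is right. Each first-generation HBM stack presents a 1024-bit interface, so the aggregate bus scales with stack count, and bandwidth follows from bus width and the double-pumped memory clock. A sketch using Fiji's public figures (the 8GB/8-stack configuration mentioned in the quote was speculation, not a shipping part):

```python
def hbm_bus_width_bits(stacks, bits_per_stack=1024):
    # First-gen HBM: each stack exposes a 1024-bit wide interface
    return stacks * bits_per_stack

def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    # width (bits) x clock x DDR factor, divided by 8 bits/byte -> GB/s
    return bus_bits * clock_mhz * 1e6 * transfers_per_clock / 8 / 1e9

fiji_bus = hbm_bus_width_bits(4)         # four stacks -> 4096-bit bus
fiji_bw = bandwidth_gb_s(fiji_bus, 500)  # 500 MHz HBM -> 512 GB/s
```

An 8-stack part would indeed mean an 8192-bit bus, but HBM1 stacks top out at 1GB each, which is exactly why the Fury X is a 4GB card.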


----------



## szeged

My point was about giving the driver a special name and a special presentation when all it was was a driver fixing things they'd ignored for so long, but apparently your natural reaction is to just ignore the subject and attack Nvidia users


----------



## GetToTheChopaa

Gotta agree with all who say this card isn't all that for $650! I waited for it before upgrading my 780, which still serves me well @1440p, I must say.
It just seems that everything works against me buying the Fury: no DVI for my Qnix monitor, slower than the competition (which has DVI), same price as the better-performing competition...
This card for $550 would be awesome! It just isn't a compelling choice when you have $650 980 Tis around.


----------



## Master__Shake

Quote:


> Originally Posted by *Kane2207*
> 
> You don't fancy a minor revision of the same flagship released every other month then? In 3 years we'd have a Fury X 1-6, Fury Active, Fury Ace, Fury Edge, Fury Note etc and none of them would see support past 6-9 months lol



uh yep.

that about covers it.


----------



## 47 Knucklehead

I bet the people over at KitGuru, and especially Leo Waldock, are laughing their asses off over this right now.

Given that the reviews are pretty much across the board about the Fury X being "too little, too late" and largely 3% slower than a GTX 980 Ti with less overclocking headroom, I wonder if AMD will be giving ANYONE a review card in the future.


----------



## Orthello

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Doesn't the Fury's poor OC headroom indicate the card is already pushed near it's limit?
> 
> All these 4k benches are against stock 980Ti's, it makes the Fury look better than it really is imo.


Well, it's pushed more at its default voltage, yes. AMD played it safe re: voltage to stay inside a power envelope comparable to the 980 Ti, I think.
Quote:


> Originally Posted by *KyadCK*
> 
> For the millionth time...
> 
> It indicates that there is no voltage control yet, and Afterburner (and all the others) need to catch up. As every single review has said.


Yes, I totally agree: no comparison about overclocking can be made until AB has voltage control and memory control unlocked and we can see what it can do. Let's not forget a default 980 Ti boosts to 1150+ MHz in most games (the real figure depends on the sample), while the Fury X is capped at 1050 MHz, i.e. a max boost setting versus NV's. So the Ti has a 10%+ clock lead in most scenarios at default.

With a couple of driver tweaks and ~1300 MHz clocks once voltage control comes, I think the Fury will trade blows with the 980 Ti pretty comfortably. AMD has a way of getting longevity out of their hardware that NV does not seem to have, so I expect it to have long legs.

I have TXs myself and NV cards, so I have no bias. I think we have a very poor set of day-1 drivers for some games, which has not helped, e.g. the dips in The Witcher 3; if that can be sorted out it will put this card right up there.

I'm looking forward to subscribing to the Fury X owners club and seeing what happens over the next couple of months; it's still an exciting card IMHO.


----------



## Forceman

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Gotta agree with all that say this card isn't all that for 650$! I waited for it before upgrading my 780, which still serves me well @1440p I must say.
> It just seems that everything works against me buying the Fury, no DVI for my Qnix monitor, slower than the competition (which has DVI), same price as the better performing competition.....
> This card for 550$ would be awesome! It just isn't a compelling choice, when you have 650$ 980tis around.


Now we just need some non-X leaks so we can see if AIBs are adding DVI back in. Three weeks is too long to wait.


----------



## sugalumps

Quote:


> Originally Posted by *szeged*
> 
> My point was giving the driver a special name and special presentation when all it was was a driver that fixed things they ignored for so long but apparently your natural reaction is to just ignore the subject and try to attack nvidia users


Apparently in his world two wrongs do make a right.
Quote:


> Originally Posted by *47 Knucklehead*
> 
> I bet you the people over at KitGuru, and especially Leo Waldock, are laughing their ass off over this right now.
> 
> Given the reviews that pretty much are across the board about the Fury X being "Too little, too late" and largely 3% slower than a GTX 980Ti with less overclocking headroom, I wonder if AMD will be giving ANYONE a review card in the future.


Yeah, I was thinking of that earlier; I bet he has been laughing all day. Especially at all the comments attacking the video, shouting "the Fury X is going to destroy" etc.


----------



## maarten12100

Quote:


> Originally Posted by *Slomo4shO*
> 
> 1/.7 = 1.43 = 43% more efficient over the R9 290X
> 1/.75= 1.33 = 33% more efficient over the R9 290X
> 
> Definitely not 50% by any stretch of the imagination...


Some of the games did exceptionally badly and dragged the performance average down. If they had benched just the titles in the review guide, which excludes the extreme cases, and had run the 15.6 driver, they would have found 50%.

The claim is pretty much true as per their testing, and reviewers can find the same. No one has required AMD to test every game in existence to establish it. Another thing: performance-per-watt figures are produced by taking average performance and dividing by average power from a totally different scenario, and power draw is not constant across different games.

Plenty of sources of error. Now, I'm not defending this card, but the claim isn't false. Actually, every performance-per-watt claim ever made depends on the selection of benches used to determine it, along with many other factors.

If you don't believe me, even though I've explained it as well as it can be explained, go look at the GTA V bench; if you don't think that brings the average down, then I can't help you.
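The point about bench selection is easy to demonstrate with made-up numbers (everything below is hypothetical, purely to show the mechanics): averaging fps and watts over different game sets moves the final perf/W ratio around, so the same two cards can support different "efficiency" claims.

```python
# Hypothetical (fps, watts) results per game -- illustration only, not real data.
fury_x  = {"game_a": (60, 250), "game_b": (55, 240), "gta_v": (45, 260)}
r9_290x = {"game_a": (45, 290), "game_b": (40, 280), "gta_v": (44, 295)}

def perf_per_watt(results, titles):
    # Average fps divided by average power over the chosen titles
    avg_fps = sum(results[t][0] for t in titles) / len(titles)
    avg_watts = sum(results[t][1] for t in titles) / len(titles)
    return avg_fps / avg_watts

all_titles = list(fury_x)
guide_titles = ["game_a", "game_b"]  # a "review guide" list that drops the outlier

gain_all = perf_per_watt(fury_x, all_titles) / perf_per_watt(r9_290x, all_titles)
gain_guide = perf_per_watt(fury_x, guide_titles) / perf_per_watt(r9_290x, guide_titles)
# gain_all ~ 1.43x, gain_guide ~ 1.57x: same cards, different efficiency number
```

Same hardware, two defensible-sounding figures; which one ends up on a marketing slide is a choice of test suite.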


----------



## Rei86

As for me, like I already said: MONEY TALKS!

But I don't get why most of you are calling for this card to be cheaper. I'd get it if it came in at the stupid $1K point the Titan X is sitting at. However, this is priced right next to the GTX 980 Ti, and in most of the performance graphs these cards are right about in line with the 980 Ti. Sure, maybe one to five percent off, but close enough. So I don't see a reason for them to lower the price.

And again, this is the flagship in AMD's lineup. If I were in AMD's camp, I would not lower the price of these cards at all. As for raising the price... nope.

Now we wait for the R9 Fury and R9 Nano and see what they might bring to the table. Sorry that the Fury X didn't bust the 980 Ti's balls, so now it's up to the Fury and Nano to exceed our expectations.


----------



## svenge

Quote:


> Originally Posted by *Master__Shake*
> 
> i'd rather amd get a cash injection from samsung rather than have samsung own them,


The Emirate of Abu Dhabi already owns 1/3 of AMD's stock ("West Coast Hitech, L.P." has 18% and "Mubadala Development" another 15%), along with Global Foundries, so I don't see a scenario where Samsung buys AMD without Abu Dhabi's consent.


----------



## sugalumps

Quote:


> Originally Posted by *Rei86*
> 
> As for me, like I already said. MONEY [email protected][email protected]!
> 
> But I don't get why most of you guys are calling for this card to be cheaper. I get if it came in at the stupid 1k point that the Titan X is setting at. However this is priced right next to the GTX 980Ti. And from most of the performance graphs these cards are right about inline with the 980Ti in terms of performance. Sure maybe a one or five percent off, but close enough. So I don't see it as a reason for them to lower the price.
> 
> And again this is in AMD's lineup their Flagship. I would not in AMD's camp lower the price of these cards at all. As for raising the price... nope.
> 
> Now we wait for the R9-Fury and R9-Nano and see what they might bring to the table. Sorry that the Fury X didn't bust 980Ti's balls, so now its up to the Fury and Nano to exceed our expectations.


Less performance, worse frame pacing, 2GB less VRAM than their competitor and 4GB less than their own mid-tier, less OC headroom than the competition, and late to the party. I would say about $600 would suit it better.


----------



## Master__Shake

Quote:


> Originally Posted by *Rei86*
> 
> As for me, like I already said. MONEY [email protected][email protected]!
> 
> *But I don't get why most of you guys are calling for this card to be cheaper. I get if it came in at the stupid 1k point that the Titan X is setting at. However this is priced right next to the GTX 980Ti. And from most of the performance graphs these cards are right about inline with the 980Ti in terms of performance. Sure maybe a one or five percent off, but close enough. So I don't see it as a reason for them to lower the price.
> *
> And again this is in AMD's lineup their Flagship. I would not in AMD's camp lower the price of these cards at all. As for raising the price... nope.
> 
> Now we wait for the R9-Fury and R9-Nano and see what they might bring to the table. Sorry that the Fury X didn't bust 980Ti's balls, so now its up to the Fury and Nano to exceed our expectations.


If you are going to buy a high-end GPU and don't have the cash for a Titan X, would you buy a card that performs worse than another for the same amount of money?

I wouldn't.

I'm sure I'm not alone.


----------



## maarten12100

Well, some people have woken up, I see, and this thread is no longer about the actual card. Whatever people want to say, they can spit it out: the Fury X is less of a disaster than the 290X was in its time, since it is not a year late versus the Titan X the way the 290X was versus the original Titan. Additionally, the power delta is not big; some reviewers noted that it peaks high, but during actual gaming it is not that high. It is also cooler and quieter, and will obviously drop in price soon or be aided by the cheaper air-cooled version. At $100 less for the same performance as the 980 Ti, the air-cooled variant may be good.

I'm quite sure they will match the 980 Ti once the kinks are gone.

Things to come from AMD that are interesting to me:
Carrizo
Fury Nano
Arctic Islands, 14nm, HBM2, and Zen.


----------



## Tivan

Quote:


> Originally Posted by *Master__Shake*
> 
> if you are going to buy a high end gpu and don't have the cash for a titan x, would you buy a card that performed less than another for the same amount of money?
> 
> i wouldn't.
> 
> i'm sure im not alone.


Looking at the purchasing decisions that led to Nvidia's market share over the last 1-2 years, I'd assume that people value features and presentation over performance.

Both are points AMD made back some ground on; let's hope it's enough.

Also, I'd assume the Fury X is going to get some nice driver optimizations eventually. c: (But that's just something that piques my curiosity, not really a selling feature.)


----------



## Master__Shake

Quote:


> Originally Posted by *Tivan*
> 
> *Looking at the purchasing decisions made that lead to Nvidia market share over the last 1-2 years, I'd assume that people value features and presentation over performance.*
> 
> Both points that AMD made back some ground in, lets hope it's enough.
> 
> Also I'd assume that the FuryX is going to get some nice driver optimizations eventually. c: (but that's just something that gets my curiosity, not really a selling feature)


huh?

explain.


----------



## sugalumps

Quote:


> Originally Posted by *rdr09*
> 
> better drivers, too, right?


Nope, which is why I didn't bring it up; both sides' drivers are shaky. Though it's pretty funny seeing you latch onto the one thing and keep driving it because there is no other defense here.


----------



## Rei86

Quote:


> Originally Posted by *rdr09*
> 
> better drivers, too, right?


I also don't get this whole driver crap about Nvidia. It's like out of nowhere everyone is singing about how bad Nvidia's drivers are.

Where the hell is this coming from?

I haven't had a hiccup with their drivers. Not only that, at least Nvidia gives two s... cents to release a driver when a well-respected game comes out and tries to optimize their products for it.
And again, I'm not playing a Super Nintendo; it's a PC, and I realize I'm gonna have "**** happens" pop up from time to time.


----------



## PontiacGTX

Quote:


> Originally Posted by *szeged*
> 
> Kepler is dead, nvidia put it down behind the woodshed. It's sad that they got away with it too, I bet it's going to happen to maxwell too.


Kinda bad, because Fermi wasn't crippled at Kepler's release; instead, Nvidia improved Kepler's drivers.


----------



## Tivan

Quote:


> Originally Posted by *Master__Shake*
> 
> huh?
> 
> explain.


The AMD 200 series was extremely competitive on price/performance.

edit: It was lacking in other departments. The 290 reference cooler wasn't a good presentation, for example.


----------



## Master__Shake

Quote:


> Originally Posted by *Tivan*
> 
> They AMD 200 series was extremely competitive on price/performance.


AMD cut the prices of the 200 series so people would buy them. The performance of the 980/970 is what made people look away from them.

I'm sure AMD loves cutting the price of flagships in half to get people to consider them... /s


----------



## Kinaesthetic

Quote:


> Originally Posted by *rdr09*
> 
> so you think its the game? lol
> 
> some are going all the way back to 347. hope they don't go as far back to 320.18.


Ironically for me, changing how I updated got the drivers performing perfectly fine, rather than constantly TDR'ing. Rock-solid stable. You know, regardless of fanboy crap, your trolling is really getting pathetic lately. I honestly hope you start getting some serious infractions for trolling. If someone has a problem, you don't piss on them; you try to help them. Cut the brand bullcrap out of the equation.

Your posting is a disgusting representation of the greater OCN community, and it is almost shameful that it is in such a publicly viewed thread.


----------



## Schmuckley

Eh... $100 more than a GTX 970 and has more horsepower; works for me.
Prices will probably drop some, too.


----------



## rt123

If AMD could have priced this at $549, it would have been good.
Unfortunately, the AIO, HBM, and the huge die mean that their BOM is probably too high.
They can't price it lower even if they want to. Nvidia really nailed them with the 980 Ti pricing.


----------



## Tivan

Quote:


> Originally Posted by *Master__Shake*
> 
> amd cut the prices of the 200 series so people would buy them. the performance of 980/970 is what made people look away from them.
> 
> im sure amd loves to cut the price of flagships in half to get people to consider them.../s


The 200 series was released way before the 980/970, and wasn't selling well before that, either.

Due to, as I said, presentation (such as an inefficient reference cooler) and features. The reliability of the drivers was doubted at times as well, which you might put under either presentation or features.


----------



## keikei

Retailers selling at $700 and sold out. Some gamers want Fiji no matter what.


----------



## Rei86

Quote:


> Originally Posted by *PontiacGTX*
> 
> kinda bad because fermi wasnt crippled at keplers release , instead nvidia improved kepler drivers


I thought they capped Fermi's ability to OC to something like 999 MHz after the GTX 680 launched?

And you know who we have to blame for these practices Nvidia is getting away with? Us, and time. We buy their bullcrap like the Titan X and whine like little babies that the GTX 980 should be the GTX 960, while we swallow their load of bullcrap by giving them our money. And again, what is the reasonable amount of time a tech company should support a product? With the fast pace of updates, changes, and new hardware, how long should they support a product? Yes, Kepler should still be supported, but for how much longer?
Quote:


> Originally Posted by *rdr09*
> 
> not sure
> 
> https://forums.geforce.com/default/topic/847580/geforce-drivers/official-nvidia-353-30-whql-game-ready-display-driver-feedback-thread-released-2015-06-22-/29/


The few are not the majority.

And like I said, they either do it right, or at least they try.


----------



## TheReciever

More importantly, the M390X is now a thing.


----------



## sugalumps

Quote:


> Originally Posted by *keikei*
> 
> Retailers selling at $700 and sold out. Some gamers want Fiji no matter what.


Well, people bought the 980 Ti for $1,000 (the Titan X); they will literally buy them at any price.


----------



## Kane2207

Quote:


> Originally Posted by *keikei*
> 
> Retailers selling at $700 and sold out. Some gamers want Fiji no matter what.


Not difficult to sell out of an allocated 5 units per retailer...

/s


----------



## ZealotKi11er

Quote:


> Originally Posted by *Tivan*
> 
> the 200 series was released way before 980/970, and wasn't selling well before that, either.
> 
> Due to as I said, presentation (such as an inefficient reference cooler) and features. The reliability of the driver was doubted at times as well, which you might put both into presentation or features.


Wait, what? AMD sold tons of 290s and 290Xs. Did you forget mining? AMD just did not have to stock like Nvidia does. After that, mining did huge damage to the brand and the card sales; people would not pick up an R9 290 for $200 in the months before the GTX 970 came out.


----------



## DMac84

Man, I'm disappointed. My 290X OC wasn't doing it for me anymore after upgrading to 3440x1440 (21:9).

I'm not sure; maybe it was all the hype, maybe it was the promise of HBM, I don't know... but I was really expecting this to trade blows with the Titan X here and there at a significantly lower price point. As I see it today, per the benchies, I would have been thrilled if this card were $499.99 USD, and it would have been acceptable at $549.99 USD.

Edit: This needed to come out ahead of the Titan X, say in February 2015, for the card to be successful and for AMD to grab back some of the market. It's unfortunate that it didn't happen, because we all need competition.

Back to the dark side for me!


----------



## Tivan

Quote:


> Originally Posted by *keikei*
> 
> Retailers selling at $700 and sold out. Some gamers want Fiji no matter what.


It's a _beautiful_ piece of hardware, selling a dream of vertically arranged semiconductors on top.

It's romance.
It's adventure!

It's not in my price range, and I'd maybe pick one up when the driver improvements kick in and there are 8GB versions, sometime in 1-2 years or so.

But yeah, if one has money to burn, it's definitely an attractive offering, to me. And I had issues getting perceived input lag under control on Nvidia, so I'll stay away from that camp, personally. Maybe once they become competitive on price/performance on the second-hand market (where I buy), I'd consider giving them another shot.

But yeah, I'm not trying to say the Fury X is a better card than the 980 Ti. It's clearly not. But you could buy it instead of a 980 Ti and your life wouldn't end, you know.


----------



## Maximization

So where is the 1920x1200 resolution? Or is that 1080 now?


----------



## Orthello

Quote:


> Originally Posted by *Rei86*
> 
> I thought they capped the ability of Fermi to OC to something like 999mhz or something after the GTX 680 launched?
> 
> And you know who we got to blame for these practices that nVidia is getting away with? Us and time. We buy there bullcrap like the Titan X, and whine like little babies that the GTX 980 should be the GTX 960. While we swallow their load of bullcrap by giving them our money. And again what is the reasonable amount of time that a tech company should support a product? With the fast pace of updates and changes and new hardware. How long should they support a product? Yes the Kepler should be still supported but for how much longer?
> The few is not the majority.
> 
> And like I said they either do it right, or at least they try.


Yep... I never forgot that handbrake: running 1070 MHz on GTX 580s one day under pre-300 drivers, then hard-locked to 1000 MHz the next day under 300+ drivers. Made me jump straight to 7970 Lightnings afterwards.


----------



## DSgamer64

Quote:


> Originally Posted by *BinaryDemon*
> 
> Speaking of dual GPU cards... is there going to be a rebranded upgraded 295x2? I don't see why they couldn't have a R9 395x2 16gb (8gb per GPU) available next week, since it would still be GDDR5 based. Or will all development focus go to the Fury X2?


I think it is safe to assume that their dual flagship card will be Fury X based, which is what I am hoping for. But who knows, maybe they will throw a dual 390X into the ring with double the memory and it might be good for work stations and surround monitor configurations.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DMac84*
> 
> Man, Im disappointed. My 290X OC wasn't doing it for me anymore after upgrading to 3440x1440 (21:9)
> 
> Im not sure, maybe it was all the hype, maybe it was the promise of HBM, I dont know... but I was really expecting this to trade blows with the Titan X here and there at a significantly less price point. As it see it today, per the benchies, I would have been thrilled if this card was $499.99 USD, and it would have been acceptable at $549.99 USD
> 
> Edit: This needed to come out ahead of the Titan X, Say in February 2015 for this card to be successful and AMD to grab back some of the market. Its unfortunate that it didnt happen, because we all need competition.
> 
> Back to the dark side for me!


You know there is a GTX 980 Ti?


----------



## p4inkill3r

??


----------



## sugalumps

Quote:


> Originally Posted by *p4inkill3r*
> 
> ??


Yes, today the card is a best seller because it's just out, so today it will be outselling other cards.


----------



## FallenFaux

Quote:


> Originally Posted by *Orthello*
> 
> Yep .. i never forgot that handbrake, running 1070mhz on GTX580s one day under pre 300 drivers then next day under 300+ drivers hard locked to 1000mhz .. made me jump straight to 7970 lightnings afterwards.


Wow, I didn't know that; my 480s couldn't get past 950 MHz anyway. That's some pretty shady stuff, though.


----------



## Rei86

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You know there is GTX 980 Ti?


And some people just want a Titan, because it's a Titan.


----------



## manolith

I see a lot of people whining about this GPU. I think it's an awesome card, especially for 4K.


----------



## Kane2207

Quote:


> Originally Posted by *p4inkill3r*
> 
> ??


Amazon just proved we're all mugs - that 8400 GS is obviously the new hotness...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Rei86*
> 
> And some people just wants a Titan, because its a Titan.


Because a Ti is slower now. Also, you should at least get a Titan when it comes out. Such a bad purchase now.


----------



## azanimefan

Quote:


> Originally Posted by *sugalumps*
> 
> The pump noise happens on most aio's, it's just most people dont here it because of the insanely bad fans(corsair fains) that come with them. As soon as I changed my fans out for noctuas and put them down low you could hear the pump. I went air cooling and will never go back unless it's a full custom loop.
> 
> Clc's/aio is such a niche product, you get worse noise to temps performance at a higher cost than a big air cooler.


I agree. If noise is an issue, air is the way to go. The pump noise in an AIO and the water/pump noise in a custom loop are far louder than the fans in an air-cooled system. I've tried recording the noise my system makes; it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if I went water.


----------



## iinversion

Quote:


> Originally Posted by *p4inkill3r*
> 
> ??


The real question is why is the 8400GS number 3 @ $34?


----------



## keikei

Quote:


> Originally Posted by *Tivan*
> 
> It's a _beautiful_ piece of hardware, selling a dream of vertically arranged semiconductors on top.
> 
> It's romance.
> It's adventure!
> 
> It's not in my pricerange and I'd maybe pick one up when the driver improvements kick in and there's 8GB versions, sometime in 1-2 years or something.
> 
> But yeah, if one got money to burn, it's definitely an attractive offering, to me. And I had issues getting perceived input lag under control on Nvidia so I'll stay away from that camp, personally. Maybe once they become competitive on price/performance on the second hand market (where I buy), I'd consider giving em another shot again.
> 
> But yeah, not trying to say that the FuryX is a better card than the 980ti. It's clearly not. But you could buy it instead of a 980ti and your life wouldn't end, you know.


I don't blame you, it is _nice_. If I had the cash I'd throw money at AMD already, but I tend to buy one below their top performer. My first 'high end' card was the 7970 and I've stuck with AMD since then, so the Pro might be my pick. I definitely want to see what unlocked voltage and newer drivers do to the performance. Gamers buy what they like; I won't bust their chops for buying it.


----------



## edo101

Good lord, most of y'all made up your minds already from first-day/launch drivers. lel ok. Might make things better for me then. If nobody buys it, prices go down and I buy it. A couple of months later, drivers get better and I'm gaming at the same performance you guys paid a premium for.


----------



## Blameless

Quote:


> Originally Posted by *azanimefan*
> 
> the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system.


Most of the custom loops I've had didn't have any audible pump noise. Unless the pump is defective/broken, poorly sized for the loop in question, or not isolated, it should be effectively silent.

I currently have an AIO not known for silence in my primary system and the pump at 100% is not as loud as the stock fans on a Noctua NH-D14.

I do have a first generation XSPC RX360 kit somewhere that has a pump which rattles and sounds almost exactly like a constantly seeking HDD, but those first gen H2O 750 pumps are decidedly mediocre and relatively fragile.


----------



## p4inkill3r

Quote:


> Originally Posted by *iinversion*
> 
> The real question is why is the 8400GS number 3 @ $34?


8 years and still going strong.


----------



## svenge

Quote:


> Originally Posted by *edo101*
> 
> Good lord, most of y'all made up your mind already from first day/launch drivers. lel ok. Might make things better for me then. If nobody buys it, prices go down and I buy it. couple of months later, drivers get better and I am gaming at the same performance you guys did at a more expensive price.


The question is can AMD do any kind of substantial price drop on the Fury-X in the near term, especially given its much higher BOM cost as compared to the 980 Ti?


----------



## FallenFaux

Quote:


> Originally Posted by *azanimefan*
> 
> I agree. If noise is an issue air is the way to go. the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system. i've tried recording the noise my system makes, it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if i went water.


I have had a bunch of different pumps, and the only time they're really audible is when there are air bubbles in the loop right after you first install it. The main reason I even bother with water cooling is that I can't hear my computer at all.


----------



## edo101

Quote:


> Originally Posted by *svenge*
> 
> The question is can AMD do any kind of substantial price drop on the Fury-X in the near term, especially given its much higher BOM cost as compared to the 980 Ti?


If your cards don't sell, what else are you gonna do?

Their R&D strategy is a bit ******ed, I mean (not gonna go into it), but I see a card that will definitely match the 980 Ti as the months go by. But judging by how upgrade-happy people are here (I guess I'm in college, so it makes no sense to me how people throw away money so quickly), I'd assume these cards won't sell as much as AMD wants, and they'll have to cut the price.

Unless somehow we get some mining craze scenario.


----------



## tpi2007

Quote:


> Originally Posted by *Alatar*
> 
> Not sure if someone else already noticed this however I find this a bit worrying:
> Quote:
> 
> 
> 
> driver modifications needed to properly manage 4GB of memory on a wider, but slower, memory bus are still being perfected. *AMD told me this week that the driver would have to be tuned "for each game".* This means that AMD needs to dedicate itself to this cause if it wants Fury X and the Fury family to have a nice, long, successful lifespan.
> 
> 
> 
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-
> 
> The memory management stuff they were talking about has to be done individually for each game?

I wonder how that will turn out. I bet they're pinning their hopes on people transitioning to Windows 10 really fast, so games start using DX12 sooner and that problem is lessened. It could explain why some games are performing worse than others compared to the 980 Ti.

There is one other thing I noted in the comments: I was wondering why the Fury X only managed a little over half the tessellation performance of GM200. Well, it seems that along with keeping the 64 ROPs from Hawaii, they kept the same amount of tessellation hardware you can find in Tonga... which has half the cores. I'm starting to wonder if some of these sacrifices have to do with maximum die size.
Quote:


> The answer as to why is just because it doesn't integrate any more tessellation hardware than Tonga did. Same with the ROP count.


----------



## Redeemer

Fury X shows excellent performance all around, on early drivers too. I'll be picking one up to replace my 780 Ti.


----------



## magicc8ball

Quote:


> Originally Posted by *azanimefan*
> 
> I agree. If noise is an issue air is the way to go. the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system. i've tried recording the noise my system makes, it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if i went water.


My custom loop is damn near silent and I would put it up against the Noctua NH-D14 or D15, and those come in around 20 dB. My pump is a Laing D5 inside an NZXT Switch 810. The only time the water makes a sound is at boot-up; once it's running I haven't gotten any unpleasant noises.


----------



## p4inkill3r

Quote:


> Originally Posted by *svenge*
> 
> The question is can AMD do any kind of substantial price drop on the Fury-X in the near term, especially given its much higher BOM cost as compared to the 980 Ti?


Near term, I'm willing to believe they can get MSRP and sell every one. Intermediate term, a $50 drop and/or a game bundle will more than likely happen.

The next release from Nvidia is Pascal, correct, aside from AIB 980 Tis?


----------



## poii

Quote:


> Originally Posted by *azanimefan*
> 
> I agree. If noise is an issue air is the way to go. the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system. i've tried recording the noise my system makes, it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if i went water.


I don't know if it was said already, but computerbase.de talked to AMD and asked them about the unusually loud high-pitched noise from the cooling solution. AMD said they know about the issue and it's much better in the cards given out to retailers.
So computerbase got one from an online shop and it was indeed much better.

I know it could've been luck, because a sample size of 1 isn't really a sample size, but I wanted to share.

http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/

That's the computerbase test btw, it isn't listed in the OP.

Adding pcgameshardware.de test, too.

http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/

And there is hardwareluxx.de
http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html


----------



## Dyius

Quote:


> Originally Posted by *CasualCat*
> 
> How did you manage that? $500 makes that a really, really competitive card.
> 
> Edit: At $500 I'd say it would make choosing a 980Ti pretty difficult (especially for the people who just run it stock) unless you needed that last bit of performance.


My order was cancelled. It was a price mistake that TD isn't honoring.


----------



## 12Cores

This is the best card AMD has ever made, period. Not too shabby for a company with limited resources fighting battles on every front in the technology space. For everyone complaining about the benches, just remember that without this card the Ti would not be $650; some competition is better than none.

Two questions:
Is the card really voltage locked, or are we just waiting for the Afterburner guys to release an update to that tool?
Are there any benchmarks on the Windows 10 technical preview?

Cheers


----------



## Sycksyde

Quote:


> Originally Posted by *Rei86*
> 
> I also don't get this whole driver crap coming about nVidia. Its like out o nowhere everyone is singing about how bad nVidia drivers are
> 
> Where the hell is this coming from?


Nvidia has ONE bad driver and the foamers come out of the woodwork to rant about how bad their drivers are. AMD has consistently awful drivers since forever.


----------



## sugalumps

Quote:


> Originally Posted by *12Cores*
> 
> This the best card AMD has ever made, period. Not too shabby for a company with limited resources fighting battles on every front in the technology space. For everyone that is complaining about the benches, just remember that with this card the ti would not be $650, some competition is better than none.
> 
> Two questions?
> Is the card really voltage locked or are we just waiting for the Afterburner guys to release an update to that tool?
> Are their any benchmarks on the windows 10 technical preview?
> 
> Cheers


You've got it the wrong way around: if it wasn't for the Ti (which came first), this wouldn't be $650. This is the highest they could price this card given the 980 Ti's price and performance.


----------



## Blameless

Quote:


> Originally Posted by *12Cores*
> 
> This the best card AMD has ever made, period.


Maybe in terms of absolute performance for a single GPU part.

Relative to the competition or value at the time, this is a mediocre release.
Quote:


> Originally Posted by *Sycksyde*
> 
> Nvidia has ONE bad driver and the foamers come out of the woodwork to rant about how bad their drivers are. AMD has consistently awful drivers since forever.


AMD drivers are not consistently awful and NVIDIA has had far more than a few shoddy drivers.


----------



## xundeadgenesisx

It's funny. After seeing what this thread has devolved into, I'm pretty sure a good portion of this forum doesn't care if AMD goes out of business, or actively wants it to.
Can't wait to see what this forum looks like when Intel and Nvidia are the only options.


----------



## Sashimi

People are still complaining about 4GB. I actually suspect that with HBM moving data back and forth so quickly, VRAM shortages wouldn't hit frame rates as hard as they do with GDDR5. Who needs massive internal storage when you have a lightning-fast connection and the cloud?

In fact, I also suspect the Fury X didn't gain pace from 1440p to 4K so much as Maxwell ran into bandwidth issues at those resolutions. Sure, Maxwell cards have more memory, but access to it is so restricted that it starts to affect frame rates.

If what I suspect is real, then moving beyond 4K to ultra-wide and surround setups, Maxwell and Fury X will both face challenges, albeit different in nature: Maxwell will be even more bandwidth-restricted, slowing it down further, while the Fury X will well and truly run into severe VRAM shortages. How these will affect overall frame rates is difficult to predict. I have no doubt HBM is the future of VRAM, but until manufacturers can get the balance between capacity and speed right, it's hardly an improvement over GDDR5.

Having said that, the Fury X is rather disappointing in the sense that at the most mainstream resolutions, 1440p and below, it still trails the GTX 980 Ti by a good margin. Maybe it will close in with better drivers, but that's another battle front altogether, and shamefully one where AMD is usually on the losing side as well.

AMD has stepped it up with the R9 Fury X from previous generations, but at the same price point as the GTX 980 Ti, the win sadly still goes to Nvidia. I use 1440p so it's a no-brainer, but even if I had a 4K display I would still pick the GTX 980 Ti over the Fury X for small incentives like quicker optimisation for new games and PhysX.

I hope AMD survives this generation until HBM matures enough to really put up a winning product. They've had a big head start developing this technology and should perfect it quicker than Nvidia can with their first Pascal HBM card. Video card prices have jumped up too much for my liking since Fermi, due to Nvidia lacking a real contender.
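
The bandwidth gap being described above works out roughly like this (a back-of-envelope sketch using the published launch specs; the helper function is purely illustrative):

```python
# Peak theoretical memory bandwidth, back-of-envelope:
# bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8 bits-per-byte
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

fury_x = peak_bandwidth_gbs(4096, 1.0)     # HBM1: 4096-bit bus, 1 Gbps/pin
gtx_980_ti = peak_bandwidth_gbs(384, 7.0)  # GDDR5: 384-bit bus, 7 Gbps/pin

print(f"Fury X: {fury_x:.0f} GB/s, 980 Ti: {gtx_980_ti:.0f} GB/s")
# -> Fury X: 512 GB/s, 980 Ti: 336 GB/s
```

So on paper the Fury X has roughly 50% more raw bandwidth, while the 980 Ti has 50% more capacity, which is exactly the trade-off being argued about here.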


----------



## Blackops_2

Quote:


> Originally Posted by *xundeadgenesisx*
> 
> Its funny. After seeing what this thread has devolved into I'm pretty sure a good portion of this forum doesn't care if/wants AMD to go out of business.
> Cant wait to see what this forum looks like when Intel and Nvidia are the only options.


Of course they're banking on a third option or someone buying out AMD. The real question is, in that situation, does the company that buys them out even care about the money to be made in the discrete GPU market? My guess would be no.


----------



## Alatar

Quote:


> Originally Posted by *Redeemer*
> 
> Fury X excellent performance all around early drivers too, be picking one up to replace my 780TI


Trading blows at 4K and falling behind everywhere else against a card that's priced the same but has 50% more memory isn't what I'd classify as "excellent performance all around".
Quote:


> Originally Posted by *12Cores*
> 
> This the best card AMD has ever made, period. Not too shabby for a company with limited resources fighting battles on every front in the technology space. For everyone that is complaining about the benches, just remember that with this card the ti would not be $650, some competition is better than none.


It's obviously the fastest GPU AMD has made, since it's their newest flagship, but it's clearly not even close to the best GPU AMD has released.

The 4870 was better; the 5870 and 7970 too. I think the 290X was better as well. I think Fury X is pretty much in the 6970 range of "meh".

And as for the argument about the Ti, you can make the same argument both ways. But honestly, everyone who knows anything about GPUs saw the 980 Ti coming at exactly the price it came out at. It happens every time with NV GPUs, competition or no competition: a slightly cut-down GPU, slightly after the full GPU, at a much better price/perf ratio.


----------



## 12Cores

Like I said earlier, people from both sides should be celebrating this card; competition is good for consumers, always has been and always will be.

Once again:

Is the card really voltage locked, or are we just waiting for the Afterburner guys to release an update to that tool?
Are there any benchmarks on the Windows 10 technical preview?


----------



## jarble

It looks like they made a solid card that hammers 4K even with only 4GB of memory. These little boards come very close to outpacing my tri-SLI setup with just one card, and that, my friends, is when it is time to buy three new cards. AMD, you would have had my money, but you missed HDMI 2.0, and therefore I will be grabbing a 980 Ti once the HOF/Lightning boards hit.

"An SFF PC alongside a good 4K UHD TV are staples of a next generation gamer's living room environment but the R9 Fury X's lack of an HDMI 2.0 port feels decidedly last-gen. Granted, there are some DisplayPort to HDMI 2.0 adapters in the pipeline but that won't cut it for a $649 card these days. This is a relic left over from previous architectures that should have been addressed." *hardwarecanucks.com*


----------



## azanimefan

Quote:


> Originally Posted by *Rei86*
> 
> I also don't get this whole driver crap coming about nVidia. Its like out o nowhere everyone is singing about how bad nVidia drivers are
> 
> Where the hell is this coming from?
> 
> I haven't had a hiccup with there drivers. Not only that at least nVidia gives two s...cents to release a driver when a well respected game comes out and tries to optimize their products for it.
> And again I'm not playing a Super Nintendo, its a PC and realized I'm gonna have "**** happens" pop up from time to time.


Then you've not updated your drivers in a month.

The last 3 WHQL drivers from Nvidia have been a disaster. I own an Nvidia card affected by the same problems that affect everyone else; it hits across the whole product line. So yes, Nvidia drivers have been absolute trash recently. The last good one was 347.88 (which is the one I'm still on, as the others are all messes).


----------



## azanimefan

Quote:


> Originally Posted by *Alatar*
> 
> Trading blows at 4K and falling behind everywhere else against a card that's priced the same but has 50% more memory isn't what I'd classify as "excellent performance all around".
> It's obviously the fastest GPU AMD has made since it's their newest flagship GPU but it's clearly not even close to the best GPU AMD has released.
> 
> 4870 was better, 5870, 7970 too, I think 290X was better as well. I think Fury X is pretty much in the 6970 range of "meh".
> 
> And as for the argument about the Ti, you can make the same argument both ways. But honestly everyone who knows anything about GPUs saw the 980Ti coming at exactly the price it came out at. Happens every time with NV GPUs, competition or no competition. Slightly cut down GPU slightly after the full GPU at a much better price/perf ratio.


I'd say the HD 7970/R9 280X was probably the last "epic" card they made. It's still a beastly sucker of a card, and thanks to some creative driver work by Nvidia, it's only a few percentage points behind a GTX 780 now in many new titles.

Had certain events not transpired, I'd probably still be rocking my R9 280X.


----------



## svenge

Quote:


> Originally Posted by *p4inkill3r*
> 
> Near term, I'm willing to believe they can get MSRP and sell every one. Intermediate term, a $50 drop and/or game bundle will happen more than likely.
> 
> The next release from nvidia is Pascal, correct, aside from AIB 980TIs?


There is still that remaining ~22.5% of AMD buyers in the marketplace, and there should be enough of them to buy whatever limited numbers of Fury-X cards AMD can produce in the near term. But given the additional costs (vs. Hawaii) of a 600mm² die, a 1000mm² interposer, and very new HBM VRAM, on top of a mandatory CLC cooler, I just don't see much "wiggle room" on pricing for this model.

The presumably air-cooled Fury (non-X) may have additional price flexibility due to the lack of a CLC cooler, but since the liquid-cooled Fury-X already falls just short of the 980 Ti, the non-X will likely be significantly lower-performing in order to keep its thermals in check. Its logical competitor would be the 980 (non-Ti), so depending on its performance it'll be interesting to see whether it has to skirt the ~$500 zone or not.

As for NVIDIA, since GM200 is already 600mm² (same as Fiji), I don't see any bigger Maxwell GPUs in the future. Now, if Pascal and/or HBM2 are significantly delayed, things may change. However, I could see some new low/mid-range Maxwell cards being introduced (perhaps a 950 and/or a "960 Ti") if circumstances warrant.
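
To put the die-cost point in perspective, here's the classic dies-per-wafer approximation (my own sketch; it ignores scribe lines and defect yield, and the die areas are the commonly cited figures):

```python
import math

# Classic dies-per-wafer approximation: usable wafer area over die area,
# minus a correction for partial dies lost at the wafer edge.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(600))  # ~600mm² Fiji/GM200-class die
print(dies_per_wafer(359))  # ~359mm² Tonga-class die, for comparison
```

A 600mm²-class die yields on the order of 90 candidates per 300mm wafer versus roughly 160 for a Tonga-sized die, before yield losses, and that's before adding the interposer, HBM stacks, and CLC cooler to the bill of materials.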


----------



## jarble

Quote:


> Originally Posted by *azanimefan*
> 
> then you've not updated your drivers in a month.
> 
> the last 3 WHQL drivers have been a disaster from nvidia. i own an nvidia affected by the problems that affect everyone else. it hits across the whole product line. So yes, nvidia drivers have been absolute trash recently, last good one was 347.88 (which is the one i'm still on, as the others all are messes).


I have not had any major problems with the last few drivers on either of my tri-SLI rigs. I'm not saying you didn't have problems, just that I've had exceptionally smooth sailing lately when it comes to Nvidia drivers.


----------



## Sashimi

The Fury X is a good, well-rounded card, balanced in almost every aspect, but at the same time not good enough in almost any of them... at least with the R9 290X they could match Nvidia's speed, with a crap tonne more power but a gazillion more memory. It well and truly held the market for high resolutions against Kepler. The Fury X doesn't win in any perceivable way: it trades blows with the GTX 980 Ti at 4K at best, and that's not even a clear win. I really fear for the future of the company. Everyone would agree AMD closing down is not good for the market.

Edit: Not the 290X, I obviously was thinking of the 7970 and 7990. Two generations already... time flies...


----------



## Rei86

Quote:


> Originally Posted by *azanimefan*
> 
> then you've not updated your drivers in a month.
> 
> the last 3 WHQL drivers have been a disaster from nvidia. i own an nvidia affected by the problems that affect everyone else. it hits across the whole product line. So yes, nvidia drivers have been absolute trash recently, last good one was 347.88 (which is the one i'm still on, as the others all are messes).


I'm on 353.06, before that 350.12.

My system is humming along just fine.


----------



## edo101

Quote:


> Originally Posted by *Blackops_2*
> 
> Of course they're banking on a third option or someone buying out AMD. The real question is in that situation does that company that buys them out even care about the money to be made in the discrete GPU market? My guess would be no.


Yeah. People here have been all about someone buying them out... as if the fact that we barely get any good ports isn't a clue that the discrete GPU market is a niche. No, I doubt Samsung wants to sink into a market that video game devs don't give a crap about. What they would do is use it for console work or, more likely, their mobile units.

If AMD sinks, we sink more money into buying GPUs.


----------



## Apokalipse

Quote:


> Originally Posted by *Alatar*
> 
> 4870 was better, 5870, 7970 too, I think 290X was better as well. I think Fury X is pretty much in the 6970 range of "meh".


It could be like the 3870: not particularly great on its own, but an important stepping stone from a technological standpoint towards their next GPU.
I think I'll stick with my 980 until the 14nm GPUs are out.


----------



## Ganf

Quote:


> Originally Posted by *azanimefan*
> 
> I agree. If noise is an issue air is the way to go. the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system. i've tried recording the noise my system makes, it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if i went water.


Man, what kind of watercooling gear have you been buying? Whoever you got it from I suggest sending it back to them with a nasty note and getting a full refund.

With a trebuchet, through the window.



I can hear my pump only when I've got the side panel off and my head close enough that I could lick it if I stretched my tongue. Likewise with the water, if I put a pipe against my waterblock and then my ear up to the pipe.

I mean really, what are you buying that it sounds like a garden sump? I've owned both cheapo and expensive watercooling gear, and I've never heard anything.


----------



## Chargeit

Quote:


> Originally Posted by *azanimefan*
> 
> then you've not updated your drivers in a month.
> 
> the last 3 WHQL drivers have been a disaster from nvidia. i own an nvidia affected by the *problems that affect everyone else. it hits across the whole product line.* So yes, nvidia drivers have been absolute trash recently, last good one was 347.88 (which is the one i'm still on, as the others all are messes).


It's been pretty smooth sailing on my 780. Can't remember the last time I had a driver or system crash. I guess there was that whole performance-issue thing, but outside of Witcher 3 I'm not sure it affected me that badly.

The last time I dealt with driver stability issues was around a year ago. My PSU was also crapping out on me at the time (damned TX850M), so no clue how that factored into it.

I haven't updated to the newest driver released a few days ago. Will get around to that sooner or later.


----------



## Kinaesthetic

Quote:


> Originally Posted by *rdr09*
> 
> and it is ok for you for szeged to say bad stuff about amd drivers?


If I have a bad experience, I say so, and try to solve it. I had bad experiences with AMD drivers on all of my previous cards and tried to solve them; in some cases I was successful, in some cases I wasn't. I don't have any bad experiences with my APU system (chipset drivers, that is) at all, and it does what it needs to do very well. I have had bad experiences with Nvidia drivers too (in fact, fairly recently), and I tried to solve those. Sometimes solving it works; sometimes it doesn't.

But I don't go around trolling around with knowledge that I haven't yet actually experienced in the first place. And my knowledge about drivers and saying that they were bad only extends to when I actually used them. From what I understand, most of the newer AMD drivers are quite stable. But I'm somewhat-broke and too lazy right now to even upgrade (after having spent money that was needed elsewhere in my setup, and hate having to tear down my loop).

On the other hand, you troll around without ever having actually experienced that which you are constantly writing about. And I cannot control what Szeged says. Although considering how much he spends, he probably has a better understanding about both sides of the fence recently than you or I do.

That is the difference.


----------



## Bartouille

Quote:


> Originally Posted by *azanimefan*
> 
> i'd say the hd 7970/r9-280x was probably their last "epic" card they made. that's still a beastly sucker of a card, and thanks to some creative driver manufacturing by nvidia only a few percentage points behind a gtx780 now in many new titles.
> 
> had certain events not transpired i'd probably still be rocking my r9-280x


The 7970 is a legendary card. I would much rather have a 7970 than a GTX 680 these days! It came out before the 680, has more VRAM, is better at high res, and has massive OC potential.

Heck, it even competes with the 780 now!! It's a shame it wasn't that good at launch though.

Hawaii wasn't that bad either, other than the fact that it took like 2 months to get proper aftermarket coolers.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Chargeit*
> 
> Been pretty smooth sailing on my 780. Can't remember the last time I had a driver or system crash. I guess there was that whole performance issue thing, but, outside of Witcher 3 I'm not sure it affected me that badly.
> 
> The last time I dealt with driver stability issues was over around a year ago. My PSU was also crapping out on me at the time (damned TX850M), so, no clue how that factored into it.
> 
> I have not updated to the newest released a few days ago. Will get around to that sooner or later.


This. I've been using 353.06 on my GTX 980 without a hitch or a crash since the day it came out. I know others have said they have had issues, but not me.


----------



## Oranuro

Great card in my opinion. What will make it even better is the inevitable price drop. BTW, 4K gaming is upon us.


----------



## sugalumps

Quote:


> Originally Posted by *47 Knucklehead*
> 
> This. I've been using 353.06 on my GTX 980 without a hitch or a crash since the day it came out. I know others have said they have had issues, but not me.


I have had two crashes in the witcher 3 with 40 hours played, but from what I understand it crashes on both sides.


----------



## Master__Shake

Quote:


> Originally Posted by *Ganf*
> 
> Man, what kind of watercooling gear have you been buying? Whoever you got it from I suggest sending it back to them with a nasty note and getting a full refund.
> 
> With a trebuchet, through the window.
> 
> 
> 
> I can hear my pump, when I've got the side panel off and my head close enough to it that I can lick it if I stretch my tongue. Likewise with the water if I put a pipe against my waterblock and then my ear up to the pipe.
> 
> I mean really, what're you buying that it sounds like a garden sump? I've owned the cheapos and the expensive watercooling gear, and I've never heard anything.


Agreed.

I've got an MCP355 that is quiet even when hard-mounted to the case, and all 3 of my CoolIT C240 units are quieter than the fans.

Also, a mobile trebuchet?

Not sure why, but I want one.


----------



## Kinaesthetic

Quote:


> Originally Posted by *azanimefan*
> 
> I agree. If noise is an issue air is the way to go. the pump noises in AIO and the water/pump noises in a custom loop are far louder then the fans in an air cooled system. i've tried recording the noise my system makes, it doesn't even register over ambient when benching/stressing. I wouldn't have anything close to that level of quiet if i went water.


The actual pump noise from pumps such as the D5 and the MCPXXX series is almost imperceptible to the ear (and I have really darn good hearing). The only thing those pumps produce is vibration, and if you can successfully decouple that vibration from your case, they are incredibly quiet. All you need is something like a Shoggy sandwich/foam underneath. It also helps if you have a very solidly built case.


----------



## Chargeit

Quote:


> Originally Posted by *sugalumps*
> 
> I have had two crashes in the witcher 3 with 40 hours played, but from what I understand it crashes on both sides.


Not sure of my Witcher 3 hours because I have the GOG version. I'm close to the end and have done a fair amount of side stuff. Easily 60-80 hours without a crash. Worth mentioning, I don't OC my 4790K; I do OC my RAM and my GPU.

My system has been solid. Feels good, especially after dealing with that PSU issue around a year ago. That was a nightmare.


----------



## rx7racer

All I can say is poor, poor AMD; I guess we will continue to wait through the summer for the fastest graphics card in the world.

Sometimes I want to run CF or SLI again, but with so many titles it gets broken on, or where you have to wait forever for profiles, I just don't know if I want to anymore.

Looking at all the various opinions, I have to admit I find it funny that some say the Fury X is still worth its price at launch. Don't get me wrong, in a month or two it might be, but as of now I can't help but wish AMD had made a cheap air cooler and let us watercool it ourselves, saving us $100, or heck, even knocked $50 off the price. With its performance I'd pay $549 and feel a lot better. As it stands, and as crazy as it sounds, my gut feels better paying $649 for a Ti and slapping my own waterblock on it.

A nice solid try by AMD, and it will help the industry as soon as NV gets its HBM2, but as usual AMD innovates only to fall short by being first to it, and as always its competitor will do it better.


----------



## BoredErica

Quote:


> Originally Posted by *rt123*
> 
> I mean, we can try.


Nah, let's not even try.


----------



## Ganf

Quote:


> Originally Posted by *Master__Shake*
> 
> agreed.
> 
> i've got an mcp355 that is quiet even when hard mounted to the case and all 3 of my coolit c240 units are quieter than the fans.


10 high-speed fans in my case, and they're quiet even under full load, because I'm not blowing heat in circles inside my case before dumping it outside. It's kind of nice that my idle temps are so low my fans cut off, with an OC'd 5930K in the loop. Good luck doing that with an NH-D14 when the processor pulls 200W+ under load, while maintaining 60°C.


----------



## youra6

Waiting for a statement from AMD claiming how performance will improve with new drivers.


----------



## rv8000

Quote:


> Originally Posted by *rx7racer*
> 
> All I can say is poor poor AMD, guess we will continue to wait through summer for the fastest graphics card in the world.
> 
> Sometimes I want to run CF or SLI again but so many titles it gets broke on or have to wait forever for their profiles I just don't know if I wanna anymore.
> 
> Looking at all the various opinions I have to admit I find it funny the ones saying Fury X is still worth it's price at launch, don't get me wrong in a month or two it might be but as of now I can't help but just wish AMD made a crappy cooler for air and let us WC it ourselves while saving us a $100 or heck even knock $50 off on the price. With it's performance I'd pay $549 and feel a lot better. As it stands and as crazy as it sounds I think my gut feels better paying $649 for a Ti and slapping my WB on it.
> 
> A nice solid try by AMD and it will help the industry as soon as NV gets their HBM2, but as usual AMD innovates only to fall short by being the first to it and as always it's competitor will do it better.


But then everyone would complain that Fury X gets too hot with air cooling. Everyone who doesn't want the card will find something to nitpick. No matter what AMD could have done at this point in the game, there would be some way to spin this card in a bad light.


----------



## rt123

Quote:


> Originally Posted by *youra6*
> 
> Waiting for a statement from AMD claiming how performance will improve with new drivers.


And then we can have another 30 page thread of fanboys ripping at each other.


----------



## coupe

*Looks at the first few pages, a bunch of Titan and 980 owners bashing the Fury X to justify their purchase*

*SMH*


----------



## p4inkill3r

Quote:


> Originally Posted by *youra6*
> 
> Waiting for a statement from AMD claiming how performance will improve with new drivers.


Would you not believe them if they did?


----------



## rx7racer

Quote:


> Originally Posted by *rv8000*
> 
> But then everyone would complain that Fury X gets too hot with air cooling. Everyone who doesn't want the card will find something to nitpick. No matter what AMD could have done at this point in the game, there would be some way to spin this card in a bad light.


Gah, yeah you're right, the 290X showed that, and even Fermi showed it for NV as well. Although to be fair, even if AMD had air-cooled it, the cooler would be huge and hang over the PCB, which would just make even more people complain.

I have read some mentioning the 2900XT heater from back in the day, saying that's why they need water; so even with a CLC they've shown it's a scorcher. I honestly do hope AMD can get some drivers out, because it's really not scaling well for the number of GCN cores it has vs. other offerings.

I guess what I don't get is why improve efficiency if you can't really turn that into utilized numbers somewhere. I feel like they neutered it somehow just to show they made their arch more efficient.

4096 cores and this is all we get. I just.... I don't know, just not digging it.


----------



## FlyingSolo

Guess I'll just wait for the HBM2 cards now. That's if the 3.5GB card I have now lasts that long.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *coupe*
> 
> *Looks at the first few pages, a bunch of Titan and 980 owners bashing the Fury X to justify their purchase*
> 
> *SMH*


You're kidding, right? What would they be justifying exactly? It isn't faster than a Titan X or a 980 Ti; thank the Lord it's faster than a 980. This card is a huge disappointment for everyone.


----------



## szeged

The original argument was that AMD didn't release drivers for months to fix issues plaguing games, which affects all AMD users since they all have to wait for the drivers.

The counter argument was that Nvidia puts out more frequent drivers, but some people (sometimes lots of people, though not every Nvidia user) had a bad experience with them.

I posted that I never had any trouble with any of the new drivers.

Me not having problems installing drivers makes me a huge fanboy?

I had problems installing the Omega drivers for my CrossFire 290Xs, know why? Because they took forever to release.

AMD took forever with their drivers and then tried to sugarcoat it with a fancy name, and it still ended up in a 100+ page thread of people complaining about issues.

Nvidia updates drivers frequently and gets pages of complaints about issues.

Both driver teams get complaints about issues.
Both driver teams need to stop screwing up.
AMD needs to not take forever with drivers that still have problems.
Nvidia needs to slow down and stop churning out drivers as fast as possible while letting game-breaking bugs slip through.

I don't know how you don't understand that.


----------



## Blameless

Regarding drivers...anyone who hasn't encountered more than their fair share of issues with both brands probably hasn't had enough video cards over a long enough period of time, only plays AAA titles, or has been extremely fortunate.
Quote:


> Originally Posted by *Sashimi*
> 
> People are still complaining about 4GB. I actually suspect with HBM allowing data to move back and forth so quickly it wouldn't hit frames as hard on VRAM shortages as DDR5. Who needs massive internal storage when you have lightning fast connection and cloud?


VRAM shortages will hit the Fury just as hard, if not proportionally harder, when and if they happen, because PCI-E 3.0 is proportionally slower compared to Fury's memory bandwidth than with any other card out there.

The connection to the "cloud" isn't any faster on the Fury. You run out of VRAM, you need to get assets from main system memory, which is bottlenecked by PCI-E 3.0's ~16GB/s, at best.

To correct your analogy: VRAM is local storage, your connection is PCI-E, and the cloud is main system memory.
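To put rough numbers on that, here's a back-of-the-envelope sketch using the commonly quoted peak figures (512 GB/s for Fury X's HBM, ~16 GB/s for PCI-E 3.0 x16), not measured throughput:

```python
# Back-of-the-envelope: time to move a 1 GB batch of assets over each
# link, using commonly quoted peak figures (not measured throughput).
HBM_BW_GBPS = 512.0    # Fury X aggregate HBM bandwidth, GB/s
PCIE3_X16_GBPS = 16.0  # PCI-E 3.0 x16 peak, GB/s

asset_gb = 1.0
t_hbm_ms = asset_gb / HBM_BW_GBPS * 1000      # ~2 ms if resident in VRAM
t_pcie_ms = asset_gb / PCIE3_X16_GBPS * 1000  # ~62.5 ms from system RAM

# A 60 fps frame budget is ~16.7 ms, so one such fetch over PCI-E
# costs several whole frames no matter how fast the HBM is.
frame_budget_ms = 1000.0 / 60.0
print(t_pcie_ms / frame_budget_ms)  # ~3.75 frame budgets
```

So once you spill out of VRAM, the transfer cost is set by the PCI-E link, which is identical for Fury X and any GDDR5 card.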
Quote:


> Originally Posted by *Sashimi*
> 
> If what I suspect is real, moving beyond 4k onto extra wide and surround setups, Maxwell and Fury X will both face challenge albeit different in nature. Maxwell will be even more restricted in bandwidth that slows down speed further, on the other hand, Fury X will in fact well and truly run into severe VRAM shortages. How these will affect overall frame is difficult to predict. I have no doubt HBM is the future of VRAM memory, but until manufacturers can get the balance between capacity vs speed correct, it's hardly an improvement from DDR5.


Yes.
Quote:


> Originally Posted by *p4inkill3r*
> 
> Would you not believe them if they did?


I'd believe them, but I also know what the disclaimers "up to" and "in certain scenarios" mean, and would not necessarily expect much.


----------



## szeged

Amd did a knockout job making the 7970 and 290x get crazy gains with drivers alone so I don't doubt their ability to squeeze performance out of Fiji with updated drivers.


----------



## rx7racer

Quote:


> Originally Posted by *szeged*
> 
> The original argument was that AMD didn't release drivers for months to fix issues plaguing games, which affects all AMD users since they all have to wait for the drivers.
> 
> The counter argument was that Nvidia puts out more frequent drivers, but some people (sometimes lots of people, though not every Nvidia user) had a bad experience with them.
> 
> I posted that I never had any trouble with any of the new drivers.
> 
> Me not having problems installing drivers makes me a huge fanboy?
> 
> I had problems installing the Omega drivers for my CrossFire 290Xs, know why? Because they took forever to release.
> 
> AMD took forever with their drivers and then tried to sugarcoat it with a fancy name, and it still ended up in a 100+ page thread of people complaining about issues.
> 
> Nvidia updates drivers frequently and gets pages of complaints about issues.
> 
> Both driver teams get complaints about issues.
> Both driver teams need to stop screwing up.
> AMD needs to not take forever with drivers that still have problems.
> Nvidia needs to slow down and stop churning out drivers as fast as possible while letting game-breaking bugs slip through.
> 
> I don't know how you don't understand that.


Truth. Both driver teams suck at times; always have, always will. Sometimes you can get Omega or Xtreme-G drivers that have been tweaked a bit to help, but in the end it's no easy task for AMD or NV to produce 100% awesome drivers exactly when we the consumers want them.

I find it odd when anyone says one side or the other has better drivers, because historically they both suck. I mean, let's be honest about it.


----------



## Sashimi

Quote:


> Originally Posted by *p4inkill3r*
> 
> Would you not believe them if they did?


Sure I do, they eventually will. They don't usually specify the time frame though.


----------



## Kosai

So guys, what is currently the absolute fastest variant of the 980 ti, out of the box?


----------



## Sashimi

Quote:


> Originally Posted by *Blameless*
> 
> Regarding drivers...anyone who hasn't encountered more than their fair share of issues with both brands probably hasn't had enough video cards over a long enough period of time, only plays AAA titles, or has been extremely fortunate.
> VRAM shortages will hit the Fury just as hard, if not proportionally harder, when and if they happen, because PCI-E 3.0 is proportionally slower compared to Fury's memory bandwidth than with any other card out there.
> 
> The connection to the "cloud" isn't any faster on the Fury. You run out of VRAM, you need to get assets from main system memory, which is bottlenecked by PCI-E 3.0's ~16GB/s, at best.
> 
> To correct your analogy: VRAM is local storage, your connection is PCI-E, and the cloud is main system memory.


I stand corrected


----------



## hteng

It's barely beating the 980 Ti in some titles. Man, even with new technology AMD still can't beat Nvidia; no wonder they didn't put any comparison slides in their conference. Disappointed.


----------



## sugalumps

Quote:


> Originally Posted by *Kosai*
> 
> So guys, what is currently the absolute fastest variant of the 980 ti, out of the box?


Seems like the Zotac AMP Extreme atm; it boosts to 1355 MHz out of the box.


----------



## p4inkill3r

Quote:


> Originally Posted by *Kosai*
> 
> So guys, what is currently the absolute fastest variant of the 980 ti, out of the box?


Maybe you're in the wrong thread?

http://www.overclock.net/t/1558203/various-nvidia-gtx-980ti-reviews/0_100


----------



## Clocknut

Quote:


> Originally Posted by *szeged*
> 
> Amd did a knockout job making the 7970 and 290x get crazy gains with drivers alone so I don't doubt their ability to squeeze performance out of Fiji with updated drivers.


I think they need to get CPU overhead down to Nvidia's level and add the features that are still missing (dynamic vsync, FPS cap, more VSR support), together with slightly more frequent driver updates.

These are the only problems that IMO keep AMD's drivers inferior to Nvidia's.


----------



## BoredErica

What about long term driver support? Isn't the 290x basically at the 780ti level with the 780ti now trailing the 980? Seems the more time passes, the stronger AMD cards are relative to any older Nvidia card.

Of course, that's not to say it's relevant if you just upgrade cards when the next generation comes out.


----------



## sugalumps

Quote:


> Originally Posted by *Darkwizzie*
> 
> What about long term driver support? Isn't the 290x basically at the 780ti level with the 780ti now trailing the 980? Seems the more time passes, the stronger AMD cards are relative to any older Nvidia card.
> 
> Of course, that's not to say it's relevant if you just upgrade cards when the next generation comes out.


Did that happen with previous gens? There is no guarantee that will happen again, especially after the backlash they have gotten for it.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Sashimi*
> 
> I stand corrected


And to add onto what Blameless said. HBM mostly benefits the actual communication between local on-card memory and the GPU core. Since it is physically closer, along with a significantly larger bus width, latency between the core and the local memory system is significantly reduced.

But as with all memory systems, the chain is only as good as its weakest link. In this case, the link that actually populates the local memory on the card is 32 times *slower* than the memory's own bandwidth. So when the assets that need to be processed and drawn exceed the rate at which local memory can be populated, you are out of luck.

That is why HBM makes *a difference*, but not the *massive difference* everyone was expecting. And unfortunately, a lot of people who know jack all about how computer systems fundamentally work (or basic common sense) spread a ton of FUD about it.
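For what it's worth, that "32 times" figure drops straight out of the commonly quoted peaks; a sketch, assuming 512 GB/s for the HBM and 16 GB/s for a PCI-E 3.0 x16 link:

```python
# The "32x" figure follows directly from the commonly quoted peak
# rates (GB/s); illustrative only, not measured numbers.
hbm_bw = 512.0   # on-card HBM bandwidth
pcie_bw = 16.0   # PCI-E 3.0 x16, the link that refills VRAM

ratio = hbm_bw / pcie_bw
print(ratio)  # 32.0 -- VRAM can be drained 32x faster than it refills

# Worst case: repopulating the whole 4 GB pool from system memory
refill_s = 4.0 / pcie_bw
print(refill_s)  # 0.25 s, no matter how fast the HBM itself is
```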


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> Did that happen with previous gens?


Yes it sure did.


----------



## Ganf

Quote:


> Originally Posted by *Kosai*
> 
> So guys, what is currently the absolute fastest variant of the 980 ti, out of the box?


Well, it's kind of an awkward shape, so I doubt an air cannon would work very well without making things complicated. That leaves good old gravity. Based on some napkin calculations, I suspect any 980 Ti dropped from a height of 40 feet or more would approach terminal velocity and, just before impact, be the fastest 980 Ti on the market.

This method is more predictable than playing the silicon lottery. Highly recommended for anyone looking to set records, but also expensive. The quality of the card is irrelevant to its speed in this context, so I would go for the cheapest, plainest reference model you can find.

EVGA should work.


----------



## Pawelr98

So the VRMs reach 104°C?

I wonder if throttling takes place because of this. From what I see, only half of the VRMs are cooled by that copper pipe; the ones on the reverse side remain uncooled.
Maybe with additional cooling (heatsinks on the reverse-side VRMs) it would work better?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *iLeakStuff*
> 
> This graph takes the cake when we have seen the real results today. Not as bad as bulldozer hype/marketing but pretty bad in itself.
> There I was thinking HBM would be the holy grail of 4K gaming and lead to severe Nvidia beating after watching that graph.
> In reality Fury X maybe beat 980Ti in 1 of those games. The rest they were equal or 980Ti beating Fury X.
> Can`t say I noticed much about the HBM magic over GDDR5 to be honest
> 
> The hell AMD?


While I agree with you that the performance of Fury X is far short of what I hoped for, I have no idea what you are talking about in terms of the graph. I don't see a single game that Fury loses, or even ties, against the 980 Ti here?









Edit: sorry, I get what you are saying now. This was the marketing graph AMD released prior to the NDA drop...


----------



## szeged

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> While I agree with you that the performance of Fury X is far short of what I hoped for, I have no idea what you are talking about in terms of the graph. I don't see a single game that Fury loses, or even ties, against the 980 Ti here?


That is the amd official benchmark guide.

They probably messed with settings to tip the scales in their favor.


----------



## sugalumps

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> While I agree with you that the performance of Fury X is far short of what I hoped for, I have no idea what you are talking about in terms of the graph. I don't see a single game that Fury loses, or even ties, against the 980 Ti here?


That's the point: it's AMD's pre-launch graph claiming the Fury wins every time, when the real reviews are far from that.


----------



## tajoh111

Quote:


> Originally Posted by *mav451*
> 
> If we can whittle it down to costs, what would it have cost AMD to make Fury X with 96 or 128 ROPs?


AMD couldn't make this a 96 or 128 ROP design, the GPU is basically as big as the interposer will allow.
Quote:


> Originally Posted by *coupe*
> 
> *Looks at the first few pages, a bunch of Titan and 980 owners bashing the Fury X to justify their purchase*
> 
> *SMH*


In some ways, AMD and AMD fans brought this onto themselves.

AMD hyped this card as the second coming. You don't say "fastest card," "gaming revolution," and "overclocker's dream" lightly. Compared to the GM204 or GM200 launches, Nvidia was rather modest about the competition. In addition, I suspect AMD had a role in releasing some of the timely rumors to slow Maxwell sales. What makes me believe this is that some people closer to or in the know were hinting at crazy performance, which makes me think fake internal leaks were floated to give a lot of AMD buyers hope and to do some marketing without actually spending any money.

Similarly, AMD fans have been downplaying Maxwell: it doesn't beat last gen by that much, just wait until Fiji launches, Nvidia is so cocky for calling it the most advanced GPU ever, your tears will be sweet when Fiji decimates your Titan X, 10% over Titan X is conservative, I'm expecting 20%, this card is going to be faster than the 295X2, etc. These fanboys thought matching the Titan X was the conservative outcome. So when the Fury X loses not only to the Titan X but often to the GTX 980 Ti, that house of cards comes tumbling down.

Considering most of the public owns Nvidia cards, and these kinds of things don't go forgotten, there was going to be a huge reaction thread one way or another. This AMD backlash thread is the result.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Pawelr98*
> 
> So vrm's reach 104°C ?
> 
> I wonder if throttling takes place because of this. From what I see only half of the vrm's are cooled by that cooper pipe. The ones on reverse side remain uncooled.
> Maybe with additional cooling(radiators on reverse side VRM's) it would work better ?


They're rated up to 130°C for 24/7 use, if I remember correctly, so 104°C is fine. The only worrying part is that they're already running so hot that once overvolting season opens for the Fury X, they might heat up past their rated temps.


----------



## hamzta09

Quote:


> Originally Posted by *sugalumps*
> 
> That's the point, it's amds pre launch graph saying that the fury wins every time. When the real reviews are far from that.


It's convenient how AMD wins by 0.5 fps in that test... twice!


----------



## Blameless

Quote:


> Originally Posted by *Pawelr98*
> 
> So vrm's reach 104°C ?
> 
> I wonder if throttling takes place because of this. From what I see only half of the vrm's are cooled by that cooper pipe. The ones on reverse side remain uncooled.
> Maybe with additional cooling(radiators on reverse side VRM's) it would work better ?


104C is still about 20-25C away from throttle territory for most VRMs, though 104C at stock clocks (even in dedicated stress tests) is not promising and VRM temps may become a limiting factor when real OCing starts to happen.

Anyway, the VRM components on the back of the card are dumping most of their heat into the PCB, much of which should be carried away by the cooling plate and copper water tubing.

Putting a decent thermal pad between the back of the PCB and the backplate should still help temps to a modest degree, however.


----------



## BigMack70

Quote:


> Originally Posted by *Alatar*
> 
> Trading blows at 4K and falling behind everywhere else against a card that's priced the same but has 50% more memory isn't what I'd classify as "excellent performance all around".
> It's obviously the fastest GPU AMD has made since it's their newest flagship GPU but it's clearly not even close to the best GPU AMD has released.
> 
> 4870 was better, 5870, 7970 too, I think 290X was better as well. I think Fury X is pretty much in the 6970 range of "meh".
> 
> And as for the argument about the Ti, you can make the same argument both ways. But honestly everyone who knows anything about GPUs saw the 980Ti coming at exactly the price it came out at. Happens every time with NV GPUs, competition or no competition. Slightly cut down GPU slightly after the full GPU at a much better price/perf ratio.


I'm starting to worry that the 7970 is the last great card we'll see from AMD


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Blameless*
> 
> Regarding drivers...anyone who hasn't encountered more than their fair share of issues with both brands probably hasn't had enough video cards over a long enough period of time, only plays AAA titles, or has been extremely fortunate.
> *VRAM shortages will hit the Fury just as hard, if not proportionally harder, when and if they happen, because PCI-E 3.0 is proportionally slower compared to Fury's memory bandwidth than with any other card out there.*
> 
> The connection to the "cloud" isn't any faster on the Fury. You run out of VRAM, you need to get assets from main system memory, which is bottlenecked by PCI-E 3.0's ~16GB/s, at best.
> 
> To correct your analogy: VRAM is local storage, your connection is PCI-E, and the cloud is main system memory.
> Yes.
> I'd believe them, but I also know what the disclaimers "up to" and "in certain scenarios" mean, and would not necessarily expect much.


To be fair, while the card's overall performance has been lackluster at best, I have yet to see any indication whatsoever that Fury is being limited by its 4GB frame buffer. You keep saying it's going to have to go to RAM for assets, but that just doesn't appear to be happening in the tests I have seen (including 4K and 5K at ultra settings). Please point out where we can categorically see Fury running out of memory...


----------



## Papermilk

Someone has to at least buy these things; 980s running around everywhere will get boring.


----------



## PontiacGTX

It's interesting that no one noticed this is GCN 1.3, so I guess it has DX12 feature level 12_1.

The arch changes are here: http://www.pcper.com/reviews/Graphics-Cards/AMD-Exposes-Fiji-World-HBM-Enthusiast/Fiji-GPU


----------



## Sashimi

Quote:


> Originally Posted by *Kinaesthetic*
> 
> And to add onto what Blameless said. HBM mostly benefits the actual communication between local on-card memory and the GPU core. Since it is physically closer, along with a significantly larger bus width, latency between the core and the local memory system is significantly reduced.
> 
> But as with all memory systems, your chain is only as good as the weakest link. Which in this case, the thing that is actually populating the local memory on the card is 32 times *slower* than the bandwidth of the memory itself. So in the cases where the assets required to be processed and drawn to exceed the rate at which local memory is populated, then you are out of luck.
> 
> That is why you are seeing HBM make *a difference*, but not a *massive difference* like everyone was thinking. And unfortunately, there were a lot of people who just basically know jack all at how computer systems fundamentally work (and some common sense) that spread a whole ton of FUD all around about it.


TBH I never thought memory population could ever be slower than the bandwidth; with SSDs and flash drives it has always been the interface that limits the flow of data, not the memory itself. Back to VRAM: in real-world gaming, when you run out of VRAM the most prominent effect is stutter, as the GPU has to wait for old cached data to be replaced by new data before proceeding with calculations. Could HBM speed up this process and reduce stutters to the point where a large VRAM pool isn't needed for a smooth experience?


----------



## raghu78

As I said earlier, the Fury X has some serious issues scaling performance over the R9 390X because the front end has not scaled as much as the shaders. Driver issues also seem to be hurting Fury X, and that's even more damning: to spend so much effort designing a flagship card with HBM and then fall short due to drivers is pathetic.

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Overclocking-Pricing-and-

"*In the case of GTA V, one of the latest and most popular PC games with a heavy modding community, the GTX 980 Ti was 15-33% faster depending on the resolution in question. That is a hard performance gap to write off. (UPDATE: A couple of people have guessed that the GTA V performance deficit might be related to driver immaturity. That's definitely possible but still concerning considering GTA V is such a big PC game currently.) "*

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html

You can clearly see that the GTX 980 Ti scales better from a GTX 980, especially at lower resolutions like 1440p, because everything is scaled uniformly: GPCs, tessellation engines, shaders, ROPs, bandwidth. At 1440p the GTX 980 Ti is 25% faster on average than the GTX 980 for roughly 40% more shaders, *even though the GTX 980 runs at higher boost clocks*. Fury X is 20% faster than the R9 390X for 45% more shaders, *at identical core clocks and with much more bandwidth thanks to color compression combined with HBM*. At 4K, Fury X does slightly better. But the fundamental issues seem to be front-end scaling and drivers. It cannot get any worse for AMD. They are going to go through hell for the next 12 months.
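That scaling comparison reduces to a quick calculation. The shader counts below are the public specs; the speedup percentages are the rough 1440p averages quoted above, so treat this as a sketch of the argument, not a benchmark:

```python
# Shader counts from public specs; the 1440p speedups are the rough
# averages cited above (illustrative, not a benchmark).
def scaling_efficiency(extra_shaders_pct, speedup_pct):
    """Fraction of the extra shader power that shows up as fps."""
    return speedup_pct / extra_shaders_pct

gtx980, gtx980ti = 2048, 2816   # +37.5% shaders, ~25% faster
r9_390x, fury_x = 2816, 4096    # +45.5% shaders, ~20% faster

nv = scaling_efficiency((gtx980ti / gtx980 - 1) * 100, 25)
amd = scaling_efficiency((fury_x / r9_390x - 1) * 100, 20)
print(round(nv, 2), round(amd, 2))  # 0.67 vs 0.44
```

By this crude measure, roughly two thirds of GM200's extra shaders turn into frames versus well under half of Fiji's, which is the front-end/driver bottleneck in a nutshell.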

I see Nvidia getting to >80% market share. If Nvidia gets really aggressive they could launch a GTX 980 refresh with higher core/memory clocks and a GTX 970 refresh (with ROPs and L2 intact). That would effectively kill AMD's entire product lineup. I suspect AMD's bleeding is going to get worse. The worst part is AMD was confident about winning back market share with this product stack. What a joke. lmao


----------



## zipper17

Do you remember AMD released the "Never Settle" drivers that boosted the HD 7000 series' performance significantly?

Maybe this Fury X is slower just because the drivers haven't matured yet?


----------



## Blameless

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> To be fair, while performance of the card overall has been lackluster at best, I have yet to see any indication whatsoever that Fury is being limited by its 4GB frame buffer. You keep saying that its going to have to go to RAM for assets but that just doesn't appear to be happening in the tests I have seen (including 4k and 5k at ultra settings). Please point out where we can categorically see Fury running out of memory...


I haven't said Fury is running out of memory (though I'm sure someone could produce a specific scenario where a VRAM limitation occurs if they really tried); I'm saying that when it does run out of VRAM, it's going to be hit just as hard as any other part that runs out of VRAM.


----------



## svenge

Quote:


> Originally Posted by *zipper17*
> 
> Do you remember AMD released the "Never Settle" drivers that boosted the HD 7000 series' performance significantly?
> 
> Maybe this Fury X is slower just because the drivers haven't matured yet?


GCN was a new architecture back in the HD 7000-series era. Except for some relatively minor iterative improvements, the Fury-X is very much like its GCN 1.x predecessors in terms of how its graphics core works (besides the sheer number of shaders).

Of course if the nature of HBM leads to some novel tweaks that I can't think of, then there _might_ be something to it. But it's certainly not the kind of change that Kepler -> Maxwell or VLIW4 -> GCN brought.


----------



## Alatar

Quote:


> Originally Posted by *BigMack70*
> 
> I'm starting to worry that the 7970 is the last great card we'll see from AMD


Honestly pretty much everything depends on Zen at this point. AMD needs to start pulling those great margins from the server markets.

Even GPGPU and HPC with GPUs isn't much of a possibility for AMD, because while they have some pretty nice GPU hardware, they lack a lot of the software, interconnect hardware, and experience needed for the really big HPC deals that Nvidia keeps winning. Even if AMD offered a better HPC GPU, would you take the plain GPU from AMD, or choose Nvidia, who comes to you with IBM and Mellanox partnerships and 'solid' software?

Finfets are really going to be the do or die time for AMD. We're looking at a reset of the GPU market after 5 years on the same node as well as the debut of the CPU architecture that AMD has been putting most of its money into for a while now.

If they manage to pull off both of those things well enough that their margins improve drastically and they aren't completely at the mercy of Intel and Nvidia when it comes to setting prices, then they'll be more than fine.

This Fiji / GM200 generation is just a stopgap.


----------



## PontiacGTX

Quote:


> Originally Posted by *svenge*
> 
> GCN was a new architecture back in the HD 7000-series era. Except for some relatively minor iterative improvements, the Fury-X is very much like its GCN 1.x predecessors in terms of how its graphics core works (besides the sheer number of shaders).
> 
> Of course if the nature of HBM leads to some novel tweaks that I can't think of, then there _might_ be something to it. But it's certainly not the kind of change that Kepler -> Maxwell or VLIW4 -> GCN brought.


GCN 1.3 is new


----------



## svenge

Quote:


> Originally Posted by *PontiacGTX*
> 
> GCN 1.3 is new


It depends on what the meaning of "new" is. AMD's definition of "new" has deteriorated lately, especially with their "new" 300-series cards and their "new" chips' code names (Hawaii / Grenada, etc.).

But Fury X has the same number of ROPs and tessellation resources as its much weaker GCN 1.2 cousin (Tonga); the only significant changes are a greatly increased shader count (a concept that's certainly not new) and HBM memory (which is truly new).

If it weren't for the fact that HBM is a new technology and may have performance implications (good or bad) that haven't been fully discovered yet, I'd say that it's another one of AMD's traditional "too little, too late" product launches that fails to dethrone King Maxwell.


----------



## Rei86

Quote:


> Originally Posted by *zipper17*
> 
> Do you remember AMD released the "Never Settle" drivers that boosted the HD 7000 series' performance significantly?
> 
> Maybe this Fury X is slower just because the drivers haven't matured yet?


But that's the issue with AMD.

Watching Nvidia from the GTX 680 through the GTX 980 Ti, each release has been competitive with, or has bested, the previous cards and the competition.

AMD on the other hand feels half-assed at everything. And yeah, I remember the Never Settle drivers that came out around the time of the GHz Edition 7970s; that's also when frame pacing became a huge topic, something AMD denied until they finally sucked it up and worked on it.

It's still a shame that their CrossFire support is woeful, and the dragging of feet on everything else is just shameful.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Blameless*
> 
> I haven't said Fury is running out of memory *(though I'm sure someone could produce a specific scenario where a VRAM limitation occurs if they really tried)*; I'm saying that when it does run out of VRAM, it's going to be hit just as hard as any other part that runs out of VRAM.


That's been my question all along, though: WILL Fury run out, or does it run flawlessly at a resolution and settings where an equivalent GDDR5 card DOES run out, thus proving that HBM has an advantage capacity-for-capacity? Nobody has shown me where it IS running out yet, so the answer to that question remains speculative.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's been my question all along though is WILL Fury run out or does it run flawlessly at a resolution and settings that an equivalent GDDR5 card DOES run out, thus proving that HBM does have an advantage, capacity-to-capacity? Nobody has showed me where it IS running out yet so the answer to that question remains speculative.


I hate to be the prick that says this, but the exact answer to your question is very much common sense. Given the same rate of populating data, that rate will be the limiting factor, no matter how fast the card's local memory is. You cannot magically change rock-hard numbers like that.


----------



## svenge

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's been my question all along though is WILL Fury run out or does it run flawlessly at a resolution and settings that an equivalent GDDR5 card DOES run out, thus proving that HBM does have an advantage, capacity-to-capacity? Nobody has showed me where it IS running out yet so the answer to that question remains speculative.


But isn't HBM supposed to be like the TARDIS or the stable in Narnia, in that "its inside is bigger than its outside"?









I could've sworn I heard claims from _certain quarters_ that 4GB of HBM holds more than 6GB of GDDR5...


----------



## rt123

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I hate to be the prick that says this, but the exact answer to your question is very much common sense. When given the same rate of populating data. *That rate will be the limiting factor*, no matter how fast the local memory on the card is. You cannot magically change rock hard numbers like that.


I think you mean capacity: how much data the VRAM can hold. Rate technically means the speed at which you fill up the VRAM.

HBM lacks capacity; its rate is superior to GDDR5's in any scenario.


----------



## raghu78

Quote:


> Originally Posted by *svenge*
> 
> GCN was a new architecture back in the HD 7000-series era. Except for some relatively minor iterative improvements, the Fury-X is very much like its GCN 1.x predecessors in terms of how its graphics core works (besides the sheer number of shaders).
> 
> Of course if the nature of HBM leads to some novel tweaks that I can't think of, then there _might_ be something to it. *But it's certainly not the kind of change that Kepler -> Maxwell or VLIW4 -> GCN brought*.


Very true. Kepler -> Maxwell is probably one of the biggest leaps ever in terms of pure architectural improvement, since it was achieved on the same 28nm process node. The only other similarly or more impressive transitions are

1. 7900 GTX -> 8800 GTX. That was a giant leap, and both were at 90nm. The 8800 GTX was a GPU legend, followed by the 8800 GT (a 65nm die shrink), which was a darling for the masses.
2. The HD 4870 was an equally big jump over the HD 3870, and both were built at 55nm. But that was because the HD 2900XT was a failure and the HD 3870 was based on the same HD 2900XT architecture.
3. Radeon 9700 Pro over Radeon 8500. Both were built on a 150nm process, but a ground-up new DX9 architecture and improved manufacturing (due to the maturity of the process node) helped achieve that massive leap.

Nvidia has really built one of the best GPU architectures ever with Maxwell. AMD has failed again, and GCN needs a complete overhaul to remain competitive. I also think drivers are again a problem for AMD with the Fury X. Anyway, none of this matters. Nvidia's GPU leadership is only going to strengthen. AMD is slowly digging its grave. Soon we might be left with two absolute monopolies: Intel and Nvidia.


----------



## tajoh111

http://www.anandtech.com/bench/product/1513?vs=1496

Anandtech's benches are up. Like other reviews, they show Fury losing most of the time; it's most competitive at 4K, where it can win in a couple of games, but when it loses, it really loses in some titles.


----------



## SKYMTL

Quote:


> Originally Posted by *PontiacGTX*
> 
> GCN 1.3 is new


At this point the GCN core within Fiji is about as old as disco... its basic premise has been reused with only minor modifications since 2012.


----------



## FallenFaux

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's been my question all along though is WILL Fury run out or does it run flawlessly at a resolution and settings that an equivalent GDDR5 card DOES run out, thus proving that HBM does have an advantage, capacity-to-capacity? Nobody has showed me where it IS running out yet so the answer to that question remains speculative.


You could always keep raising the resolution and AA settings to see if Fury X ever passes a 295X2; since the 295X2 is the fastest thing around, that would show its GDDR5 eventually becoming insufficient compared to HBM.

I don't buy into this whole "HBM will make up for low memory" thing, though; all it does is increase bandwidth and lower latency. However, since the 295X2 still beats the Titan X in most benchmarks, I also think 4GB is fine at the moment.


----------



## Sycksyde

Quote:


> Originally Posted by *raghu78*
> 
> To spend so much effort in designing a flagship card with HBM and fail due to drivers is pathetic.


Hasn't that been the case with AMD since forever though? Great hardware let down by horrible drivers.


----------



## raghu78

Quote:


> Originally Posted by *SKYMTL*
> 
> At this point the GCN core within Fiji is about as old as disco.....its basic premise has been reused with only minor modifications since 2012.


Agreed. AMD needs a significant architectural overhaul, if not a ground-up new architecture.


----------



## Serandur

Quote:


> Originally Posted by *raghu78*
> 
> Very true. Kepler -> Maxwell is probably one of the biggest leaps ever in terms of pure architectural improvement, since it was achieved on the same 28nm process node. The only other similarly or more impressive transitions are
> 
> 1. 7900 GTX -> 8800 GTX. That was a giant leap, and both were at 90nm. The 8800 GTX was a GPU legend, followed by the 8800 GT, which was a darling for the masses.
> 2. The HD 4870 was an equally big jump over the HD 3870, and both were built at 55nm. But that was because the HD 2900XT was a failure and the HD 3870 was based on the same HD 2900XT architecture.
> 3. Radeon 9700 Pro over Radeon 8500. Both were built on a 150nm process, but a ground-up new DX9 architecture and improved manufacturing (due to the maturity of the process node) helped achieve that massive leap.
> 
> Nvidia has really built one of the best GPU architectures ever with Maxwell. AMD has failed again, and GCN needs a complete overhaul to remain competitive. I also think drivers are again a problem for AMD with the Fury X. Anyway, none of this matters. Nvidia's GPU leadership is only going to strengthen. AMD is slowly digging its grave. Soon we might be left with two absolute monopolies: Intel and Nvidia.


GM100/GM200 would have really been quite a sight if the 20nm plans had played out. A second coming of the 8800 GTX: a leap that big, maybe even bigger.

On the other hand, GCN's been around for nearly 4 years now and it still has barely evolved past its initial form. Even by AMD/ATi standards with TeraScale, GCN should have gone through at least one large revision so far and should be almost ready to be replaced completely. Somehow, I doubt they're even close to having a viable successor ready, however. Meanwhile, Nvidia have Pascal and Volta lined up in quick succession over the next couple years. That's really bad for AMD... really, really, really bad.


----------



## Kinaesthetic

Quote:


> Originally Posted by *rt123*
> 
> I think you mean capacity. How much data you can hold in the VRAM. the rate technically means the speed at which you fill up the VRAM.
> 
> HBM lacks capacity, rate is superior to GDDR5 in any scenario.


No, the rate of population is the limiting factor. That happens over a 15.75 GB/s PCIe 3.0 x16 bus, and you can't magically change that. The rate at which that data is then used is definitely faster on HBM, though.
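For what it's worth, that 15.75 GB/s figure falls straight out of the PCIe 3.0 link parameters (8 GT/s per lane, 128b/130b line encoding, 16 lanes); a quick sketch of the arithmetic:

```python
# Effective one-way bandwidth of a PCIe 3.0 x16 link.
transfers_per_sec = 8e9        # 8 GT/s per lane, 1 bit per transfer
payload_fraction = 128 / 130   # 128b/130b line encoding overhead
lanes = 16

bytes_per_sec_per_lane = transfers_per_sec * payload_fraction / 8  # bits -> bytes
total_gb_per_sec = bytes_per_sec_per_lane * lanes / 1e9

print(f"{total_gb_per_sec:.2f} GB/s")  # 15.75 GB/s per direction
```

And that's the theoretical ceiling per direction, before protocol overhead; real transfers see a bit less.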


----------



## PontiacGTX

Quote:


> Originally Posted by *SKYMTL*
> 
> At this point the GCN core within Fiji is about as old as disco.....its basic premise has been reused with only minor modifications since 2012.


Well, not 2012 but 2013-2014, if you count the CU changes (1.0 to 1.1), the improved tessellation (1.1 to 1.2, 4x more throughput per clock), the different memory, and the different memory controller as changes to the microarchitecture.


----------



## Blameless

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's been my question all along though is WILL Fury run out or does it run flawlessly at a resolution and settings that an equivalent GDDR5 card DOES run out, thus proving that HBM does have an advantage, capacity-to-capacity? Nobody has showed me where it IS running out yet so the answer to that question remains speculative.


The idea that HBM will somehow not run out of capacity when an equivalent GDDR5 part would has exactly zero basis. All other things being equal, 4GiB of HBM is the same capacity as 4GiB of GDDR5, and will hold the same data.

If Fury was a GDDR5 part and they equipped it with 4GiB of GDDR5, it would run out of memory, or not, doing exactly the same things that an HBM Fury would.

This is less speculative and more a given. If the assets required to render a scene total more than VRAM capacity, the assets that don't fit in VRAM have to come from somewhere else. Getting to and from that somewhere else becomes a bottleneck unless VRAM is _slower_.

If you find where any 4GiB card is clearly running out of VRAM, Fury X will have issues in the same game/app and settings.

The first 35 seconds of this video explain things pretty well, just replace "taller" with "HBM":


----------



## svenge

Quote:


> Originally Posted by *PontiacGTX*
> 
> well not 2012 but 2014 if you count the improved tessellation, different memory and different memory controller as a change in the micro architecture


Those are merely tweaks on a relatively ancient substructure. AMD needs a wholesale revamp of their entire architecture _a la_ Kepler -> Maxwell.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I hate to be the prick that says this, but the exact answer to your question is very much common sense. When given the same rate of populating data. That rate will be the limiting factor, no matter how fast the local memory on the card is. You cannot magically change rock hard numbers like that.


Yes, yes, I've heard all these snarky responses before, and yet I still have not seen PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer, you should have no problem finding an instance of a Fury X running out of memory...


----------



## rt123

Quote:


> Originally Posted by *Kinaesthetic*
> 
> No, the rate of population is the limiting factor. That is done over a 15.75GB/s PCIe 3.0 x16 bus. You can't magically change that. The rate at using that data is definitely faster on HBM though.


That isn't gonna change anytime soon. Even Skylake-E doesn't seem to have the next PCIe standard, and that's not coming until 2016.
Nvidia is gonna have the same problem.


----------



## PontiacGTX

Quote:


> Originally Posted by *svenge*
> 
> Those are merely tweaks on an relatively ancient substructure. AMD needs a wholesale revamp of their entire architecture _a la_ Kepler -> Maxwell.


That's maybe why they greatly increased the SP/shader count and also offloaded some of the work to the better memory design. Still, drivers could change things; the results don't show the card performing consistently across different resolutions, even in the same game at 1080p+/1440p+.


----------



## FallenFaux

Quote:


> Originally Posted by *svenge*
> 
> Those are merely tweaks on an relatively ancient substructure. AMD needs a wholesale revamp of their entire architecture _a la_ Kepler -> Maxwell.


Why? Even with this last change they managed to almost match the power efficiency of Maxwell. They used TeraScale from the 2000 to 6000 series of cards and in some cases managed a 100% increase in performance from generation to generation.


----------



## Casey Ryback

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yes, yes, I've heard all these snarky responses before and yet I still have yet to see PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer you should have no problem finding an instance of a Fury X running out of memory...


Yep, Fury seems to be superior to the 6GB Ti in both Witcher 3 and Shadow of Mordor at 4K.


----------



## Casey Ryback

Quote:


> Originally Posted by *FallenFaux*
> 
> Why? Even with this last change they managed to almost match the power efficiency of Maxwell.


No, they didn't. The whole package does, but let's remember one card is using HBM on a smaller, more efficient PCB.

Construct a similar card with Maxwell cores and you're going to see a much more efficient product again.


----------



## DividebyZERO

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yes, yes, I've heard all these snarky responses before and yet I still have yet to see PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer you should have no problem finding an instance of a Fury X running out of memory...


Not sure if you have already seen this or not but...
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *HiTechPixel*
> 
> I am looking for 5K Crossfire reviews too. The results will decide if I go Titan X SLI or Fury X Crossfire since I have a 5K monitor.
> 
> 
> 
> The only thing stopping me right now is the 4GB VRAM and unknowns of overclocking/crossfire. Anyone interested in memory though might want to look at *this*


----------



## svenge

Quote:


> Originally Posted by *FallenFaux*
> 
> Why? Even with this last change they managed to almost match the power efficiency of Maxwell.


You're conflating the power savings that HBM brings with the amount of power Fury-X uses as a whole. The GCN architecture in itself hasn't changed much in terms of efficiency.

Were it possible to just slap on some HBM memory to Maxwell, its cards' power usage would drop a proportional amount and further highlight GCN's deficiencies in that regard.


----------



## FallenFaux

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yep fury seems to be superior to the 6GB ti in both witcher 3 and shadows of mordor 4K.


But the memory comparison is irrelevant there, especially in SoM. The 295x2 beats all those cards with a lowly 4GB of GDDR5 so it's clearly not a memory issue.
Quote:


> Originally Posted by *Casey Ryback*
> 
> No they didn't. The whole package does, but lets remember one card is using HBM on a smaller more efficient PCB.
> 
> Construct a similar card with maxwell cores you're going to see a much more efficient product again.


Maybe. Integration of an HBM interposer onto the package is going to require a chip redesign and there's no way for us to know what effect that might have on efficiency. In addition, do you actually believe that they won't make any architectural changes in the 14nm shrink considering they made 3 changes on the same node? 14/16nm is going to be the start of a whole new game.


----------



## geoxile

Quote:


> Originally Posted by *FallenFaux*
> 
> Why? Even with this last change they managed to almost match the power efficiency of Maxwell. They used TeraScale from the 2000 to 6000 series of cards and in some cases managed a 100% increase in performance from generation to generation.


Often by just doubling (or more) the amount of cores on the GPU.

Actually, I find it rather strange that the Fury X isn't a straight 45% better than the 290X in comparable cases. It's weird since GCN seemed to scale pretty well with shaders before


----------



## szeged

I'd love to see a new architecture from AMD. GCN needs to go, IMO; it's been around for so long, and now all AMD is doing is adding moar coarzzzz. Also, Graphics Core Next is a terrible name, IMO.


----------



## sterob

Kind of disappointing, in the sense that AMD hyped this thing so much. Nvidia made a good move releasing the 980 Ti and bringing down the 980's price. If only AMD had done better and brought the Fury X down to $599, or maybe said "screw Nvidia and antitrust, let's make this another 4800 era: a $499 Fury X."


----------



## Casey Ryback

Quote:


> Originally Posted by *FallenFaux*
> 
> In addition, do you actually believe that they won't make any architectural changes in the 14nm shrink considering they made 3 changes on the same node? 14/16nm is going to be the start of a whole new game.


When did I say they won't make any architectural changes?

I'm hoping that AMD already have the blueprints that will be their next, far more efficient architecture using HBM2 on the 14nm shrink.

Their future depends on it, as Pascal (judging from recent architectural history) will be efficient and powerful, whilst also taking advantage of smaller cards using HBM2 memory.

edit - and you are right about the game performance, the 4GB doesn't seem to be a bottleneck at 4K in those titles.


----------



## tajoh111

Quote:


> Originally Posted by *raghu78*
> 
> Very true. Kepler -> Maxwell is probably one of the biggest leaps ever in terms of pure architectural improvement, since it was achieved on the same 28nm process node. The only other similarly or more impressive transitions are
> 
> 1. 7900 GTX -> 8800 GTX. That was a giant leap, and both were at 90nm. The 8800 GTX was a GPU legend, followed by the 8800 GT (a 65nm die shrink), which was a darling for the masses.
> 2. The HD 4870 was an equally big jump over the HD 3870, and both were built at 55nm. But that was because the HD 2900XT was a failure and the HD 3870 was based on the same HD 2900XT architecture.
> 3. Radeon 9700 Pro over Radeon 8500. Both were built on a 150nm process, but a ground-up new DX9 architecture and improved manufacturing (due to the maturity of the process node) helped achieve that massive leap.
> 
> Nvidia has really built one of the best GPU architectures ever with Maxwell. AMD has failed again, and GCN needs a complete overhaul to remain competitive. I also think drivers are again a problem for AMD with the Fury X. Anyway, none of this matters. Nvidia's GPU leadership is only going to strengthen. AMD is slowly digging its grave. Soon we might be left with two absolute monopolies: Intel and Nvidia.


I think the problem for AMD is that they got arrogant with the console win. They thought that since the consoles used GCN, they could ride GCN out for a long, long time, making more and more revisions to the same thing.

What they didn't see coming is the same problem they are running into on the CPU front: you can't just keep adding cores and expect linear scaling. They probably got optimum core occupancy and usage with Hawaii; from what the reviews have shown, any further gains are going to be marginal, and AMD simply has to make a new architecture.

If we look at it from an architecture perspective, AMD got schooled.

The Titan X (since it is the full chip, like Fiji), when pushed to the same limits using similar cooling (within 10% of its ceiling), is probably 20% faster at 4K and 30% faster at lower resolutions. What is hiding this difference is that Nvidia is sandbagging its current performance (underclocking) while AMD is pushing its chips closer to the max (overclocking) to make its cards more competitive. If this sounds familiar, it is what is happening with AMD vs Intel currently.

What is particularly disappointing this generation is that this is the first time AMD has made a chip as big as Nvidia's, and they are still losing the war, even with HBM. There's never been this big a gap as far as architectures go, because Nvidia was always faster but also had a bigger chip. This time Nvidia is faster with the same-sized chip, and by a bigger margin than in the past.

Once Nvidia gets its hands on HBM, which will shave off much of Nvidia's traditionally massive memory controller, AMD might just be Conroe'd, if Maxwell hasn't done it already.

What I hope is that AMD doesn't just do GCN 2.0 next gen, but makes some big changes and a new architecture. Then again, the R&D cuts indicate otherwise, and all I can hope is that Zen is a smash; otherwise AMD's goose is cooked.


----------



## FallenFaux

Quote:


> Originally Posted by *geoxile*
> 
> Often by just doubling (or more) the amount of cores on the GPU.
> 
> Actually, I find it rather strange that the Fury X isn't a straight 45% better than the 290X in comparable cases. It's weird since GCN seemed to scale pretty well with shaders before


Yeah, from the 3800>4800>5800 each card was double (or more) the previous and they got near 100% scaling. It seems that either GCN 1.2 just doesn't scale very well or that the component ratio is off.


----------



## Sashimi

Quote:


> Originally Posted by *Kinaesthetic*
> 
> No, the rate of population is the limiting factor. That is done over a 15.75GB/s PCIe 3.0 x16 bus. You can't magically change that. The rate at using that data is definitely faster on HBM though.


Not arguing, just raising a question: could the PCIe 3.0 data rate be fast enough that, even when VRAM is all used up, the time it takes to swap out that cached data is unnoticeable?


----------



## geoxile

Quote:


> Originally Posted by *FallenFaux*
> 
> Yeah, from the 3800>4800>5800 each card was double (or more) the previous and they got near 100% scaling. It seems that either GCN 1.2 just doesn't scale very well or that the component ratio is off.


Considering the 285 was trading blows with the 280X despite less SPs that doesn't sound right. GCN 1.2 should scale better. Maybe it's Amdahl's law in action? Not sure.

Since Raja Koduri only returned in 2013, it'll probably be a year or two before we see anything truly new from AMD's GPU department.


----------



## FallenFaux

Quote:


> Originally Posted by *Casey Ryback*
> 
> When did I say they won't make any architectural changes?
> 
> I'm hoping that AMD already have the blueprints that will be their next, far more efficient architecture using HBM2 on the 14nm shrink.
> 
> There future depends on it as pascal will (judging from recent architectural history) will be efficient and powerful, whilst also taking advantage of smaller cards using HBM2 memory.
> 
> edit - and you are right about the game performance, the 4GB doesn't seem to be a bottleneck at 4K in those titles.


There's no doubt that 14nm is going to be very exciting. With the new efficiency they're going to gain from HBM and skipping a whole process node I'm really hoping to see 100%+ gains on the first cards. Whoever plays the best underwater is going to get my money.


----------



## p4inkill3r

Quote:


> Originally Posted by *svenge*
> 
> You're conflating the power savings that HBM brings with the amout of power Fury-X uses as a whole. The GCN architecture in itself hasn't changed much in terms of efficiency.
> 
> Were it possible to just slap on some HBM memory to Maxwell, its cards' power usage would drop a proportional amount and further highlight GCN's deficiencies in that regard.


The inverse is also true, however; you can slap HBM onto GCN cores and make a product that is competitive with Maxwell.


----------



## Blameless

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yes, yes, I've heard all these snarky responses before and yet I still have yet to see PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer you should have no problem finding an instance of a Fury X running out of memory...


You have been confusing two completely different questions:

_When_ is 4GiB not enough?

and

What _happens_ when 4GiB is not enough?

The answer to the former is pretty much up in the air, at least when it comes to games. I'm not quite sure how one would go about reliably ensuring that more than 4GiB were required to render a given scene so it could be benched. This used to be easy to test simply by making frame buffers exceed VRAM size, but resolutions have not scaled anywhere near as fast as VRAM capacity.

The answer to the latter is a given, and it's exactly the same as the answer to what happens when you run out of system memory, or space on your SSD, or pages in a book, or room in a fuel tank, or anything else: the bottleneck stops being how quick the first collection of stuff is and becomes how fast you can swap things to/from it from some other source.
Quote:


> Originally Posted by *svenge*
> 
> You're conflating the power savings that HBM brings with the amout of power Fury-X uses as a whole. The GCN architecture in itself hasn't changed much in terms of efficiency.


GCN has definitely improved in efficiency. HBM may be responsible for respectable power savings, but even the GDDR5 memory subsystem was only a fraction of total board power in prior parts, with the GPU itself being the bulk.
Quote:


> Originally Posted by *FallenFaux*
> 
> Yeah, from the 3800>4800>5800 each card was double (or more) the previous and they got near 100% scaling. It seems that either GCN 1.2 just doesn't scale very well or that the component ratio is off.


3000, 4000, and 5000 were all VLIW5 and likely had no more differences in architecture than the different iterations of GCN.

I don't think there is anything wrong with GCN 1.2. I think Fury has two main issues:

1. Too few ROPs.

2. Too much FP64.


----------



## blue1512

Quote:


> Originally Posted by *geoxile*
> 
> Often by just doubling (or more) the amount of cores on the GPU.
> 
> Actually, I find it rather strange that the Fury X isn't a straight 45% better than the 290X in comparable cases. It's weird since GCN seemed to scale pretty well with shaders before


Fury X is held back by its 64 ROPs, the same count as the 290X. In ROP-intensive scenarios (tessellation, for example), Fury X suffers a serious bottleneck, hence performance close to the 290X in those cases.

For reference, GM200 in the Titan X and the Ti has 96 ROPs.

If Fury X had 96 ROPs, it would take better advantage of the wide bus, even if it had to be cut down in SPs.

Hey, that would mean the cut-down version in the Nano could be more effective, but we have to wait until we see the specs.


----------



## KyadCK

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> Yes, yes, I've heard all these snarky responses before and yet I still have yet to see PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer you should have no problem finding an instance of a Fury X running out of memory...
> 
> 
> 
> Not sure if you have already seen this or not but...
> Quote:
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *HiTechPixel*
> 
> I am looking for 5K Crossfire reviews too. The results will decide if I go Titan X SLI or Fury X Crossfire since I have a 5K monitor.
> 
> 
> The only thing stopping me right now is the 4GB VRAM and unknowns of overclocking/crossfire. Anyone interested in memory though might want to look at *this*
> 
> 
> 
> 

This proves... what, exactly? That DX11 likes to over-allocate memory based on how much is available?
Quote:


> Originally Posted by *Sashimi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kinaesthetic*
> 
> No, the rate of population is the limiting factor. That is done over a 15.75GB/s PCIe 3.0 x16 bus. You can't magically change that. The rate at using that data is definitely faster on HBM though.
> 
> 
> 
> Not arguing, just bringing in a question. Could this data rate of the PCIe 3.0 be fast enough that even when VRAM is all used up, the time it takes to replace those cached memories will be unnoticeable?

Absolutely not. VRAM overflow turns CPU RAM into VRAM, and that is the same as reducing your 512 GB/s memory speed to just ~15 GB/s. It would be much worse than when the 970 is forced to use its last 512 MB, and it will tank your FPS into the single digits instantly.
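Those two bandwidth figures make the cliff easy to quantify. A rough sketch, assuming a hypothetical 500 MB of assets spilled out of VRAM and touched every frame:

```python
# Hypothetical worst case: 500 MB of per-frame assets that no longer fit
# in VRAM and must be streamed over PCIe instead of read from local HBM.
overflow_bytes = 500e6
hbm_bw = 512e9      # Fury X local memory bandwidth, bytes/s
pcie_bw = 15.75e9   # effective PCIe 3.0 x16 bandwidth, bytes/s

t_local_ms = overflow_bytes / hbm_bw * 1000   # ~1 ms from HBM
t_spill_ms = overflow_bytes / pcie_bw * 1000  # ~32 ms over PCIe

# ~32 ms of pure transfer caps the frame rate near 31 fps before the GPU
# has rendered a single pixel.
fps_ceiling = 1000 / t_spill_ms
```

The 500 MB working set is an illustrative assumption, not a measured number; the point is just that a ~30x bandwidth gap turns a sub-millisecond cost into most of a frame budget.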


----------



## svenge

Quote:


> Originally Posted by *FallenFaux*
> 
> Yeah, from the 3800>4800>5800 each card was double (or more) the previous and they got near 100% scaling. It seems that either GCN 1.2 just doesn't scale very well or that the component ratio is off.


The ROPs and tessellation resources on Fury-X were not increased proportionally to match the additional shaders, which would explain the scaling difficulties.


----------



## FallenFaux

Quote:


> Originally Posted by *geoxile*
> 
> Considering the 285 was trading blows with the 280X despite less SPs that doesn't sound right. GCN 1.2 should scale better. Maybe it's Amdahl's law in action? Not sure.
> 
> Since Raja Koduri returned in 2013 it'll probably be a year or two before I we see anything truly new from AMD's GPU department.


Quote:


> Originally Posted by *Blameless*
> 
> ...
> I don't think there is anything wrong with GCN 1.2. I think Fury has two main issues:
> 
> 1. Too few ROPs.
> 
> 2. Too much FP64.


Quote:


> Originally Posted by *blue1512*
> 
> Fury X is held back by its 64 ROPs, the same count as the 290X. In ROP-intensive scenarios (tessellation, for example), Fury X suffers a serious bottleneck, hence performance close to the 290X in those cases.
> 
> For reference, GM200 in the Titan X and the Ti has 96 ROPs.
> 
> If Fury X had 96 ROPs, it would take better advantage of the wide bus, even if it had to be cut down in SPs.


I'm not convinced it has too few ROPs, though; it's exactly double a 285, which you would think would scale up to 100% faster, based on how the previous cards scaled. I was actually thinking that maybe the die is just too big: the path lengths have become too long, and you lose efficiency because data takes too long to travel around the die. I know it's a thing that can happen to CPUs, but I'm not sure how it would affect a GPU or how large the die would have to get.


----------



## Sashimi

Perhaps it will eventually be possible to stop VRAM from spilling into system RAM entirely and instead force the card to evict old VRAM contents; could that better utilise HBM?


----------



## magnek

nvm

old post is old


----------



## Forceman

Quote:


> Originally Posted by *FallenFaux*
> 
> I'm not convinced that it has too few ROPs though, it's exactly double a 285 which you would think would just scale up to 100% faster based on how the previous cards scaled. I was actually thinking that maybe the die is just too big, the path lengths have become too long and now you're losing efficiency because data takes too long to travel around the die. I know it's a thing that can happen to CPUs but I'm not sure how it could effect a GPU or how large the die would have to get.


It's the same ROP count as Hawaii with 45% more shaders. So unless Hawaii was over-provisioned with ROPs (possible), it makes sense that they would be a limitation. By comparison, the 980 Ti is 50% more than a 980 in everything.

I'm not positive, but my theory on why it falls behind at lower resolutions is that that's where the extra shaders don't help, because the ROPs are the limit; when you increase the resolution, the extra shader power keeps Fury close to the ROP limit, while the 290X/390X falls behind it. So the Fury is ROP-limited at lower resolutions, while the 290X is shader-limited at higher ones.
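That theory lines up with a naive peak-fillrate model (ROPs x core clock, using reference clocks; this ignores boost, blending cost, and color compression, so treat the numbers as rough upper bounds only):

```python
# Naive peak pixel fillrate: ROPs x core clock. Real-world throughput
# depends on blending, compression, and memory, so this is only a
# rough upper-bound comparison.
def fillrate_gpix_s(rops, clock_mhz):
    return rops * clock_mhz / 1000.0

fury_x = fillrate_gpix_s(64, 1050)     # 67.2 Gpix/s
r9_290x = fillrate_gpix_s(64, 1000)    # 64.0 Gpix/s
gtx_980ti = fillrate_gpix_s(96, 1000)  # 96.0 Gpix/s (base clock)
```

On this model, Fury X packs 45% more shaders than the 290X but only ~5% more peak fillrate, which is consistent with the ROP-limited-at-low-resolution theory.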


----------



## geoxile

Quote:


> Originally Posted by *blue1512*
> 
> FuryX is hold by 64ROP, which is the same as 290X. In scenario which is ROP intensive (tessellation for example), FuryX suffers a serious bottleneck, hence the performance is close to 290x in those case.
> 
> For reference, GM200 on TX and Ti has 96 ROP.
> 
> If FuryX had 96 ROP, it would take advantage of the wide bus better, even if it would be cut in SP.


Tonga's 32 ROPs were more than a match for the 290X's in certain tests.
Quote:


> Originally Posted by *Forceman*
> 
> It's the same ROPs as Hawaii with 45% more shaders. So unless Hawaii was over-provisioned with ROPs (possible) then it makes sense that it would be a limitation. By comparison the 980 Ti is 50% more than a 980 in everything.
> 
> I'm not positive, but my theory on why it is falling behind at lower resolutions is because that's where the extra shaders are not helping because the ROPs are the limit, then when you increase the resolution the shader power is able to keep it closer to the ROP limit while the 290X/390X falls behind the ROP limit. So the Fury is ROP limited at lower resolution, while the 290X is shader limited at higher.


Delta color compression should make better use of the ROPs. We saw that in Tonga.


----------



## MKHunt

Well, Amazon, Newegg, and EVGA are all sold out of reference 980 Tis.


----------



## azanimefan

Quote:


> Originally Posted by *MKHunt*
> 
> Well, Amazon Newegg and EVGA are all sold out of reference 980 ti's.


Yep. AMD helped make up everyone's mind on whether to buy a 980 Ti or a Fury X, and it seems everyone went with the 980 Ti. Expect the price of the 980 Ti to trend up in the coming months.


----------



## JJEEGG2211

Quote:


> Originally Posted by *geoxile*
> 
> Since Raja Koduri returned in 2013, it'll probably be a year or two before we see anything truly new from AMD's GPU department.


Made me want to read this http://www.anandtech.com/show/6907/the-king-is-back-raja-koduri-leaves-apple-returns-to-amd

I guess we're starting to see his impact on the company. I just hope there'll be more. They've got to make the competition tight.


----------



## svenge

Quote:


> Originally Posted by *MKHunt*
> 
> Well, Amazon Newegg and EVGA are all sold out of reference 980 ti's.


If I were NVIDIA and the AIB partners, I'd call up the fabs/factories and triple my component orders post-haste.

Fury-X, while being a competent card, has wholly failed in its mission to dislodge King Maxwell (GTX 980 Ti) from his throne.


----------



## raghu78

Quote:


> Originally Posted by *Blameless*
> 
> I don't think there is anything wrong with GCN 1.2. I think Fury has two main issues:
> 
> 1. Too few ROPs.
> 
> 2. Too much FP64.


No, it's not as simple as that. The current Fiji is at the limits of the GCN architecture. Scaling from the R9 390X is poor, indicating the front end is not capable of feeding the shaders well enough to ensure consistent perf scaling. Perf/SP is very poor compared to Maxwell's perf per CUDA core. Perf/watt is again poor, and if Maxwell had HBM's power advantages this would look even more one-sided. Perf/sq mm is worse. The last one is the most damning, as earlier AMD chips like the R9 290X had significantly better perf/sq mm than the GTX 780 Ti. AMD's perf/sq mm advantage has been withering away since the HD 4870 days, and now Nvidia has that under their belt too. Fundamentally, GCN in its present form cannot compete with Maxwell. Given that Pascal is going to be a huge jump in performance and efficiency because of 16nm FinFET, HBM2 and architectural improvements, AMD are up against it. Right now I have no hope that AMD can compete, so until AMD prove me wrong, it's pretty much an Nvidia GPU monopoly we are headed for.
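For what it's worth, the perf/sq mm comparison is easy to sketch. The die areas below are the commonly reported figures, but the relative-performance index is hypothetical (plug in your preferred review's summary number); this only shows how the metric is computed, not a measured result:

```python
# How a perf-per-area comparison is computed. Die areas (mm^2) are the
# commonly reported figures; rel_perf is a HYPOTHETICAL relative
# performance index (1.00 = GTX 980 Ti), not measured data.
chips = {
    "Fiji (Fury X)":  (596, 0.95),
    "GM200 (980 Ti)": (601, 1.00),
    "Hawaii (290X)":  (438, 0.75),
    "GK110 (780 Ti)": (561, 0.70),
}
for name, (area_mm2, rel_perf) in chips.items():
    print(f"{name:15s} {rel_perf / area_mm2 * 100:.3f} perf-points per 100 mm^2")
```

With those (hypothetical) indices, Hawaii comes out well ahead of GK110 per mm² while Fiji lands slightly behind GM200, which is exactly the reversal described above.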


----------



## carlhil2

Take the 980 Ti out of the equation and the Fury X would be deemed a success, even at $700+. Damn, that 980 Ti...


----------



## geoxile

Quote:


> Originally Posted by *JJEEGG2211*
> 
> Made me want to read this http://www.anandtech.com/show/6907/the-king-is-back-raja-koduri-leaves-apple-returns-to-amd
> 
> I guess we're starting to see his impact to company. I just hope there'll be more. They've got to make the competition tight.


I somehow doubt he's had a big influence yet. Stuff like HBM (stacked memory) has been on the roadmap for quite some time but got pushed back for ages, and Fiji doesn't seem like a major change. Most likely we'll see his impact in 2016 or 2017, if we get new architectures instead of this pattern of minor GCN variations.


----------



## FallenFaux

Quote:


> Originally Posted by *carlhil2*
> 
> Take the 980ti out of the equation, Fury X would be deemed a success, at $700+. damn, that 980ti...


2900XT probably would have been a success too if it hadn't been for that pesky 8800GTX.


----------



## svenge

Quote:


> Originally Posted by *FallenFaux*
> 
> 2900XT probably would have been a success too if it hadn't been for that pesky 8800GTX.


And Bulldozer would've been a success were it not for that darn 80386DX...


----------



## ondoy

AnandTech Fury X Benchmarks


----------



## rt123

Quote:


> Originally Posted by *ondoy*
> 
> AnandTech Fury X Benchmarks


Already posted a bunch of times.
Stop trying to harvest more reps.


----------



## Ghoxt

Quote:


> Originally Posted by *coupe*
> 
> *Looks at the first few pages, a bunch of Titan and 980 owners bashing the Fury X to justify their purchase*


Who is bashing? Just commenting on what multiple reviewer sites have *confirmed with cards in hand.*

Nothing more, and nothing less, could be expected unless one is new around here. I really have not seen a ton of baiting from the usual suspects; I see a lot of fact-based parroting of what the reviewers have said.

I read HardOCP's brutal assessment of the Fury X and was shocked at their disappointed narrative as they went up one side of the Fury X expectations and down the other on what was delivered (VRAM, performance, HDMI, etc.).

If the Fury X were a Titan X killer, you'd see the first couple of pages reflecting that, and several OCN Titan and 980 owners selling their 3-month-old cards and buying Fury cards. But unfortunately for competition, that is fiction.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *raghu78*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blameless*
> 
> I don't think there is anything wrong with GCN 1.2. I think Fury has two main issues:
> 
> 1. Too few ROPs.
> 
> 2. Too much FP64.
> 
> 
> 
> No, it's not as simple as that. The current Fiji is at the limits of the GCN architecture. Scaling from the R9 390X is poor, indicating the front end is not capable of feeding the shaders well enough to ensure consistent perf scaling. Perf/SP is very poor compared to Maxwell's perf per CUDA core. Perf/watt is again poor, and if Maxwell had HBM's power advantages this would look even more one-sided. Perf/sq mm is worse. The last one is the most damning, as earlier AMD chips like the R9 290X had significantly better perf/sq mm than the GTX 780 Ti. AMD's perf/sq mm advantage has been withering away since the HD 4870 days, and now Nvidia has that under their belt too. Fundamentally, GCN in its present form cannot compete with Maxwell. Given that Pascal is going to be a huge jump in performance and efficiency because of 16nm FinFET, HBM2 and architectural improvements, AMD are up against it. Right now I have no hope that AMD can compete, so until AMD prove me wrong, it's pretty much an Nvidia GPU monopoly we are headed for.
Click to expand...

I'd actually argue that GCN is still competitive with Maxwell in terms of performance. The 290X/390X is right there with the 970 and 980, and the Fury X is very close to both the 980 Ti and Titan X.

AMD's issue right now isn't performance; it's money problems and bad marketing. I agree that the future of AMD's GPUs is murky, but I wouldn't go all doom and gloom on them yet. Everyone counted AMD out when the 2900XT was released, and they bounced back tremendously from that disaster.

On an unrelated note, how the heck did one of AMD's biggest supporters do a complete 180? I'm not trolling, I'm genuinely curious. It wasn't even 2 weeks ago that you were rooting them on, and now you've painted a picture of their future so dreary that a self-proclaimed Nvidia fanboy has to defend them.


----------



## Blameless

Quote:


> Originally Posted by *FallenFaux*
> 
> I'm not convinced that it has too few ROPs though, it's exactly double a 285 which you would think would just scale up to 100% faster based on how the previous cards scaled.


There are a fair number of tests where the Fury X is dramatically faster than a 285, often nearly twice as fast.

Scaling looks pretty damn good to me: http://www.anandtech.com/bench/product/1513?vs=1512
Quote:


> Originally Posted by *geoxile*
> 
> Tonga's 32 ROPs were more than a match for the 290X's in certain tests.


What ROP limited test was the 285 faster than the 290X?

The only tests I recall the 285 being faster in were geometry (tessellation and the like) limited ones.
Quote:


> Originally Posted by *geoxile*
> 
> Delta color compression should make better use of the ROPs. We saw that in Tonga.


Not 100% better.

Even AMD's own figures put the high-end of the advantage at ~40%.
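Taking that ~40% best-case figure, the effective-bandwidth arithmetic looks like this (the raw number is the Fury X's published 512 GB/s; the compression gains are illustrative):

```python
# Effective bandwidth under delta colour compression. Raw figure is
# Fury X's published 512 GB/s (4 HBM stacks); the gains swept here are
# illustrative, with ~40% as the best case AMD itself cites.
raw_gbps = 512
for gain in (0.00, 0.20, 0.40):
    eff = raw_gbps * (1 + gain)
    print(f"{gain:4.0%} compression gain -> ~{eff:.0f} GB/s effective")
```

So even best-case compression stretches bandwidth by less than half; it doesn't double what the ROPs can push.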
Quote:


> Originally Posted by *raghu78*
> 
> No its not as simple as that.


It never is, but I strongly suspect fitting more ROPs could have significantly improved performance, and that discarding all FP64 capability (and even some CUs, if necessary) may have freed up enough transistor budget to make it worthwhile.

Maybe I'm wrong, or maybe AMD really wants a Fiji based FirePro.
Quote:


> Originally Posted by *raghu78*
> 
> The current Fiji is at the limits of the current GCN architecture. Scalability from R9 390X is poor indicating front end is not capable of feeding the shaders to ensure consistent perf scaling.


Each CU has its own front-end, and the ratio of shaders to CUs has not changed, though perhaps it should have.

Anyway, scaling, unit for unit, does not seem as poor as many are making it out to be, and I still feel the key issue is too few ROPs.


----------



## sugalumps

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> I'd actually argue that GCN is still competitive with Maxwell in terms of performance. The 290X/390X is right there with the 970 and 980, and the Fury X is very close to both the 980 Ti and Titan X.
> 
> AMD's issue right now isn't performance; it's money problems and bad marketing. I agree that the future of AMD's GPUs is murky, but I wouldn't go all doom and gloom on them yet. Everyone counted AMD out when the 2900XT was released, and they bounced back tremendously from that disaster.
> 
> On an unrelated note, how the heck did one of AMD's biggest supporters do a complete 180? I'm not trolling, I'm genuinely curious. It wasn't even 2 weeks ago that you were rooting them on, and now you've painted a picture of their future so dreary that a self-proclaimed Nvidia fanboy has to defend them.


You are no true nvidia fanboy, stop being rational!


----------



## FallenFaux

Quote:


> Originally Posted by *Blameless*
> 
> ...
> It never is, but I strongly suspect fitting more ROPs could have significantly improved performance, and that discarding all FP64 capability (and even some CUs, if necessary) may have freed up enough transistor budget to make it worthwhile.
> 
> Maybe I'm wrong, or maybe AMD really wants a Fiji based FirePro...


I was actually thinking about this myself; this card would make an insanely beastly workstation GPU, but I haven't seen any new FirePros. Surely they need a new workstation card more than they need a consumer GPU. On a side note, I would like to see how fast this thing can hash.


----------



## geoxile

Quote:


> Originally Posted by *Blameless*
> 
> There are a fair number of tests where the Fury X is dramatically faster than a 285, some where it's nearly twice as fast.
> What ROP limited test was the 285 faster than the 290X?
> 
> The only tests I recall the 285 being faster in were geometry (tessellation and the like) limited ones.
> Not 100% better.
> 
> Even AMD's own figures put the high-end of the advantage at ~40%.
> It never is, but I strongly suspect fitting more ROPs could have significantly improved performance and that discarding all FP64 capability (and even some CUs, if necessary) may have freed up enough transistor budget to make it worthwhile.
> 
> Maybe I'm wrong, or maybe AMD really wants a Fiji based FirePro.
> Each CU has its own front-end and the ratio of shaders to CUs has not changed, though perhaps it should have.


http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/16

Pixel fill rate test. Not exactly ROP limited but related to pixel rendering. I'm not sure what test is explicitly ROP limited.


----------



## TechCrazy

Just got an email from Newegg showing prices on the Fury X.


----------



## Unkzilla

Went to the HardOCP review first - I always like their testing methodology and think they keep the best selection of games to benchmark (a good spread of popular older/newer releases, etc.)

Some of the comments are brutal

"its current implementation in the AMD Radeon Fury X leaves a lot to be desired. AMD's GPU program for the first time has truly reminded us of its CPU program."

"In terms of gaming performance, the AMD Radeon R9 Fury X seems like better competition for the GeForce GTX 980 4GB video card, rather than the GeForce GTX 980 Ti. GTX 980 cards are selling for as low as $490 today. This is not a good thing since the AMD Radeon R9 Fury X is priced at $649"

"Limited VRAM for a flagship $649 video card, sub-par gaming performance for the price, and limited display support options with no HDMI 2.0 and no DVI port. To be honest, we aren't entirely sure who the AMD Radeon R9 Fury X is really built for?"

Owch


----------



## carlhil2

Quote:


> Originally Posted by *FallenFaux*
> 
> 2900XT probably would have been a success too if it hadn't been for that pesky 8800GTX.


True. I guess what I'm trying to say is that AMD most likely didn't think the "cut" GM200 would show up until AFTER the Fury X release. Just guessing..


----------



## sugalumps

Quote:


> Originally Posted by *TechCrazy*
> 
> Just got a email from newegg showing prices on the furyX


Oh man, that one review on Newegg for the Sapphire version.

"Pros: It's easier to cook with than my R9 295x2. I used to have a big problem with my old card burning my meat. It was hard to get a nice pink and juicy center...........but this card CHANGED MY LIFE!!! Fine dining forever!!! This baby tops off at a cool 134 degrees F. Perfect for pork, chicken, beef, you name it!!!"


----------



## edo101

Look. Buy your 980 Tis and be done with it. Please.

Buy your Furys or not. If you have to have the best performance, like, within the next 1 millisecond, buy a 980 Ti... heck, buy two.

Just remember that at some point there will be a monopoly, and you guys voted for it. And also remember we are telling both teams that it is okay to pay thousands of dollars to get a 30 fps cinematic experience.

This squabble won't change anything. It's either wait for drivers or do things the OCN way and buy the other card immediately.


----------



## FallenFaux

Quote:


> Originally Posted by *carlhil2*
> 
> True, I guess what I am trying to say is that AMD most likely didn't think that the "cut" gm200 would show up til AFTER the Fury X release, just guessing..


Considering how close the 980 Ti is to the Titan X, I'm willing to bet the 980 Ti was supposed to be cut down more than it is. Nvidia willingly sabotaged their own Titan card to make sure the Fury X couldn't compete.

Edit:
Quote:


> Originally Posted by *edo101*
> 
> Look. Buy your 980 Tis and be done with it. Please.
> 
> Buy your Furys or not. If you have to have the best performance, like, within the next 1 millisecond, buy a 980 Ti... heck, buy two.
> 
> Just remember that at some point there will be a monopoly, and you guys voted for it. And also remember we are telling both teams that it is okay to pay thousands of dollars to get a 30 fps cinematic experience.
> 
> This squabble won't change anything. It's either wait for drivers or do things the OCN way and buy the other card immediately.


The market is self-correcting. If AMD doesn't make it, someone will buy them, Intel will step up to the plate and make a dedicated GPU, or any number of other outcomes. Any company that can't make a competitive product at a competitive price deserves to fail.


----------



## Forceman

Quote:


> Originally Posted by *edo101*
> 
> Look. Buy your 980 Tis and be done with it. Please.
> 
> Buy your Furys or not. If you have to have the best performance, like, within the next 1 millisecond, buy a 980 Ti... heck, buy two.
> 
> Just remember that at some point there will be a monopoly, and you guys voted for it. And also remember we are telling both teams that it is okay to pay thousands of dollars to get a 30 fps cinematic experience.
> 
> This squabble won't change anything. It's either wait for drivers or do things the OCN way and buy the other card immediately.


So it would be our fault for buying a superior card? Maybe AMD should just make a more compelling product if they want sales. Sorry, but I don't get the whole charity angle here - if AMD wants more market share, they should make better products, just like in every other industry. It's not our job to keep AMD in business.


----------



## Blameless

Quote:


> Originally Posted by *geoxile*
> 
> http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/16
> 
> Pixel fill rate test. Not exactly ROP limited but related to pixel rendering. I'm not sure what test is explicitly ROP limited.


Eh, a synthetic test, probably using an unusually compressible pattern/palette. No games show such results.
Quote:


> Originally Posted by *FallenFaux*
> 
> Surely they need a new workstation card more than they need a consumer GPU.


Possibly. Would need to double check their marketshare figures though. If it's too low, sacrificing the opportunity to gain/retain consumer marketshare for a significantly lesser amount of professional marketshare probably wouldn't be wise.
Quote:


> Originally Posted by *FallenFaux*
> 
> On a side note, I would like to see how fast this thing can hash.


Well, hashing doesn't have anything to do with ROPs, or anything but shader count and memory bandwidth/latency, so I'd expect it to do pretty well.
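As a rough sketch of why: on paper specs, the Fury X's advantage over a 290X for hashing-style workloads tracks shader throughput (compute-bound hashes) and raw bandwidth (memory-hard ones). The numbers below are published reference specs; actual hash rates depend entirely on the kernel and drivers:

```python
# Paper-spec ratios relevant to hashing: FP32 throughput for
# compute-bound hashes, raw memory bandwidth for memory-hard ones.
# Reference specs only; real hash rates depend on the kernel.
specs = {
    #           shaders  clock_mhz  bandwidth_gbps
    "Fury X":  (4096,    1050,      512),
    "R9 290X": (2816,    1000,      320),
}
tflops = {n: s * 2 * mhz / 1e6 for n, (s, mhz, _bw) in specs.items()}
bandwidth = {n: b for n, (_s, _mhz, b) in specs.items()}
print(f"compute ratio:   {tflops['Fury X'] / tflops['R9 290X']:.2f}x")
print(f"bandwidth ratio: {bandwidth['Fury X'] / bandwidth['R9 290X']:.2f}x")
```

Roughly 1.5x the compute and 1.6x the raw bandwidth of a 290X, so a healthy uplift either way if the premise holds.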


----------



## edo101

No. Intel may or may not go for it, and if they do, I'm not sure there are profits in it for them. It would make more sense to pack it in with their mobile platforms, just like Samsung.

You say the 980 Ti is superior, but this is based off of day-one drivers, and some reviews are even using the wrong drivers.

Even if it's not superior, you are buying a card that will give you the same visuals as console games, for a price that would be lower if the competitor had the resources to make a better card and spend on more programmers to make better drivers.

I'm exhausted at this point, really. Good luck to us all and PC gaming, because it's getting more and more expensive, and it's making some of us (or should I say me) reconsider even sticking with it, especially with recent developments.

Also, I forgot to mention a vendor that does false advertising, for starters... exhibit A: Arkham Knight GameWorks features.


----------



## svenge

Quote:


> Originally Posted by *Forceman*
> 
> So it would be our fault for buying a superior card? Maybe AMD should just make a more compelling product if they want sales. Sorry, but I dont get the whole charity angle here - if AMD wants more market share, make better products just like every other industry. It's not our job to keep AMD in business.


Without the charity angle, AMD's 22.5% likely goes down to 15%. And the less is said about their x86 side, the better...


----------



## hc_416

I didn't know reading was that hard. This card will be up against the 980 Ti; worst case, it trades blows. So they price it 50 dollars cheaper. Seems like a win to me. To top it off, the card is about as fast as a Titan X at 4K. This card was to make sure all the parts work, stick it to Nvidia with the new RAM, and be a stopgap. The real leap will come with HBM2 and the smaller process. The cards were late because of delays, and it is what it is. I would also like to see, when the new DX comes out, if there is a change in performance. Also, for the second generation in a row, they are using the dual-GPU card to fight the Titan. CrossFire scales better, so that will be a win for AMD. The way I look at it, you can get an AMD card faster and cheaper than Nvidia in any segment once the dual comes out. The real problem was the marketing. Once you actually look at what they did, it is compelling, to say the least.


----------



## Blameless

Fiji and GM200 make up a tiny percentage of market share for either company.

The real market share battles are waged in the 100-200 dollar segments.


----------



## FallenFaux

Quote:


> Originally Posted by *svenge*
> 
> Without the charity angle, AMD's 22.5% likely goes down to 15%. And the less is said about their x86 side, the better...


That issue isn't up to the consumer to fix.


----------



## ambientblue

So glad I got my 290X for $280 and then water-cooled it. Too expensive for such a small improvement. And look at the 390X prices.


----------



## magnek

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> I'd actually argue that GCN is still competitive with Maxwell in terms of performance. The 290X/390X is right there with the 970 and 980, and the Fury X is very close to both the 980 Ti and Titan X.
> 
> AMD's issue right now isn't performance; it's money problems and bad marketing. I agree that the future of AMD's GPUs is murky, but I wouldn't go all doom and gloom on them yet. Everyone counted AMD out when the 2900XT was released, and they bounced back tremendously from that disaster.
> 
> *On an unrelated note, how the heck did one of AMD's biggest supporters do a complete 180? I'm not trolling, I'm genuinely curious. It wasn't even 2 weeks ago that you were rooting them on, and now you've painted a picture of their future so dreary that a self-proclaimed Nvidia fanboy has to defend them.*


Also please do post more, I definitely enjoy reading your posts.


----------



## rx7racer

Quote:


> Originally Posted by *hc_416*
> 
> I didn't know reading was that hard. This card will be up against the 980 Ti; worst case, it trades blows. So they price it 50 dollars cheaper. Seems like a win to me. To top it off, the card is about as fast as a Titan X at 4K. This card was to make sure all the parts work, stick it to Nvidia with the new RAM, and be a stopgap. The real leap will come with HBM2 and the smaller process. The cards were late because of delays, and it is what it is. I would also like to see, when the new DX comes out, if there is a change in performance. Also, for the second generation in a row, they are using the dual-GPU card to fight the Titan. CrossFire scales better, so that will be a win for AMD. The way I look at it, you can get an AMD card faster and cheaper than Nvidia in any segment once the dual comes out. The real problem was the marketing. Once you actually look at what they did, it is compelling, to say the least.


If AMD had priced it $50 cheaper than the 980 Ti, I'd say more of us would consider it and appreciate the effort. But at $649.99 for either the Fury X or the 980 Ti (MSRP; both are selling for more at some retailers), it makes one second-guess and weigh the pluses and minuses.

I think that's what is hurting the Fury X most: it's not just one thing per se, it's the few small things stacked up, with only one true plus, which is HBM.


----------



## svenge

Quote:


> Originally Posted by *FallenFaux*
> 
> That issue isn't up to the consumer to fix.


Agreed. Sometimes the "invisible hand" of commerce gives a company the finger because of their poor products, but that's their problem.

Back to the charity angle, the concept of an AMD Telethon does amuse me greatly. But who would host it: Roy, or their "Gaming Scientist" (sic) Richard Huddy?


----------



## Sashimi

The PC market is shrinking. Let's face it, a desktop computer is no longer a need but a mere want in today's age. Mobile devices, be they laptops, tablets or simple smartphones, are taking over. We are a niche breed. Even with a monopoly, if Nvidia drives their video card prices any higher it will only push more people to move away and further shrink their market. As we know, they are not particularly strong in the mobile GPU department, so that won't do them any good.

That, and other companies would likely take over AMD should it fail.

In the world of business, charity is unnecessary. Supply and demand is what it's about.


----------



## gooface

I feel confident in my 980 Ti purchase now; they should have sold this at $600... not $650... HBM looks promising though, can't wait to see what it does in the future.


----------



## edo101

Quote:


> Originally Posted by *FallenFaux*
> 
> That issue isn't up to the consumer to fix.


Hmm, it might be a stretch of an analogy, but you know how we cut back on stuff or use alternatives to preserve species and raw materials? That's pretty much what I feel we need to do for the PC market.

Anyways, I am not here to tell y'all how to spend your money. Far be it from me.

I'm just worried about monopolies, and having started going into technical sales, more and more businesses are all about mobile platforms. It doesn't make sense for companies to go into the discrete GPU market.

Spend your money how you want to; it's just that it has repercussions. I'm also tired of the pointless arguments and wars that do nothing but bloat up threads and cover up good information. This is the internet; minds won't be changed either way.


----------



## leo5111

Quote:


> Originally Posted by *gooface*
> 
> I feel confident in my 980ti purchase now, they should of sold this at $600... not $650... HBM looks promising though, cant wait to see what it does in the future.


I'm sure as soon as they see sales not doing as well as they want, the price will drop some.


----------



## ambientblue

Quote:


> Originally Posted by *Sashimi*
> 
> The PC market is shrinking. Let's face it, a desktop computer is no longer a need but a mere want in today's age. Mobile devices, be they laptops, tablets or simple smartphones, are taking over. We are a niche breed. Even with a monopoly, if Nvidia drives their video card prices any higher it will only push more people to move away and further shrink their market. As we know, they are not particularly strong in the mobile GPU department, so that won't do them any good.
> 
> That, and other companies would likely take over AMD should it fail.
> 
> In the world of business, charity is unnecessary. Supply and demand is what it's about.


lol,

The only way the PC market is shrinking is for casual, day-to-day users. People that could benefit from an iPad. Like my parents. The gamers gunna game, game, game.. game game.


----------



## Attomsk

Quote:


> Originally Posted by *Sashimi*
> 
> The PC market is shrinking. Let's face it, a desktop computer is no longer a need but a mere want in today's age. Mobile devices, be they laptops, tablets or simple smartphones, are taking over. We are a niche breed. Even with a monopoly, if Nvidia drives their video card prices any higher it will only push more people to move away and further shrink their market. As we know, they are not particularly strong in the mobile GPU department, so that won't do them any good.
> 
> That, and other companies would likely take over AMD should it fail.
> 
> In the world of business, charity is unnecessary. Supply and demand is what it's about.


I don't believe the PC market is shrinking.


----------



## edo101

Quote:


> Originally Posted by *ambientblue*
> 
> lol,
> 
> The only way the PC market is shrinking is for casual, day-to-day users. People that could benefit from an iPad. Like my parents. The gamers gunna game, game, game.. game game.


I think way more people do LoL-type gaming than our type, and those games are becoming more and more popular. The hardcore PC market, which is what we do, is shrinking, I think. It won't help with expensive cards, and games like Arkham Knight, and console games looking just as good.

Anyways, do your thing, people.


----------



## Sashimi

Quote:


> Originally Posted by *ambientblue*
> 
> lol,
> 
> The only way the PC market is shrinking is for casual, day-to-day users. People that could benefit from an iPad. Like my parents. The gamers gunna game, game, game.. game game.


Gamers are now hit with more options. Tablet gaming, mobile gaming, console gaming etc. It's tunnel vision to think gaming is only limited to the PC.


----------



## edo101

Quote:


> Originally Posted by *Sashimi*
> 
> Gamers are now hit with more options. Tablet gaming, mobile gaming, console gaming etc. It's tunnel vision to think gaming is only limited to the PC.


Srsly... I don't know how anyone could think the PC market is not shrinking. Again, "PC market" meaning the types of people that do what we do.

Most people play LoL and do stuff that doesn't even need a $200 card.

Throw in the popularity of YouTube and game comparisons, and people just won't even bother with the expensive PC GPU market.

I'm getting into this stuff with my sales internships, and mobile is the future. That's why I am extra paranoid about monopolies.


----------



## Sashimi

Quote:


> Originally Posted by *edo101*
> 
> Srsly... I don't know how anyone could think the PC market is not shrinking. Again, "PC market" meaning the types of people that do what we do.
> 
> Most people play LoL and do stuff that doesn't even need a $200 card.
> 
> Throw in the popularity of YouTube and game comparisons, and people just won't even bother with the expensive PC GPU market.
> 
> I'm getting into this stuff with my sales internships, and mobile is the future


Absolutely. Back to Nvidia monopolizing the GPU market: I still believe it will adjust itself one way or another. As consumers we only need to look after our wallets and buy what's best for ourselves, nothing more.


----------



## blue1512

Fury X sales are quite good.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&bop=And&ActiveSearchResult=True&SrchInDesc=furyx&Page=1&PageSize=30
It's still a good card with a quality finish. A bit overpriced, but hey, it's new, right?


----------



## edo101

Quote:


> Originally Posted by *Sashimi*
> 
> Absolutely. Back to Nvidia monopolizing GPU market, I still believe it will adjust itself one way or another. As consumers we only need to look after our wallet and buy what's best for ourselves, nothing more.


That's the best case. I'm looking at how our cable companies operate here in the U.S. with monopolies, and we're all crying but there's nothing we can do about it. Nvidia wants max profits, as they should. This means they'll do just enough to stay out of trouble. If you thought they were shady now, wait till they're the only player.


----------



## ambientblue

Quote:


> Originally Posted by *Sashimi*
> 
> Gamers are now hit with more options. Tablet gaming, mobile gaming, console gaming etc. It's tunnel vision to think gaming is only limited to the PC.


We've always had console gaming, and handheld gaming has been around for a long time. Handheld consoles are what's dying due to mobile; the mobile market is expanding, but it isn't going to cannibalize the true gaming markets like console and PC.


----------



## Sashimi

Quote:


> Originally Posted by *edo101*
> 
> That's the best case. I'm looking at how our cable companies operate here in the U.S. with monopolies, and we're all crying but there's nothing we can do about it. Nvidia wants max profits, as they should. This means they'll do just enough to stay out of trouble. If you thought they were shady now, wait till they're the only player.


Don't forget, we also have the choice to move away from PCs. Or better yet, if they set their prices too high, that only means there will be room for new entrants to stir up the market. An AMD successor, perhaps.

Without looking too far ahead, all AMD has to do is reduce the Fury X price to $600 and it will be a perfectly competitive card in today's market. Surely they would rather do that than go down.

Additionally, the cable business requires a lot of infrastructure that acts as a barrier to new entrants; the chip market doesn't have such a barrier.


----------



## edo101

Quote:


> Originally Posted by *ambientblue*
> 
> We've always had console gaming and handheld gaming has been around for a long time. Handheld consoles are what is dying due to mobile and the market for mobile is expanding but it isn't going to cannibalize the true gaming markets like console and PC.


Keep hoping.

I could count off for you how many devs that used to be PC-exclusive have now moved to multi-console development. What do you think happens in the future?

Another example: I've had more fun playing Gwent (a card game within The Witcher 3) than I have playing Arkham Knight. Put that game on mobile and it catches wind... what happens then?

Or some have already been bought out by big console players... remember Epic Games?


----------



## Chargeit

Quote:


> Originally Posted by *blue1512*
> 
> FuryX sale is quite good.
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&bop=And&ActiveSearchResult=True&SrchInDesc=furyx&Page=1&PageSize=30
> It's still a good card with quality finish. A bit overpriced but hey, it's new, right?


Yeah, AMD is getting what early sales it can at that price. Once those buyers dry up, I'd expect the card to drop in price to match its performance and the company's image. I'd love to see that drive down Nvidia's prices myself.


----------



## Exilon

Quote:


> Originally Posted by *Chargeit*
> 
> Yea, AMD is getting the early sales it can at that price. Once those buyers dry up, I'd expect the card to drop in price to match its performance/company image. Would love to see that drive down Nvidia's prices myself.


It's not even that. The supply is paper thin. ShopBLT is back-ordered until mid-July, and they're only getting 15 from Sapphire. Meanwhile, EVGA is shipping several hundred units of 980 Ti variants by the first week of July.

It's looking like the low-supply rumor was right on the money, just like the slower-than-980 Ti rumor.


----------



## Master__Shake

This is just like the 6970 release, except back then AMD released the slower card at a lower price than the flagship it was supposed to compete with.

This time... geez.


----------



## DividebyZERO

Quote:


> Originally Posted by *edo101*
> 
> keep hoping.
> 
> I'll just count to you how many devs that used to be PC exclusive that have now started doing multi console developments. What do you think happens in the future.
> 
> Another example, I have had more fun playing Gwent ( a card game within the witcher 3) than I have had playing Arkham Knight. Put that game on mobile and it catches wind...what happens then?
> 
> Or have been bought out by big console devs...remember Epic games?


This is only the beginning; wait until the web is filled with "mobile version"-only websites. The new normal will be sites with very little information and few features, just enough for casual tablet and cell phone users.


----------



## bmgjet

And with this I'm back to an Nvidia card for my main rig.
I wanted to support AMD, but they gave me no choice.

Lack of DVI was nail 1 in the coffin.
Being slower than the 980 Ti some of the time was nail 2.
Bad overclocking was nail 3.

The really stupid prices here in NZ were the final nail.
The Fury X is $1,585.68 NZD (RRP $1,139).
The 980 Ti SC is $1,206.90 NZD (RRP $1,150), and the price just rose to $1,249.99 in most stores after the bad Fury X launch. Luckily I already had a 980 Ti in my cart, so I managed to get it at the lower price.


----------



## SpeedyVT

I can't believe people would still defend NVidia after the Batman fiasco.


----------



## edo101

http://www.overclock.net/t/1561984/digitimes-ecs-to-leave-retail-diy-motherboard-business#post_24088216

Remember what I was just talking about? The PC market shrinking and whatnot?
Quote:


> Originally Posted by *DividebyZERO*
> 
> This is only the beginning, wait until the web is filled with "mobile version" only websites. The new normal will be websites with very little information and features. They will have just enough for those casual tablet and cell phone users.


It makes sense. Look at how often people upgrade their phones.
And fewer and fewer people are buying hardware, btw. Cloud is the next hardware; even big corporations are doing it.


----------



## edo101

I imagine Intel would be salivating to buy out AMD just so they could get their processors into cellphones and be part of the upgrade cycle. Do you know how much more money that is than our little discrete GPU market?

Hell, I might just apply for their tech sales job, or a business strategy job.
Quote:


> Originally Posted by *SpeedyVT*
> 
> I can't believe people would still defend NVidia after the Batman fiasco.


You're starting a war nobody wants, and poking at things people don't want to look at.


----------



## Sashimi

Lol, at the moment I do most of my gaming on my phone. I do my browsing mainly on my phone too, and if I need the PC versions of websites I switch to my 5-year-old laptop. My desktop PC gets used no more than 3 days a week, for about 3 hours each day.

Still, I build and tweak and upgrade because it's a hobby. But I'm a dying breed. If I run into financial trouble, the first thing cut from my budget will be my spending on PC hardware.

I'm sure I'm not the only one doing this. That's the reality of it.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> Lol at the moment I do most my gaming on my phone..


That's just sad.

PC is losing gamers to mobile phones now?


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> That's just sad.
> 
> PC is losing gamers to mobile phones now?


We've been losing gamers to everything else, including mobile phones. It's just that we've all been too tunnel-visioned with our fanboy wars to see the bigger market. A lot of people have said it already... we are a minority.

It makes me sad to see how bad AMD's marketing and business strategy team has been, because there's some stuff I swear my little 14-year-old sister could do better.

I feel sorry for their overworked, talented engineering team.


----------



## Casey Ryback

Quote:


> Originally Posted by *edo101*
> 
> We've been losing gamers to everything else including mobile phones. Its just we have all been to tunnel visioned with our fanboy wars about the bigger market. A lot of people have said it already...we are a minority


Well, then people are happy with low-quality games. Minecraft is a great example of this.

If people don't want nice graphics, then PC gaming is doomed.

Gaming used to be about graphical improvements and being amazed; now it's about farming, overpriced DLC, and in-game items.


----------



## Xuper

Guys, can you check http://wccftech.com/? It looks like it's down. I saw 5.2K comments on their AMD Fury X review article.


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> Well then people are happy with low quality games. Minecraft is a great example of this.
> 
> If people don't want nice graphics then pc gaming is doomed.
> 
> Gaming used to be about graphical improvements and being amazed, now it's about farming and overpriced DLC and in game items.


To you, low graphics means bad quality; to the majority, it means simple, social, and cheap.

It's much easier to bill people 5 bucks at a time than it is to bill them 60 bucks at a time.

Don't even talk to them about paying 1,000 bucks for a graphics card every other year, and then tweaking it.


----------



## oced

Quote:


> Originally Posted by *maarten12100*
> 
> I hope otherwise it is time to climb in the pen and write some angry but polite emails to AMD which they will once again not respond to. About how they pictured an unreal image of "*an overclockers dream*"


Technically a nightmare is still a dream.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Blameless*
> 
> You have been confusing two completely different questions:
> 
> _When_ is 4GiB not enough?
> 
> and
> 
> What _happens_ when 4GiB is not enough?
> 
> Answer to the former is pretty much up in the air, at least when it comes to games. I'm not quite sure how one would go about reliably ensuring that more than 4GiB were required to render a given scene so it could be benched. This used to be easy to test simply by making frame buffers exceed VRAM size, but resolutions have not scaled any where near as fast as VRAM capacity.
> 
> Answer to the latter is a given and it's exactly the same answer as to the question of what happens when you run out of system memory, or space on your SSD, or pages in a book, or room in a fuel tank, or anything else...the bottleneck stops becoming how quick the first collection of stuff is and becomes how fast can you swap things to/from the first collection from some other source.
> GCN has definitely improved in efficiency. HBM may be responsible for respectable power savings, but even the GDDR5 memory subsystem was only a fraction of total board power in prior parts, with the GPU itself being the bulk.
> 3000, 4000, and 5000 were all VLIW5 and likely had no more differences in architecture than the different iterations of GCN.
> 
> I don't think there is anything wrong with GCN 1.2. I think Fury has two main issues:
> 
> 1. Too few ROPs.
> 
> 2. Too much FP64.


But the former is really my question. Everybody knows that if a game has to exceed the frame buffer to render a scene, fps will grind to a halt as data is swapped to system memory. My point is that the massive bandwidth of HBM could mean system memory isn't needed to render scenes where it would be with a slower memory pipeline. The only way to know is to find a game that actually forces a GDDR5 card into system memory and then test the identical scenario on HBM and see if the same thing happens. From the way I'm reading the data, though, it seems to be a moot point anyway, because by the time you reach settings and resolutions that actually spill past VRAM, the GPU itself is already insufficient to deliver playable frame rates (on Titan, 980 Ti, and Fury alike).
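The swap-cost argument above can be put in rough numbers. Here's a back-of-envelope sketch; the PCIe bandwidth and the per-frame overflow amount are illustrative assumptions, not measurements:

```python
# Back-of-envelope: extra frame time when a scene's working set exceeds
# VRAM and the overflow must stream over the PCIe link each frame.
# Bandwidth figures are assumptions for illustration, not measured values.

PCIE3_X16_GBPS = 16.0   # ~16 GB/s assumed practical PCIe 3.0 x16 bandwidth
HBM_GBPS = 512.0        # Fury X HBM peak bandwidth

def extra_frame_ms(overflow_gib: float, link_gbps: float = PCIE3_X16_GBPS) -> float:
    """Milliseconds added per frame if `overflow_gib` GiB must cross the link."""
    overflow_gb = overflow_gib * 1.073741824  # GiB -> GB
    return overflow_gb / link_gbps * 1000.0

# Spilling even 0.5 GiB per frame over PCIe adds ~33.6 ms, double a
# 60 fps frame budget (16.7 ms) -- which is why fps "grinds to a halt".
print(round(extra_frame_ms(0.5), 1))             # → 33.6
print(round(extra_frame_ms(0.5, HBM_GBPS), 1))   # → 1.0 if it stayed in HBM
```

The takeaway matching the post: once the working set spills past VRAM, the PCIe link, not the VRAM speed, dominates, so HBM's bandwidth can't rescue an over-capacity scene.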


----------



## Apokalipse

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *edo101*
> 
> keep hoping.
> 
> I'll just count to you how many devs that used to be PC exclusive that have now started doing multi console developments. What do you think happens in the future.
> 
> Another example, I have had more fun playing Gwent ( a card game within the witcher 3) than I have had playing Arkham Knight. Put that game on mobile and it catches wind...what happens then?
> 
> Or have been bought out by big console devs...remember Epic games?
> 
> 
> 
> This is only the beginning, wait until the web is filled with "mobile version" only websites. The new normal will be websites with very little information and features. They will have just enough for those casual tablet and cell phone users.

This is already starting to happen with those stupid flat Windows 8 metro-like websites.


----------



## Klocek001

Pairing it up against the 980 Ti G1 makes it look even more embarrassing. The G1 is already up to 10 fps faster than the reference 980 Ti, with a ~1350 MHz boost clock out of the box, and people run it at 1450-1500 MHz on stock vcore.
The only reservation I have about the 980 Ti is that we don't know DX12 performance yet, and that's not the sort of cash I want to spend on a card I'd upgrade after 12 months; I want to keep it for 2-3 years.


----------



## ambientblue

Quote:


> Originally Posted by *edo101*
> 
> keep hoping.
> 
> I'll just count to you how many devs that used to be PC exclusive that have now started doing multi console developments. What do you think happens in the future.
> 
> Another example, I have had more fun playing Gwent ( a card game within the witcher 3) than I have had playing Arkham Knight. Put that game on mobile and it catches wind...what happens then?
> 
> Or have been bought out by big console devs...remember Epic games?


Hoping? There is NO EVIDENCE to the contrary.

Like I said, CONSOLE and PC gaming aren't going anywhere. You're delusional. You think people are going to keep buying into this micro-transactional, pay-to-win crap forever? Sure, the people who weren't even gamers before this option existed might, but not actual gamers, people who dedicate an hour-plus a day to real games with depth instead of spending 5 minutes here and there at work. Get a grip!


----------



## Fickle Pickle

I was going to get this to replace my liquid-cooled 290X, but I might wait until drivers improve it. Driver maturity has given the 290X enough life to compete with a 980, so there's that. I still game at 1080p, but some newer games are really pushing my 290X even at 1080p.

Although, I am also considering a 980 Ti.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> Well then people are happy with low quality games. Minecraft is a great example of this.
> 
> If people don't want nice graphics then pc gaming is doomed.
> 
> Gaming used to be about graphical improvements and being amazed, now it's about farming and overpriced DLC and in game items.


You mean to tell me I have to pay 60 bucks for the game and then another 60 for a 6-month season pass for additional content? Count me in!!!...


----------



## Rei86

Quote:


> Originally Posted by *edo101*
> 
> Look. Buy your 980 Tis and be done with it. Please.
> 
> Buy your Furys or not. If you have to have the best performance like within the next 1 millisecond, by a 980 Ti...heck buy two.
> 
> Just remember that at some point it there will be a monopoly and you guys voted for it. And also remember we are telling both teams that it is okay to pay thousands of dollars to get 30 fps cinematic.
> 
> This squabble won't change anything. Its either wait for drivers or do things the OCN way and buy the other card immediately.


But I didn't buy a 980 Ti or a Titan X.

I patiently waited for this card to come out to replace my current GPUs. Why? Because of the hype behind it, and my belief that AMD would and could pull off that Hail Mary.

But oh well, the fail train again.


----------



## edo101

Quote:


> Originally Posted by *ambientblue*
> 
> Hoping? there is NO EVIDENCE to prove otherwise.
> 
> Like I said CONSOLE and PC gaming isn't going anywhere. Youre delusional. You think people are going to keep buying into this micro-transactional pay-to-win crap forever? Sure the people who weren't even gamers prior to this option might, but not people who are actual gamers, people who spend an hour plus a day dedicated to real games with depth instead of spending 5 minutes here and there at work. Get a grip!


I hope you're right about the PC thing. My work experience and recent developments say otherwise, but I hope you're right. Consoles, for sure, are staying for the foreseeable future; they've got Sony, Nintendo, and MS.

Goodnight. I need to make a vow to stay out of this for the summer and minimize my time at OCN. Don't kill each other now.


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> You mean to tell me that I have to pay 60 bucks for the game and then another 60 for a 6month season pass for additional content? Count me in!!!...


This is the main problem with PC gaming. We're also spoiled for choice in games... but 95% of them are crap and gouge you for DLC.

Afaik those are just the pre-order DLCs, not extra areas/missions that will come later.

Honestly, there's only one game I'm even waiting for, and that's Star Citizen. Even Battlefront II will probably be a fail.

The other problem is that devs are struggling to find a balance between addictive gameplay and awesome graphics. You could find a game you love the look of, but in no time you're back to playing whatever game you're addicted to, with graphics probably worse than the Xbox 360's. That's because games lack depth, they come out in early access, and of course there's DLC.

As soon as I saw the DLC list I'd written off Batman, and that's before I heard of any performance issues.


----------



## Kokin

Lots of harsh opinions in the last few pages... let's change that with something people can smile about.

EK's single-slot prototype. Glad to see their split-flow design on an AMD block, and it will be coming in 4 varieties right off the bat.


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> This is the main problem with pc gaming. Also spoiled for choice in games.....but 95% of them are crap, and gouge you for DLC.
> 
> Afaik that is just the pre-order DLC's, not extra areas/missions that will come later.
> 
> Honestly there is only one game that I'm even waiting for and that's star citizen. Even battlefront II will probably be a fail.
> 
> The other problem is devs are struggling to find a balance between addictive gameplay and awesome graphics.
> 
> You could find a game you love the look of, *but in no time you're back to playing whatever game you're addicted to with graphics probably worse than xbox 360.*
> 
> This because games lack depth, they come in early access, and of course DLC.
> 
> As soon as I saw the DLC list I had written off batman and that's before i heard of any performance issues.


Broke my vow already, but yep, that's why mobile and everything else is working and threatening PC gaming. People still play Counter-Strike in droves. People play MMOs en masse.

Alright, gotta leave for my own good and minimize this OCN time. But yeah, this is why we need a duopoly, and AMD getting bought out isn't likely to make things better, because of mobile (which covers laptops, tablets, phones, phablets). Everybody wants in on the latest cash cow.


----------



## edo101

Quote:


> Originally Posted by *Kokin*
> 
> Lots of harsh opinions in the last few pages... let's change that with something people can smile about.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EK's single-slot prototype. Glad to see their split-flow design on an AMD block and it will be coming in 4 varieties right off the bat.


I only did it to try to minimize all the fanboy warring going on. That, too, is negative, and I'd argue more corrosive to our beloved PC gaming.


----------



## Sashimi

Quote:


> Originally Posted by *ambientblue*
> 
> Hoping? there is NO EVIDENCE to prove otherwise.
> 
> Like I said CONSOLE and PC gaming isn't going anywhere. Youre delusional. You think people are going to keep buying into this micro-transactional pay-to-win crap forever? Sure the people who weren't even gamers prior to this option might, but not people who are actual gamers, people who spend an hour plus a day dedicated to real games with depth instead of spending 5 minutes here and there at work. Get a grip!


Unless someone is a PC enthusiast like myself, who is going to build a PC for $2k only to be able to enjoy games when they're at home with nothing else to do? Graphics are the biggest benefit of PC gaming, but when cost is thrown into the equation, it's not hard to see why people would opt to go without nice graphics.

Back in the day, a desktop was a must-have. People chatted on the desktop, browsed on the desktop, listened to music on the desktop. The cost of the PC was a sunk cost people had to incur regardless; a better GPU was an optional extra they paid for to enjoy good-looking PC games. Nowadays, desktops themselves are optional. Every component becomes an additional cost you must pay to enjoy those awesome PC game graphics.

If you want evidence, a simple Google search will net you plenty of articles outlining how and why the PC market is shrinking.

As for non-gamers, people nowadays have the option to go without a GPU at all; CPUs themselves have enough rendering power for everything out there except games.

My point is that Nvidia will doom themselves if they get too greedy and drive prices too high, because they are no longer playing in an industry that's in demand.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> My point is Nvidia will doom themselves if they get too greedy and drive prices too high because they are no longer playing in an industry in demand.


Or they'll stay alive in a smaller industry by charging more, depends how you look at it.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> This is the main problem with pc gaming. Also spoiled for choice in games.....but 95% of them are crap, and gouge you for DLC.
> 
> 
> 
> Afaik that is just the pre-order DLC's, not extra areas/missions that will come later.
> 
> Honestly there is only one game that I'm even waiting for and that's star citizen. Even battlefront II will probably be a fail.
> 
> The other problem is devs are struggling to find a balance between addictive gameplay and awesome graphics.
> 
> You could find a game you love the look of, but in no time you're back to playing whatever game you're addicted to with graphics probably worse than xbox 360.
> 
> This because games lack depth, they come in early access, and of course DLC.
> 
> As soon as I saw the DLC list I had written off batman and that's before i heard of any performance issues.


I agree; it's ridiculous that most consumers don't care and just throw money at these piss-poor companies that try to develop games.

That's the big reason I didn't play any of the Batman games: all the DLC content. I'm starting to love the smaller indie games that look great, one of them being a game called Squad by Offworld Industries.

If there's one thing that excites me right now other than the big-name games coming out this holiday season, it's seeing Star Fox and Zelda on Nintendo again.


----------



## error-id10t

Quote:


> Originally Posted by *Alatar*
> 
> And as for the argument about the Ti, you can make the same argument both ways. But honestly everyone who knows anything about GPUs saw the 980Ti coming at exactly the price it came out at. Happens every time with NV GPUs, competition or no competition. Slightly cut down GPU slightly after the full GPU at a much better price/perf ratio.


Lol, are you kidding me? This forum was full of people who couldn't even accept that another Ti was coming; they were so against it that threads were ruined by their naysaying. I guess that says a lot about the type of people we have here...

This Fury X bashing is the same thing with a different smell. No bench I've seen shows any 4 GB VRAM problem, but somehow that's a major flaw. Everyone knows AMD has "trouble" with their drivers, yet people have written off the whole generation because it's "only" trading punches on day-one release drivers against drivers that have had months of maturity on the other side.


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> I agree, it is stupid ridiculous that most consumers dont care and just throw money at these piss poor companies that try and develop games.
> 
> That was the big reason why I did not play any of the batman, because of all the DLC content. I am starting to love the smaller indie games that look great. One of the being a game called Squad by Offworld Industries.
> 
> If there was one thing that excites me right now other than the big name games coming out this holiday season, it would be seeing star fox and zelda on Nintendo again.


Yep, games with heaps of DLC are only worth waiting for in a cheap game bundle or a Steam sale.

See, you've found a good game, but how does everyone else even find that title? It's like a needle in a haystack amongst so many indie games, early-access titles, alphas, betas, and who knows what else.

The number of games in the Steam sales just confused me, and I ended up buying nothing.

Good old Star Fox and Zelda, lol. I almost feel like buying a cheap Nintendo and going back to the roots of gaming. The games themselves will probably look a little childish, but they'll play like a dream.

I think Steam, for one, needs to dump this whole early-access thing and just release games that are finished; to me it's really affected the market.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> Or they'll stay alive in a smaller industry by charging more, depends how you look at it.


You're not wrong there. I only think that if one has a monopoly over a shrinking industry, the sensible thing from a business perspective is to try not to speed up its demise. Keeping prices from skyrocketing also deters new entrants.


----------



## edo101

Quote:


> Originally Posted by *Sashimi*
> 
> You are not wrong there. I only think if one has monopoly over an industry that is shrinking, the sensible thing from a business perspective is to do is try not to speed up its demise. Keeping prices from sky rocketing also prevents new entrants.


We've helped them, and we continue to help them by buying. Vicious cycle.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yep, games with heaps of DLC are only worth waiting for in a cheap game bundle or steam sale
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See you've found a good game, but how does everyone else even find that title? It's like a needle in a haystack amongst so many indie games, early access, alphas, betas and who knows what else.
> 
> The amount of games on the steam sales just confused me and I ended up buying nothing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good old starfox and zelda lol, I almost feel like buying a cheap nintendo and going back to the roots of gaming, they'll probably look a little childish the games themselves, but will play like a dream
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think steam for one needs to dump this whole early access thing and just release games that are finished, to me it's really affected the market.


Haha, and what do you think I did during the Steam sale? I got the first 3 Assassin's Creed games with all their content, as well as a couple of others.

It is difficult, but getting Steam updates, browsing gaming forums, and looking at Kickstarter are the best ways I know of to find out about these games. Oh, and of course watching your favorite live stream of E3 each year. I have a list of 30 games I want to play; obviously I won't be able to play all of them, but I'll narrow it down and only play those.

I have the original Nintendo and the Super, which I should get out tomorrow and play a bit. The best Star Fox was Star Fox 64, and then there's Zelda: Ocarina of Time.

If they dump the early-access thing, then those of us who love being a part of the development phase have to wait... Also, it's a way to get Greenlit, I believe. I do think there are way too many games, but the ones that have been Greenlit are the ones that get put onto Steam.


----------



## Sashimi

Quote:


> Originally Posted by *edo101*
> 
> we've helped them and continued to help them by buying. Vicious cycle.


Lol, dang!

Thought you'd gone off.


----------



## Majin SSJ Eric

OoT is still the best game ever made IMO.


----------



## magicc8ball

I will be buying the Wii U to play the new Zelda and Star Fox for sure. There will be a lot of others too, I think.


----------



## toMsons1987

Quote:


> Originally Posted by *Sashimi*
> 
> PC market is shrinking. Let's face it, desktop computer is no longer a need but a mere want in today's age. Mobile devices be it laptops, tablets or simple smart phones are taking over. We are a niche breed. Even when monopolised, if nvidia drives their video card prices any higher it will only discourage more people from move away and further shrink their market. As we know they are not particularly strong in the mobile GPU department so that won't do them any good.
> 
> That and also other companies will likely to take over AMD should they fail.
> 
> In the world of business charity is unnecessary. Supply and demand is what it's about.


You say this, but PC gaming right now is by far the biggest it has ever been. DOTA, LoL, and CS:GO on their own have grown this community to new heights. People thought the PC gaming market was dying, and then, bam! We've been on the rise for a while now. The mobile market is growing so fast because of how many gizmos there are. You have to look at it from the correct perspective. You are making **** up when you talk about the PC market shrinking; the facts point otherwise. In fact, mobile markets have been shown to be HELPING the PC market grow, because of things like Twitch and other media.

Stop throwing out *scare* words without proof.


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> Haha and what do you think I did during the steam sale? I got the first 3 Assassins Creed games with all content as well as a couple others.
> 
> If they dump the early access thing then those like myself that love being a part of the development phase have to wait... Also it is a way to get it Green light I believe. I do think there are way to many games but the ones that have been Green Lighted are the ones that get put onto steam and all.


I think I bought The Witcher 2 Enhanced Edition in the previous sale.

The problem with early access is that people buy the games, put in maybe 10-20 hours, and then never play them again; I see it happening often through my friends list.

It would be hard being a dev for PC games in this day and age, I imagine.


----------



## Casey Ryback

Quote:


> Originally Posted by *toMsons1987*
> 
> You say this, but PC gaming right now is by far the biggest it has ever been. DOTA, LOL, and CSGO just on their own have grew this community to new heights. People have thought PC gaming market has been dying, and then, bam! We've been on a rise for a while now. Mobile market is growing so fast because of how many gizmo's there are. You have to look at it in a correct perspective. But you are making **** up when you talk about the PC market shrinking. Facts point otherwise. In fact, mobile markets have shown to be HELPING the PC market grow, because of things like Twitch and other media.
> 
> Stop throwing *scare* words out without proof.


But for people who don't play CS:GO, LoL, and DOTA, the communities on other PC games seem to be shrinking.

Those games can also be played on pretty mediocre PCs, so they don't help advance hardware technology.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> I think I bought the witcher 2 advanced edition in the previous sales.
> 
> The problem with the early access is people buy them and put in probably 10-20 hours then they never play them again, I see it happening often through my friends list.
> 
> It would be hard being a dev for pc games in this day and age I imagine.


Well, I know the new policy that allows you to return games on Steam is hurting those smaller indie developers... which is a catch-22 in my book.


----------



## Wishmaker

Quote:


> Originally Posted by *toMsons1987*
> 
> You say this, but PC gaming right now is by far the biggest it has ever been. DOTA, LOL, and CSGO just on their own have grew this community to new heights. People have thought PC gaming market has been dying, and then, bam! We've been on a rise for a while now. Mobile market is growing so fast because of how many gizmo's there are. You have to look at it in a correct perspective. But you are making **** up when you talk about the PC market shrinking. Facts point otherwise. In fact, mobile markets have shown to be HELPING the PC market grow, because of things like Twitch and other media.
> 
> Stop throwing *scare* words out without proof.


...and that is the reason every developer is making such good games for the master race because the PC market is by far the biggest it has ever been. Whatever helps you sleep at night, we in the real world, we will facepalm when a new console port is released for the ever growing market you mentioned









----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> Well, I know the new policy that allows you to return games on Steam is hurting those smaller indie developers... which is a catch-22 in my book.


Agreed, and who actually needs a refund on a $5-$10 game?

When you pay $60+ for a polished turd from a major company that gouges for DLC and doesn't even hit the mark for enjoyable, addictive gameplay, that's when the refund policy is good.

I just wish there were more AAA titles that I could say are worth the money, made with no deadlines to churn out games and no complex DLC marketing strategies.

I guess that's why I have faith in Star Citizen; the devs and creator seem to really care about the game and won't be rushing to release it, afaik.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> Those games can also be played on pretty mediocre PCs, so they don't help advance hardware technology.


That's exactly it. My point was never about gaming alone but about the PC, specifically the desktop PC, and how it will affect video card prices.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> But for people who don't play CSGO, LOL and DOTA, the communities in other PC games seem to be shrinking.
> 
> Those games can also be played on pretty mediocre PCs, so they don't help advance hardware technology.


Quote:


> Originally Posted by *Casey Ryback*
> 
> Agreed, and who actually needs a refund on a $5-$10 game?
> 
> When you pay $60+ for a polished turd from a major company that gouges for DLC and doesn't even hit the mark for enjoyable, addictive gameplay, that's when the refund policy is good.
> 
> I just wish there were more AAA titles that I could say are worth the money, made with no deadlines to churn out games and no complex DLC marketing strategies.
> 
> I guess that's why I have faith in Star Citizen; the devs and creator seem to really care about the game and won't be rushing to release it, afaik.


Yeah, I know, right?

I regret ever buying BF: Hardline and I am glad I do not hold Premium. A little money saved, I guess...

There are still good game devs out there, but not always on the game that suits you...


----------



## Am3oo

I can see the Fury X is about 20-25 euros cheaper here in Romania compared to a reference 980 Ti design. The local pricing seems fair, if you forget the fact that the price is about $800 by the time it gets here.


----------



## DiceAir

So I'm clueless, guys. I'm running the following:

R9 280X Crossfire and a QNIX QX2710 2560x1440 @ 96 Hz. I'm unsure if I should keep these GPUs, upgrade the monitor and then move to a Fury X, or just get a 980 Ti while keeping the monitor. A local PC shop has an amazing deal on the Galax 980 Ti HOF.


----------



## Standards

This year is probably a poor time to upgrade from 280X Crossfire, which is still pretty good.


----------



## DiceAir

Quote:


> Originally Posted by *Standards*
> 
> This year is probably a poor time to upgrade from 280X Crossfire, which is still pretty good.


That's what I was thinking. The only thing is that if I don't take the 980 Ti HOF right now I might lose out on roughly $160, but if I do and it's not worth it, I might lose out on about $850, and that's a lot of money for a sidegrade. Then there's the DX12 question: people at Guru3D have tested that even in some DX11 titles we see improved performance on Win 10, at least for AMD. So yeah, I might lose out on this deal, but maybe I should rather spend the extra money and be sure that it's better, or when Win 10 and DX12 titles come out I might not even have to upgrade.


----------



## elect

Quote:


> Originally Posted by *flash2021*
> 
> Someone come get me when there are Win 10 / DX12 benches. In about a month (for Win 10, right?), we'll see the Fury X's true colors... whether good or bad (and I'm assuming AMD will have good DX12 drivers ready for day 1).


Quote:


> Originally Posted by *Apexii22*
> 
> I think we should wait until DX12. AMD said DX12 is a game changer for their products. Fury X might pull ahead of Titan X when DX12 is available, who knows?
> 
> Hopefully, it is time we see AMD on top for once.


Indeed, DX12 looks quite promising:

https://twitter.com/killyourfm/status/611579128941129728?s=09



Probably DX11 overhead is hiding its potential.


----------



## dantoddd

Hmm....

So roughly the performance of a GTX 980 Ti. I need to see both cards' OC performance.


----------



## DiceAir

Quote:


> Originally Posted by *Apexii22*
> 
> I think we should wait until DX12. AMD said DX12 is a game changer for their products. Fury X might pull ahead of Titan X when DX12 is available, who knows?
> 
> Hopefully, it is time we see AMD on top for once.


I just get the feeling that the reason for the bad Win 8.1 drivers is that they're getting them ready for Win 10 and DX12. I saw on the Guru3D forums and on YouTube that Win 10, in Project Cars at least, gives you much better performance using a Win 10 driver than Win 8.1 does using the same driver modified; it's even faster with the Win 10 driver on Win 10 than with the official driver on Win 8.1. So we're definitely not seeing the full potential of the Fury X. I know you can't really compare draw calls between GPUs because they don't show actual performance, but they show something; there are many other things to factor in. Anyway, I agree with the reply that was given when I asked whether I should upgrade to a 980 Ti or not: I think I and many others should wait for DX12 at least, unless you have a very old system and can't wait any longer. Without knowing true Win 10 and DX12 performance you can't really say which way to go. What if I upgrade to a 980 Ti and the Fury X turns out to be so much faster, with Crossfire scaling 50% faster than Titan X, plus less heat?


----------



## Casey Ryback

I'd sit tight as you have a pretty powerful setup there anyway, and we really are at a point of unknown as far as DX12 is concerned, along with the new memory architecture and die shrinks.


----------



## Aelius

Quote:


> Originally Posted by *DividebyZERO*
> 
> This is only the beginning, wait until the web is filled with "mobile version" only websites. The new normal will be websites with very little information and features. They will have just enough for those casual tablet and cell phone users.


"Wait?" Wait? I've been noticing this infuriating mobile web trend for at least a couple years now. Though you're right that it's going to get much worse. What's so baffling is that no one seems to care that websites are becoming more and more barren and enlarged and child-like.


----------



## hawker-gb

DX12 is a game changer:

http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html?start=21


----------



## i7monkey

This launch was entirely ruined by AMD's marketing team and the decision to charge $649 for it. Period.

Hype and MSRP are what ruined this card, because it's an otherwise decent card. Fury X should cost $499.


----------



## i7monkey

Nvidia's tricky; is it possible they leaked false benchmarks of Fury making it look good, so the launch would let everyone down even more and be ruined? I'm sure this tactic has been used before.


----------



## error-id10t

Quote:


> Originally Posted by *i7monkey*
> 
> Fury X should cost $499.


lol, what? Seriously, some people here are delusional.


----------



## Newbie2009

Quote:


> Originally Posted by *hawker-gb*
> 
> DX12 is a game changer:
> 
> http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html?start=21


Nvidia probably have just not optimised their drivers for DX12 yet, as it's not out.


----------



## flopper

Quote:


> Originally Posted by *Newbie2009*
> 
> Nvidia probably have just not optimised their drivers for DX12 yet, as it's not out.


You're kidding, right?


----------



## Newbie2009

Quote:


> Originally Posted by *flopper*
> 
> your kidding right?


About? DX12 won't go mass market until the end of next month.


----------



## flopper

Quote:


> Originally Posted by *DiceAir*
> 
> I just get the feeling that the reason for the bad Win 8.1 drivers is that they're getting them ready for Win 10 and DX12. I saw on the Guru3D forums and on YouTube that Win 10, in Project Cars at least, gives you much better performance using a Win 10 driver than Win 8.1 does using the same driver modified; it's even faster with the Win 10 driver on Win 10 than with the official driver on Win 8.1. So we're definitely not seeing the full potential of the Fury X. I know you can't really compare draw calls between GPUs because they don't show actual performance, but they show something; there are many other things to factor in. Anyway, I agree with the reply that was given when I asked whether I should upgrade to a 980 Ti or not: I think I and many others should wait for DX12 at least, unless you have a very old system and can't wait any longer. Without knowing true Win 10 and DX12 performance you can't really say which way to go. What if I upgrade to a 980 Ti and the Fury X turns out to be so much faster, with Crossfire scaling 50% faster than Titan X, plus less heat?


A 30% AMD benefit from Win 10 in Project Cars.
That's like a new card.


----------



## Wishmaker

YouTube is blocked at work. Any text link?


----------



## harney

Quote:


> Originally Posted by *Wishmaker*
> 
> Youtube is blocked at work. Any text link?


Yes, it's the same engineer who worked for Nvidia; he seems to have moved over to AMD and made a mess there too.


----------



## LegacyLG

Hold on, does the R9 295X2 beat the Fury X?


----------



## blue1512

Quote:


> Originally Posted by *i7monkey*
> 
> This launch was entirely ruined by AMD's marketing team and the decision to charge $649 for it. Period.
> 
> Hype and MSRP are what ruined this card, because it's an otherwise decent card. Fury X should cost $499.


Which world are you living in???

Remember that this card ships with an AIO cooler. A similar solution from Accelero would cost you at least $100.

And it has HBM, a new piece of tech. It may not shine, but it is expensive for sure.

Furthermore, it is DX12 compatible, which should bring a significant performance boost for AMD cards. It's clear that DX11 is holding AMD's GPUs back.

So, I would say the price is fine as it is, even though AMD surely doesn't make much profit from Fury X sales. Rumor has it AMD had to cut the price on the Fury X thanks to the surprising arrival of the 980 Ti.


----------



## BoredErica

Quote:


> Originally Posted by *LegacyLG*
> 
> jold on does the r9 295x2 beat the fury x?


Of course it does. The 295X2 beats a Titan X. The Fury X beating the 295X2 despite being a single GPU would be amazing.


----------



## iLeakStuff

I've come to a temporary decision which I feel I may stick with:
Wait until August, when Skylake is available to buy.
See if the Fury X price drops to $549 by then. If a Fury X2 is launched at $999, get that.
I feel that with the games where the 980 Ti wins by such big margins, $549-$599 is what the Fury X is worth.

If neither happens, I will probably go with the GTX 980 Ti EVGA Hybrid instead for $749. It will be much faster than the Fury X since it's overclocked, and because the TDP is 250W it will be both silent and very cool.


----------



## harney

Quote:


> Originally Posted by *blue1512*
> 
> Which world are you living in???
> 
> Remember that this card ships with an AIO cooler. A similar solution from Accelero would cost you at least $100.
> 
> And it has HBM, a new piece of tech. It may not shine, but it is expensive for sure.
> 
> Furthermore, it is DX12 compatible, which should bring a significant performance boost for AMD cards. It's clear that DX11 is holding AMD's GPUs back.
> 
> So, I would say the price is fine as it is, even though AMD surely doesn't make much profit from Fury X sales. Rumor has it AMD had to cut the price on the Fury X thanks to the surprising arrival of the 980 Ti.


I understand that there are high costs in R&D to start with, prototype testing etc., but once any card is in mass production it would not surprise me if the markups are high. I personally think the higher-end cards are way overpriced, even at US prices, never mind UK prices.


----------



## Casey Ryback

Quote:


> Originally Posted by *iLeakStuff*
> 
> I've come to a temporary decision which I feel I may stick with:
> Wait until August, when Skylake is available to buy.


Wait for the latest Intel baby cake series. Skylake is old news.









Also, wait for Fury on air; you might be surprised.


----------



## DiceAir

Quote:


> Originally Posted by *flopper*
> 
> A 30% AMD benefit from Win 10 in Project Cars.
> That's like a new card.


And then I heard that just by implementing DX12 in a game you get up to 40% improved performance. So I heard, but I could be wrong. With DX11 you still see some huge gains, and I hope in Crossfire it will be even more.


----------



## harney

Surely the DX12 improvements on AMD tech would be much the same for Nvidia.


----------



## Ganf

Quote:


> Originally Posted by *harney*
> 
> Surely the DX12 improvements on AMD tech would be much the same for Nvidia.


HAHAHA! Nope.

Both companies support different sets of DX12 features. Performance will still vary widely from card to card based upon its DX12 feature implementation, and how the devs use those features.


----------



## Wishmaker

Have I missed a memo where DX12 will not work on NVIDIA cards at all? Isn't this excuse getting a bit embarrassing now? AMD needs to level up and put out better products. Hopefully the markets are not gonna bite them in the tushy. Please, stop this DX12 thing where AMD benefits and NVIDIA simply suffers. You all know perfectly well NVIDIA will see improvements too, and some say even more, due to the DX12 extensions they support compared to AMD.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *DiceAir*
> 
> And then I heard that just by implementing DX12 in a game you get up to 40% improved performance. So I heard, but I could be wrong. With DX11 you still see some huge gains, and I hope in Crossfire it will be even more.


And odds are you will see the same, or more, gains from nVidia on DirectX 12 too.

Microsoft isn't stupid. They aren't going to ship an updated DirectX version that gimps the guy with 76% of the market share and makes THEIR new version of DirectX look like hell from an end-user perspective. That would hurt Microsoft's sales. Microsoft doesn't give a damn about AMD ... the company that for the past 2 years has been basically bad-mouthing Microsoft's DirectX with Mantle and has now given that technology to their competitor ... the Linux market.

I'm sorry, but Microsoft is many things, but stupid isn't one of them. They are going to make sure that the market leader, someone who hasn't attempted to stab them in the back, doesn't look bad, and thus, by extension, that they themselves don't look bad.

Quote:


> Originally Posted by *Wishmaker*
> 
> Have I missed a memo where DX12 will not work on NVIDIA cards at all? Isn't this excuse getting a bit embarrassing now? AMD needs to level up and put out better products. Hopefully the markets are not gonna bite them in the tushy. Please, stop this DX12 thing where AMD benefits and NVIDIA simply suffers. You all know perfectly well NVIDIA will see improvements too, and some say even more, due to the DX12 extensions they support compared to AMD.


Exactly.

It's the last desperate act of Team Red fans.

390 series will be awesome ... oh, it's just a rebrand.
Well, the Fury X will beat the Titan X by 20% and do it for $300 less ... oh, it's 3% slower than the GTX 980Ti at the same price.
Well, the drivers will make it faster than both the GTX 980Ti and Titan X.

....


----------



## harney

Quote:


> Originally Posted by *Ganf*
> 
> HAHAHA! Nope.
> 
> Both companies are supporting different sets of DX12 features. Performance will still vary widely from card to card based upon it's DX12 feature implementation, and how the devs use those features.


Maybe "same" was the wrong word to use. What I am trying to say is that if they both used the same API and the same features, then results would be more or less the same in terms of performance. But I am no API expert.


----------



## DiceAir

Quote:


> Originally Posted by *47 Knucklehead*
> 
> And odds are you will see the same, or more, gains from nVidia on DirectX 12 too.
> 
> Microsoft isn't stupid. They aren't going to ship an updated DirectX version that gimps the guy with 76% of the market share and makes THEIR new version of DirectX look like hell from an end-user perspective. That would hurt Microsoft's sales. Microsoft doesn't give a damn about AMD ... the company that for the past 2 years has been basically bad-mouthing Microsoft's DirectX with Mantle and has now given that technology to their competitor ... the Linux market.
> 
> I'm sorry, but Microsoft is many things, but stupid isn't one of them. They are going to make sure that the market leader, someone who hasn't attempted to stab them in the back, doesn't look bad, and thus, by extension, that they themselves don't look bad.
> Exactly.
> 
> It's the last desperate act of Team Red fans.
> 
> 390 series will be awesome ... oh, it's just a rebrand.
> Well, the Fury X will beat the Titan X by 20% and do it for $300 less ... oh, it's 3% slower than the GTX 980Ti at the same price.
> Well, the drivers will make it faster than both the GTX 980Ti and Titan X.
> 
> ....


Yes, I understand, but what we're getting at is that AMD has more overhead on DX11, so maybe their hardware is much better, and if we got to use the full product efficiently we might actually see it perform to the fullest and get great performance. I know Nvidia will see gains too, but I think we might see AMD benefit the most because of their sloppy DX11 driver overhead. I will still wait before I upgrade, because of DX12 and Windows 10. Maybe we'll see AMD getting closer to Nvidia in regards to frametime, framerate and so on.


----------



## Kand

Damage Control.


----------



## harney

Quote:


> Originally Posted by *47 Knucklehead*
> 
> And odds are you will see the same, or more, gains from nVidia on DirectX 12 too.
> 
> Microsoft isn't stupid. They aren't going to ship an updated DirectX version that gimps the guy with 76% of the market share and makes THEIR new version of DirectX look like hell from an end-user perspective. That would hurt Microsoft's sales. Microsoft doesn't give a damn about AMD ... the company that for the past 2 years has been basically bad-mouthing Microsoft's DirectX with Mantle and has now given that technology to their competitor ... the Linux market.
> 
> I'm sorry, but Microsoft is many things, but stupid isn't one of them. They are going to make sure that the market leader, someone who hasn't attempted to stab them in the back, doesn't look bad, and thus, by extension, that they themselves don't look bad.
> Exactly.
> 
> It's the last desperate act of Team Red fans.
> 
> 390 series will be awesome ... oh, it's just a rebrand.
> Well, the Fury X will beat the Titan X by 20% and do it for $300 less ... oh, it's 3% slower than the GTX 980Ti at the same price.
> Well, the drivers will make it faster than both the GTX 980Ti and Titan X.
> 
> ....


Yes, this. The performance gap will stay as-is even over to DX12; AMD is not suddenly going to have some magic monster card with DX12 where Nvidia is then slower. Either way, from what I have been reading re DX11/12, Vulkan, Mantle etc., Win 10 with DX12 is a good thing for all of us, especially with the CPU draw call increase, never mind anything else, since that seems to be the major problem for game code at the moment.


----------



## Tivan

Just don't use DX11/10 and go back to DX9 if you want to compare AMD and Nvidia with driver parity.

Might give you an idea of DX12, when no special features of either side are used.


----------



## Ganf

Quote:


> Originally Posted by *47 Knucklehead*
> 
> And odds are you will see the same, or more, gains from nVidia on DirectX 12 too.
> 
> Microsoft isn't stupid. They aren't going to ship an updated DirectX version that gimps the guy with 76% of the market share and makes THEIR new version of DirectX look like hell from an end-user perspective. That would hurt Microsoft's sales. Microsoft doesn't give a damn about AMD ... the company that for the past 2 years has been basically bad-mouthing Microsoft's DirectX with Mantle and has now given that technology to their competitor ... the Linux market.
> 
> I'm sorry, but Microsoft is many things, but stupid isn't one of them. They are going to make sure that the market leader, someone who hasn't attempted to stab them in the back, doesn't look bad, and thus, by extension, that they themselves don't look bad.
> Exactly.
> 
> It's the last desperate act of Team Red fans.
> 
> 390 series will be awesome ... oh, it's just a rebrand.
> Well, the Fury X will beat the Titan X by 20% and do it for $300 less ... oh, it's 3% slower than the GTX 980Ti at the same price.
> Well, the drivers will make it faster than both the GTX 980Ti and Titan X.
> 
> ....












Here we go again... I'm not arguing that Microsoft is going to gimp Nvidia cards, but you're acting like a fool thinking they've got some grudge against AMD, when both companies have been going from one trade show to another talking about how they're working together to implement DX12 on both AMD's cards and the XB1, and Nvidia hasn't been seen cooperating with a single rep from Microsoft.

Microsoft wants to sell playboxes. Windows 10 is free for an entire year. Guess whose hardware features they're pushing?

Nope. Stop right there before you start blabbing some inane remark. Microsoft is only pushing THEIR hardware; it just so happens that AMD's logo is on the chip. They're going to do what they need to do to make sure they sell more consoles. That's it.


----------



## DiceAir

Quote:


> Originally Posted by *harney*
> 
> Yes, this. The performance gap will stay as-is even over to DX12; AMD is not suddenly going to have some magic monster card with DX12 where Nvidia is then slower.


Ah, we'll just have to wait and see; that's all we can do. It's actually no use speculating, but I'm looking forward to DX12. I hope all of us, even Nvidia fanboys, want AMD to do well for once so that Nvidia has to lower prices and up their game a little more. I'm no AMD fanboy or Nvidia fanboy, but I would love it if AMD could get at least 40% market share.


----------



## rv8000

Anyone banking on DX12 performance...







, you realize we will have Arctic Islands and Pascal before we see any game that's built from the ground up for DX12...

About the 64 ROPs: AnandTech's benchmarks show the Fury X to have a marginally higher pixel and texel fill rate than the 980 Ti, so it's still kind of hard to believe the 64 ROPs are really responsible for holding back performance that much in the games where it is really lagging behind. Curious to know more about how the ROPs affect performance, though.


----------



## iLeakStuff

Quote:


> Originally Posted by *Casey Ryback*
> 
> Wait for the latest intel baby cake series. Skylake is old news
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also wait for fury on air you might be surprised.


Problem is that the Fury X is locked to water cooling.
The air-cooled Fury is probably Fiji Pro with fewer cores, which means it will be slower.


----------



## AmericanLoco

Quote:


> Originally Posted by *blue1512*
> 
> Which world are you living in???
> 
> Remember that this card ships with an AIO cooler. A similar solution from Accelero would cost you at least $100.


That's retail. It doesn't cost AMD $100 to buy that AIO from Cooler Master.
Quote:


> Furthermore, it is DX12 compatible, which should bring a significant performance boost for AMD cards. It's clear that DX11 is holding AMD's GPUs back.


DX11 isn't holding the Fury back. AMD's bad DX11 drivers are holding the Fury back. It's not an excuse though, and it should have been sorted months ago.


----------



## DiceAir

Quote:


> Originally Posted by *iLeakStuff*
> 
> Problem is that the Fury X is locked to water cooling.
> The air-cooled Fury is probably Fiji Pro with fewer cores, which means it will be slower.


Just like on paper the 980 Ti is slower than the Titan X. Look at some of the custom coolers: the Gigabyte G1 Gaming is faster in some games than a Titan X just because it can feed more power, overclock better and run at lower temps. So maybe Fiji Pro might actually be faster. Maybe they will solve the overclocking issue and unlock the voltage with a driver update or something like that. We never know, but I think this just shows that early reviews can't always be trusted. They use very early beta drivers for the card, and you might hit one or two other issues.


----------



## raghu78

Multiple reviewers have commented that drivers could be one of the issues; HWC and PCPer were a couple of the ones who thought drivers could be holding back the card. It's damn disappointing to see AMD messing up on drivers. AMD can redeem themselves a bit at least in the next few months by improving performance through driver optimizations, but this generation has gone by a landslide to Nvidia. Nvidia is going to turn the screws sometime in Q3 or Q4 with a few refreshed SKUs (higher core and memory clocks) and effectively kill off AMD's market-share-recovery dreams.


----------



## Phaster89

Quote:


> Originally Posted by *rv8000*
> 
> Anyone banking on DX12 performance...
> 
> 
> 
> 
> 
> 
> 
> , you realize we will have Arctic Islands and Pascal before we see any game that's built from the ground up for DX12...
> 
> About the 64 ROPs: AnandTech's benchmarks show the Fury X to have a marginally higher pixel and texel fill rate than the 980 Ti, so it's still kind of hard to believe the 64 ROPs are really responsible for holding back performance that much in the games where it is really lagging behind. Curious to know more about how the ROPs affect performance, though.


I have very little hope for PC games on DX12. Games are made for consoles first, and only the Xbox One is compatible with it; since the PS4 is not, something's gonna give.


----------



## 47 Knucklehead

Oh, and let's not forget there is a THIRD player in all this ... someone who has never stabbed Microsoft in the back by trying to invent Mantle to compete with DirectX ... Intel.

Let's not forget that Intel is what powers the Microsoft Surface Pro 1, 2, and 3 (Intel Core i3 and i5 and Haswell) as well as their lower end Surface 3 (Intel Atom) and Surface 2 (nVidia Tegra). Do you see AMD anywhere in there?

No.

So why would Microsoft NOT make both Intel and nVidia graphics the absolute best they can for Windows 10 and DirectX 12, when Microsoft's own products rely exclusively on NON-AMD CPUs and GPUs?

Bottom line: Don't hang your hat on Windows 10 and DirectX 12 being the savior of a 200-series rebrand and lackluster Fury X performance, because Microsoft has ZERO vested interest in helping someone who tried to stab them in the back and whose parts they don't even use in their own hardware, other than the Xbox, which is a money sink anyway; their investors are begging them to sell off that division and get out of the money-LOSING venture known as "consoles".


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Tivan*
> 
> Just don't use DX11/10 and go back to DX9 if you want to compare AMD and Nvidia with driver parity.


By all means, let's go back to DirectX 9.

Has AMD FINALLY fixed the microstuttering issue with DirectX 9 and Crossfire?

NO!

They fixed it for DirectX 10 and 11 after nearly a year of complaints, but they abandoned fixing it on DirectX 9 in favor of working on Mantle ... even though a TON of games are still using DirectX 9.


----------



## Kane2207

Wait for DX12?









After already waiting for this card, now supposedly waiting for drivers, previously waiting 2 years for FreeSync, waiting for a frame timing fix..... it just goes on and on and on.

Unfortunately for AMD, there's a point where people have to stop waiting and actually purchase, and when they do, they'll do it based on the best performance they can obtain for their cash at that point in time.

At this point in time, that'd be a 980ti.


----------



## BinaryDemon

It's nice to look at the DX12 API Overhead draw call tests and say "Wow, the DX12 improvements are amazing!", but at a certain point the bottleneck isn't draw calls anymore. The game has to wait for the CPU to finish other tasks like physics, AI, networking, input etc. I would bet most of the DX12 FPS improvements will be a result of improved CPU scheduling/multitasking.

Mantle was basically a preview of DX12's features, and it didn't turn the R9 290X into the most powerful GPU ever; it helped alleviate a CPU bottleneck.
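The point above can be sketched with a toy model: treat frame time as whichever of the CPU or GPU finishes last. All numbers below are hypothetical, chosen for illustration; nothing here is a measurement of any real card or API.

```python
# Toy model: frame time is limited by the slower of CPU and GPU.
# Cheaper per-draw-call submission (DX12-style) only helps while the
# CPU side is the bottleneck; once the GPU is the limit, fps stops improving.

def frame_ms(draw_calls, per_call_us, other_cpu_ms, gpu_ms):
    """CPU cost = fixed work (physics, AI, networking...) + draw-call submission."""
    cpu_ms = other_cpu_ms + draw_calls * per_call_us / 1000.0
    return max(cpu_ms, gpu_ms)

def fps(ms):
    return 1000.0 / ms

DX11_US, DX12_US = 2.0, 0.25   # hypothetical per-call CPU cost, microseconds

scenes = {
    "CPU-bound (many draws, light GPU)": (10_000, 4.0, 8.0),
    "GPU-bound (few draws, heavy GPU)":  (2_000, 4.0, 16.0),
}
for name, (calls, other_cpu, gpu) in scenes.items():
    f11 = fps(frame_ms(calls, DX11_US, other_cpu, gpu))
    f12 = fps(frame_ms(calls, DX12_US, other_cpu, gpu))
    print(f"{name}: DX11 {f11:.0f} fps -> DX12 {f12:.0f} fps")
```

With these made-up numbers, the CPU-bound scene roughly triples its frame rate from the cheaper submission, while the GPU-bound scene doesn't move at all, which is exactly the "bottleneck isn't draw calls anymore" case.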


----------



## NexusRed

If they had released the Fury X for $699 CAD, it would have been a serious hit. Someone needs to partner with or buy AMD to really give it a boost. I'm looking at you, Samsung and Qualcomm.


----------



## Seel

I'd buy this without the AIO for $100 less.


----------



## Neon Lights

Who bought the most?

I bought two for €1400 (about the same amount in dollars); they are supposed to ship on the 26th (tomorrow), so I hope I will get them on Monday.
I also plan to buy Aqua Computer water blocks for them once they are released.


----------



## CasualCat

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's been my question all along, though: WILL Fury run out, or does it run flawlessly at a resolution and settings where an equivalent GDDR5 card DOES run out, thus proving that HBM does have an advantage, capacity-to-capacity? Nobody has shown me where it IS running out yet, so the answer to that question remains speculative.


How do you definitively test/prove that, though? PCPer talked about being able to adjust settings up in GTA V to make it stutter and drop to single-digit framerates when the 980 Ti didn't (speculating that it could be memory), but even they couldn't say whether it was memory or something else. Is there even a proper tool to gauge this?


----------



## Natskyge

Quote:


> Originally Posted by *CasualCat*
> 
> How do you definitively test/prove that, though? PCPer talked about being able to adjust settings up in GTA V to make it stutter and drop to single-digit framerates when the 980 Ti didn't (speculating that it could be memory), but even they couldn't say whether it was memory or something else. Is there even a proper tool to gauge this?


Just record VRAM usage?


----------



## DiceAir

Quote:


> Originally Posted by *hawker-gb*
> 
> DX12 is a game changer:
> 
> http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html?start=21


As far as I know you can't compare graphics cards by testing API overhead. Just because the Fury X gets more draw calls doesn't mean it will do better in the real world.
Quote:


> Originally Posted by *Kane2207*
> 
> Wait for DX12?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After already waiting for this card, now supposedly waiting for drivers, previously waiting 2 years for FreeSync, waiting for a frame timing fix..... it just goes on and on and on.
> 
> Unfortunately for AMD, there's a point where people have to stop waiting and actually purchase, and when they do, they'll do it based on the best performance they can obtain for their cash at that point in time.
> 
> At this point in time, that'd be a 980ti.


It all depends on what you play. For me, not many games out now interest me that much. Yes, I would love to play Witcher 3 and get decent framerates and frame times, but it's not a game worth upgrading for. I play Project Cars on high and get 70 FPS, and it's still smooth enough. I play BF4 on low because I found it's easier for me to get kills, since it's easier to see people, and GPU frametimes are much better. Most other games I play run OK; only games like Far Cry 3/4 struggled, but they run OK for me now. I originally wanted to upgrade for Advanced Warfare, because I have to play it on super low to keep it from crashing or giving me huge stutter, but I don't see many people playing that anymore; I switched back to Black Ops 2 and that runs just fine.


----------



## Pawelr98

From what I've been reading, the main problem with the Fury X is the lack of ROP units.

I'm curious about a niche way to use this card. For example, Arma 3 has a built-in option to increase the 3D resolution: the screen is 1080p, but you can set the 3D resolution to 200% and the engine actually renders at twice the resolution.
Something like supersampling.

Will the ROP count affect performance in such a scenario?
Also, even after a small OC the card seems to catch up with the 980 Ti. Of course the 980 Ti can OC too, but the Fury X already has watercooling (which I will mod to run with my main loop).
For now I have these options:
-Buy a 980 Ti for about $50 more - currently sold out everywhere
-Buy a Fury X - still not in Microcenter, plus sold out everywhere
-Buy a cheap 290/290X at Microcenter and then slap a full-cover block on it

I hope that either the 980 Ti or the Fury X will be available at Microcenter before I leave the US.
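On the 3D-resolution question: a quick back-of-envelope sketch shows why ROPs would matter there. The assumption (mine, not from the thread) is that the slider scales each axis linearly, which is how most engines implement this kind of setting:

```python
# Rough sketch: how a "3D resolution" percentage scales the internal
# render target. Assumption: the percentage scales each axis linearly,
# so 200% of 1080p renders a 3840x2160 target, i.e. 4x the pixels.
def render_target(base_w, base_h, percent):
    scale = percent / 100
    return int(base_w * scale), int(base_h * scale)

w, h = render_target(1920, 1080, 200)
pixels = w * h
base_pixels = 1920 * 1080
print(w, h, pixels / base_pixels)  # 3840 2160 4.0
```

Since ROP throughput scales with pixels written, a 200% setting roughly quadruples the fill-rate load, which is why a ROP-limited card would feel it in exactly this scenario.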


----------



## Alatar

Quote:


> Originally Posted by *Natskyge*
> 
> Just record ram usage?


The tools that consumers and reviewers have available for this are useless.


----------



## Kand

Wait for DX12?

DX11 was released in 2008. Seven years later, I'm still waiting.


----------



## ladcrooks

I am quite happy with the Fury's performance. We already knew before it came out that it would match the 980 Ti, give or take, and at 4K it shines; and many of you are using 2560x1440 instead of 1080p anyway.

So I think it's a winner, though not price-wise.









Well done AMD - thanks for bringing something new to the table


----------



## Boomstick727

Quote:


> Originally Posted by *ladcrooks*
> 
> I am quite happy with the fury's performance, we already knew before it came out that it will match the 980ti give or take, and 4k wise it shines and as many of you are using 2560x1440 instead of 1080p
> 
> So i think its winner - not price wise though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well done Amd - thanks for bringing something new to the table


Agreed, looking forward to testing mine later today against my Titan X.

I fancy a switch to AMD until we get a die shrink, Pascal, Pirate Islands, HBM 2.0, etc. Something different; I've been on Nvidia cards for a while now (Titan Black, GTX 980, and Titan X). For the most part I'm sure the Fury X will be spot on for gaming until then. Since the Fury X is new, drivers and DX12 might bring something to the table, and we haven't even got unlocked voltage or custom BIOSes yet. The Fury X likely has room in the tank.


----------



## Ganf

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Oh, and let's not forget there is a THIRD player in all this ... someone who has never stabbed Microsoft in the back when they tried to invent Mantle to compete with DirectX ... Intel.
> 
> Let's not forget that Intel is what powers the Microsoft Surface Pro 1, 2, and 3 (Intel Core i3 and i5 and Haswell) as well as their lower end Surface 3 (Intel Atom) and Surface 2 (nVidia Tegra). Do you see AMD anywhere in there?
> 
> No.
> 
> So why would Microsoft NOT make both Intel and nVidia graphics the absolute best they can for Windows 10 and DirectX 12, when Microsoft's own products rely exclusively on NON-AMD CPUs and GPUs?
> 
> Bottom line: Don't hang your hat on Windows 10 and DirectX 12 being the savior of a 200 series rebrand and a lack-luster Fury X performance, because Microsoft has ZERO vested interest in helping someone who tried to stab them in the back and they don't even bother using in their own hardware, other than the XBox, which is a money sink anyway and their investors are begging that they sell off that division and get out of this money LOSING venture known as "Consoles".


Lol.

DX12 is now integral to the success of the Surface Pro? Let me guess, they're targeting the MOBA crowd? Come on... Let's take a look at the DX12 feature lists again.

http://www.overclock.net/t/1558938/lightbox/post/24006402/id/2481034

Now, if anyone is going to be left out in the cold because Microsoft prioritizes the Surface Pro as a gaming platform over their own console, who is it going to be?


----------



## Serandur

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yes, yes, I've heard all these snarky responses before and yet I still have yet to see PROOF that 4GB of HBM behaves EXACTLY the same as 4GB of GDDR5. If this is such a no-brainer you should have no problem finding an instance of a Fury X running out of memory...


Quote:


> Originally Posted by *CasualCat*
> 
> How do you definitively test/prove that, though? PCPer talked about being able to adjust settings up in GTA V that made it stutter and drop to single frames when the 980 Ti didn't (speculating that it could be memory), but even they couldn't say whether it was memory or something else. Is there even a proper tool to gauge this?


There's HardOCP's observations as well. Though, as Alatar said, we have no real tools to "prove" VRAM limitations, severe and uncharacteristic frametime spikes or FPS drops can hint at them. It's pretty obvious what a lack of VRAM looks like in motion, though: stuttering or otherwise erratic performance, usually alongside asset loading and/or pop-in during character or camera movement. The degree to which VRAM can be limiting varies greatly. It can be as little as not having enough to cache certain resources ahead of time, resulting in rare, brief hitching during area transitions, or it can get so severe that the video card flat-out doesn't have enough to properly render the current scene and the framerate just plummets into the single digits.

HardOCP's observations on 4 GBs limiting 980s and 290Xs:

"The issue to focus on however is that all of these video cards are hitting the maximum capacities in Dying Light. This game demands high VRAM capacities to alleviate stutter. We found that in VRAM limited situations the game would stutter when being grabbed by zombies, or having to combat zombies. There would be a hang, or pause, or freeze if the resolution was too high for the video card."

They observed similar things happening in Dying Light with the Fury X, though their reported data is too limited to dismiss CPU overhead as a potential factor, imo:

"The AMD Radeon R9 Fury X is the first video card from AMD to allow us to play at the highest in-game settings at 1440p. We were able to turn up the shadow map size to "Very High" and max out the view distance. The game was playable, but you will notice it does have a lot lower minimum framerate on the Fury X compared to the GTX 980 Ti. There was some stutter here and there as it loaded new game assets in different areas, which did not occur on the GTX 980 Ti."


----------



## Ganf

Quote:


> Originally Posted by *poii*
> 
> There might be air in the AiO causing the high pitched noise, dafuq?!
> 
> Someone said the noise gets way toned down when the radiator is placed above the GPU meaning air could be the problem...


Sounds more like the pump is running way too fast and needs to be dialed back. My theory is that putting the radiator above the pump puts more head pressure on it and slows it down. They probably just ran it at max speed instead of paying a few bucks extra for a PWM model.


----------



## Cool Mike

Agreed. At 4K it beats the Ti, and I game at 4K. I will receive my Fury tomorrow. I also have an EVGA Ti and look forward to benchmarking both.


----------



## BoredErica

Quote:


> Originally Posted by *CasualCat*
> 
> How do you definitively test/prove that, though? PCPer talked about being able to adjust settings up in GTA V that made it stutter and drop to single frames when the 980 Ti didn't (speculating that it could be memory), but even they couldn't say whether it was memory or something else. Is there even a proper tool to gauge this?


GTA V is one of those games that uses a lot of VRAM, right? I think it's decent evidence that 4 GB isn't enough for whatever settings they were running. What else could it possibly be?

Quote:


> Originally Posted by *Natskyge*
> 
> Just record ram usage?


The argument is that a lot of the VRAM reported as used doesn't actually impact performance; some of what's allocated is excessive and not really needed.

As Ryan Shrout mentioned, AMD went ahead and added a lot more memory to their 300 series cards, but their flagship card is stuck at 4 gigs. Either more memory matters or it doesn't, the message there is inconsistent.

I've been kind of defending AMD this whole time, but between the 4 GB of VRAM, the lack of DVI-D, not being able to match the 980 Ti at 1440p despite releasing later, the water cooling not being enough to make it overclock well, and the voltage lock, I'm getting tired of this. Will DX12 help AMD more than Nvidia? I'll believe it when I see it. I also don't have forever to wait for AMD to fix their stuff, because at this rate I might as well wait for Pascal. On the other hand, Nvidia's driver issues are kind of worrying to me; funny, but I feel AMD's drivers are better because performance actually improves with time. Then again, if I'm going to upgrade yearly, I don't care too much if performance lags behind 2-3 years down the line due to lack of driver support.


----------



## Casey Ryback

Quote:


> Originally Posted by *Kand*
> 
> Wait for DX12?
> 
> DX11 was released in 2008. Seven years later, I'm still waiting.


I don't understand.

DX12 games are scheduled for release at the end of the year. What's the problem with people waiting for it to decide on their upgrades?


----------



## Ganf

Quote:


> Originally Posted by *Serandur*
> 
> There's HardOCP's observations as well. Though, as Alatar said, we have no real tools to "prove" VRAM limitations, but severe and uncharacteristic frametime spikes or FPS drops can hint at it. It's pretty obvious what a lack of VRAM looks like in motion, though. Stuttering or otherwise erratic performance usually alongside asset loading and/or pop in during character or camera movement. The degree to which VRAM can be limiting varies greatly. It can be as little as not having enough to cache certain resources resulting in a rare and brief hitching during area transitions or it can get so severe that the video card flat-out doesn't have enough to properly render a current scene and the framerate just plummets into the single digits.
> "


FPS stutter that matches up with spikes in DRAM hard faults and/or storage access would indicate it, wouldn't it?


----------



## Kane2207

Quote:


> Originally Posted by *Darkwizzie*
> 
> As Ryan Shrout mentioned, AMD went ahead and added a lot more memory to their 300 series cards, but their flagship card is stuck at 4 gigs. Either more memory matters or it doesn't, the message there is inconsistent.


AMDMatt was pulled up on a forum post stating the requirement and benefits of 8GB on the 300 series.

I don't think the left hand knows what the right hand is doing at AMD 90% of the time...


----------



## gamervivek

Considering that AMD are far more competitive at higher resolutions, it's comical that Nvidia came up with DSR and are now getting outplayed on that front by VSR, which has next to no performance hit and doesn't blur the image.
Still, DX12 can't come soon enough for AMD's cards.

Hopefully AMD are quicker with the 8GB version and driver improvements, and of course the Fury Maxxx.


----------



## Kane2207

Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't understand.
> 
> DX12 games are scheduled for release at the end of the year. What's the problem with people waiting for it to decide on their upgrades?


Maybe we're sick of waiting?

Wait for frame pacing fixes
Wait for the Omega drivers
Wait for FreeSync
Wait for Fiji
Wait for DX12

At some point people have to stop waiting for what might arrive and actually base their purchases on the here and now.


----------



## Assirra

Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't understand.
> 
> DX12 games are scheduled for release at the end of the year. What's the problem with people waiting for it to decide on their upgrades?


Because people are constantly moving the goalposts.
It's been "wait for X" for a long time now. Wait for the Fury, wait for the 300 series, wait for FreeSync.
Sometimes people want to upgrade now, not wait on potential.


----------



## ladcrooks

Quote:


> Originally Posted by *Cool Mike*
> 
> Agreed. At 4k it beats the Ti and I game at 4K. I will receive my fury tomorrow. I also have a EVGA TI and look forward to benchmarking both.


Look forward to an OC user review, and an update in a month's time once you've had a little play with it - hopefully driver updates will make this shine a little brighter.

Awwwwwwww! You lucky sod, so jealous


----------



## CasualCat

Quote:


> Originally Posted by *Darkwizzie*
> 
> GTA V 5 is one of those games that use more vram, right? I think it's decent evidence that 4 gigs isn't enough for whatever setting they were running. What else could it possibly be?


Yes, there are a couple of settings you can crank up, such as MSAA, reflection MSAA, and population variety.


----------



## Tivan

Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't understand.
> 
> DX12 games are scheduled for release at the end of the year. What's the problem with people waiting for it to decide on their upgrades?


The problem is that you can't claim AMD or Nvidia will benefit more than the other from a lower-overhead API, regardless of how well founded your stance is, because it isn't out yet, and the other side will undoubtedly retort with "yeah, same on this side, DX12 is great for our camp too."

"Wait and see" should be the standard answer here, which is perfectly reasonable.


----------



## decimator

So what are the chances that we see a Fury X revision? There are still 12 months until the next node, and the Fury X's flaws are already apparent.

What would make sense to me is a slight decrease in the number of SPs (4096 just seems a bit absurd) to make room on the die for more ROPs (maybe 96, or ideally 128). I'm not sure exactly how feasible this is, but I think it would behoove AMD to at the very least look into it...


----------



## ladcrooks

Quote:


> Originally Posted by *Kane2207*
> 
> Maybe we're sick of waiting?
> 
> Wait for frame pacing fixes
> Wait for the Omega drivers
> Wait for Freesync
> Wait for Fiji
> Wait for DX12
> 
> At some point people have to stop waiting for what might arrive and actually base their purchases on the here and now.


But that goes for anything we wait for, just a tad longer with AMD - all aboard the Skylark


----------



## GorillaSceptre

Quote:


> Originally Posted by *KyadCK*
> 
> For the millionth time...
> 
> It indicates that there is no voltage control yet, and Afterburner (and all the others) need to catch up. As every single review has said.


I was talking about stock voltage.

If you over-volt 980 Ti's, the difference would be even larger.


----------



## Serandur

Quote:


> Originally Posted by *Ganf*
> 
> FPS stutter that matches up with spikes in DRAM hard faults and/or storage access would indicate it wouldn't it?


It should, but do we have anything to record the two simultaneously?
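Not in one tool that I know of, but you could capture the two separately (a frametime log, plus something like a perfmon counter log of hard faults per second) and line them up by timestamp afterwards. A rough sketch; both log formats here are my assumptions, not any real tool's output:

```python
# Correlate two separately captured logs by timestamp:
#   frame_log: (timestamp_s, frametime_ms) pairs from a frame logger
#   fault_log: (timestamp_s, hard_faults_per_s) pairs, sorted by time
import bisect

def faults_near(fault_log, t, tolerance=0.5):
    """Return hard-fault samples within `tolerance` seconds of time t."""
    times = [ts for ts, _ in fault_log]
    lo = bisect.bisect_left(times, t - tolerance)
    hi = bisect.bisect_right(times, t + tolerance)
    return fault_log[lo:hi]

frame_log = [(10.0, 16.0), (10.5, 140.0), (11.0, 16.5)]  # one big hitch
fault_log = [(9.9, 0), (10.4, 850), (10.9, 2)]           # fault spike at 10.4s

for ts, ft in frame_log:
    if ft > 50:  # arbitrary stutter threshold in ms
        print(ts, ft, faults_near(fault_log, ts, 0.3))
# 10.5 140.0 [(10.4, 850)]
```

A stutter that consistently lines up with fault spikes would support the VRAM theory; one that doesn't points elsewhere (driver, CPU, streaming).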


----------



## hawker-gb

So far I am satisfied with the initial Fury X performance.

I am in no hurry; I'm waiting for the DX12 benchmarks. I think the Fury X will be significantly better then.


----------



## BoredErica

I once again go back to the point I made in the past: Nvidia already knew the Fury X's performance before it launched, and they stole the show by releasing the 980 Ti first. The Nvidia fans all ran over there, and having bought something, they're unlikely to drop it and go for the Fury X even if it's faster. And the 980 Ti ends up faster, but not far faster than it has to be to serve its purpose. It also has more VRAM than the Fury, whereas the 980 by default only has 4 GB. Coincidence? Along with that came the 980 price drop, to put the Fury X in a harder position and possibly address the air-cooled Fury as well when it comes out. Had Nvidia not come out with the 980 Ti, they would be in trouble and the Fury X would be the bomb. We couldn't even complain about the Fury X's 4 GB of VRAM, because the 980 has 4 GB too; the Fury X would win with better performance all around, plus extra-strong 4K performance on top. Now that the 980 Ti is out, everything changes. It has turned AMD's reveal from a total victory into what it is right now. This was all pre-planned: Nvidia's strategy to combat the Fury X.

Had AMD not pushed with their new cards, Nvidia would still be feeding us the Titan X as the only high-end card, and I think we should all be thankful that's not the case. I suspect Nvidia has more up their sleeves; they're just not revealing it. They'll just release something slightly faster than whatever AMD offers and call it a day, if push comes to shove.

I rarely make these kinds of predictions/extrapolations, but I think I am right this time.


----------



## AmericanLoco

Quote:


> Originally Posted by *Kane2207*
> 
> AMDMatt was pulled up on a forum post stating the requirement and benefits of 8GB on the 300 series.
> 
> I don't think the left hand knows what the right hand is doing at AMD 90% of the time...


They do. There are many technical reasons why the HBM1 card is limited to 4GB. To use 8GB and keep the 4096-bit bus (8GB would normally need an 8192-bit bus with HBM1) would require a pretty sophisticated interposer that probably has some kind of bank-switching mechanism. Had AMD and Hynix gotten their act together and released this card 8 months ago, we might have had an 8GB version by now.
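For what it's worth, the arithmetic behind that limit is simple if you take HBM1's publicly stated per-stack figures (a 1024-bit interface and 1 GB maximum per stack; those numbers come from the HBM1 spec as generally reported, not from this thread):

```python
# Back-of-envelope for the HBM1 capacity/bus trade-off described above.
# Assumed per-stack figures: 1024-bit interface, 1 GB max (4 dies x 2 Gbit).
STACK_BUS_BITS = 1024
STACK_CAPACITY_GB = 1

def hbm1_config(total_gb):
    """Stacks needed for a capacity, and the bus width they bring with them."""
    stacks = total_gb // STACK_CAPACITY_GB
    return stacks, stacks * STACK_BUS_BITS

print(hbm1_config(4))  # (4, 4096)  <- Fury X: 4 stacks, 4096-bit bus
print(hbm1_config(8))  # (8, 8192)  <- a naive 8GB needs twice the bus width
```

Capacity and bus width are welded together per stack, which is why getting 8GB without an 8192-bit bus would need the kind of exotic interposer trickery described above.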


----------



## Casey Ryback

Quote:


> Originally Posted by *Kane2207*
> 
> Maybe we're sick of waiting?
> 
> Wait for frame pacing fixes
> Wait for the Omega drivers
> Wait for Freesync
> Wait for Fiji
> Wait for DX12
> 
> At some point people have to stop waiting for what might arrive and actually base their purchases on the here and now.


That's cool, then they should buy something.









All I was saying is that if you want to wait for DX12 before upgrading, then don't feel like you shouldn't. Do as you please.

Basically, I didn't understand the whole "I've been waiting 7 years" comment.

I don't think I'll buy any card right now; I'm a little disappointed with the Fury, and the 980 Ti is overpriced (obviously, so is the Fury X).

I shall wait for the air-cooled Fury and price drops, personally. $650 USD is a little ridiculous to pay to play early-access titles and badly optimised ports.


----------



## 21cage12

The drivers aren't right atm, and DX12 isn't here yet, which suggests the 980 Ti will be facing a comeback hit in the face.
If Pascal makes this GPU really drop in price, it might be my first CrossFire!
I'm waiting.


----------



## Alatar

Quote:


> Originally Posted by *Kane2207*
> 
> AMDMatt was pulled up on a forum post stating the requirement and benefits of 8GB on the 300 series.
> 
> I don't think the left hand knows what the right hand is doing at AMD 90% of the time...


AMDMatt (or formerly LtMatt) was just a normal forum poster who was extremely good at copy-pasting every extremely slanted article he could find to the OcUK forums, filling the whole place with certain never-ending arguments. He was also very good at not actually knowing what he was talking about, but still making the same uninformed points over and over again in such quantities that it drowned out everything else.

And then AMD hired him because of his forum posting habits. Ironically, on OcUK that means he can't engage in discussions about competitors, so his effectiveness was drastically lowered.

Either way, he's not qualified to make any sort of statement about memory requirements. At best he can read an internal AMD marketing PDF and repeat what it says. He's just a basic social media rep, not someone with any sort of technical expertise. He's definitely neither the left nor the right hand of AMD.


----------



## hawker-gb

Quote:


> Originally Posted by *21cage12*
> 
> Drivers aren't right atm, and dx12 is not here yet suggest the 980ti will be facing a comeback hit in the face.
> If Pascal makes this GPU really drop in price, it might be my first CF!
> Am waiting


Wise people wait; others rush to buy the 980 Ti.


----------



## Newbie2009

Quote:


> Originally Posted by *Alatar*
> 
> AMDMatt (or formerly LtMatt) was just a normal forum poster who was extremely good at copy pasting every single extremely slanted article he could find to the ocuk forums filling the whole place with certain never ending arguments. He was also very good at not actually knowing what he was talking about but still continuing to make the same uninformed points over and over again in such quantities that it drowned out everything else.
> 
> And then AMD hired him because of his forum posting habits. Ironically on ocuk that means that he can't engage in discussions about competitors so his effectiveness was drastically lowered.
> 
> Either way he's not qualified to make any sorts of statements about memory requirements. At best he can read an internal AMD marketing pdf and repeat what it says. He's just a basic social media rep, not someone with any sort of technical expertise. He's definitely neither the left or the right hand of AMD


If only Nvidia would do the same for you


----------



## BoredErica

Quote:


> Originally Posted by *hawker-gb*
> 
> Wise people wait; others rush to buy the 980 Ti.


We can continue to play the waiting game until we die of old age. The question is what's in the near future. Gains AMD may or may not make at some fuzzy future date make it hard to justify continuing to wait, and the longer you wait, the closer you get to the next-gen cards. It's early to be talking about Pascal, but then again the same applies to true DX12 games. It's not automatically wiser to wait. Whatever advantage the user feels the 980 Ti has over the Fury X in experience and enjoyment, during the time AMD has yet to catch the 980 Ti through software, has cash value. I'm not claiming the exact opposite of what you're saying, either.


----------



## Alatar

Quote:


> Originally Posted by *hawker-gb*
> 
> Wise people wait; others rush to buy the 980 Ti.


When it comes to computer hardware, the wise know that waiting doesn't work unless we're talking about a week or two before a specific hardware release.

There's always something better coming up soon; if you keep waiting, you'll be stuck waiting forever. The reasonable approach is to just go ahead and buy the part you want when you feel you need/want it, and repeat the process when the previous stuff isn't good enough anymore.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Kand*
> 
> Wait for DX12?
> 
> DX11 was released in 2008. Seven years later, I'm still waiting.


Zing!









Quote:


> Originally Posted by *Casey Ryback*
> 
> I don't understand.
> 
> DX12 games are scheduled for release at the end of the year. What's the problem with people waiting for it to decide on their upgrades?


People are saying "wait for better drivers", and after 7 years of waiting for AMD to fix their poor performance with DirectX 11, he is suggesting you'll have to wait 7 more years for AMD to fix their issues with DirectX 12. Basically, AMD has a poor history with DirectX, so the whole "wait for DirectX 12" argument is a joke.


----------



## Boomstick727

Quote:


> Originally Posted by *Newbie2009*
> 
> If only Nvidia would do the same for you


Haha


----------



## hawker-gb

Quote:


> Originally Posted by *Darkwizzie*
> 
> We can continue to play the waiting game until we die of old age. The question is what is in the near future. Gains AMD may or may not get in the future at some fuzzy time period makes it hard to continue waiting. The more you wait the closer you get to next gen cards. It's early to be talking about Pascal but then again the same applies to true DX12 games. It's not automatically wiser to decide to wait. Whatever gain the user feels the 980ti has over the Fury X in terms of experience and enjoyment during the time AMD has yet to defeat the 980ti through software has cash value. I'm not claiming the exact opposite of what you're saying either.


I think neither of us will get too old in one month's time.









I was 50:50 (980 Ti : Fury X) before the Fury X launch. Now I am close to buying a Fury X.


----------



## Serandur

Quote:


> Originally Posted by *Darkwizzie*
> 
> I once again go back to the point I made in the past: Nvidia already knew the performance of the Fury X before it launched. They stole the show by having 980ti released before Fury X. The Nvidia fans all ran over there and having bought something they are unlikely to drop it and go for Fury X even if it's faster. And the 980ti ends up faster, but not far faster than it has to be to serve its purpose. It also has more vram than the Fury whereas the 980 by default only has 4. Coincidence? Along with it, the 980 price drop to try to put the Fury X in a harder position, possibly addressing the Fury as well when it comes out in the future. Had Nvidia not come out with the 980TI they would be in trouble and the Fury X would be the bomb. And we couldn't argue and complain about Fury X's vram of only 4 gigs because the 980 has 4 gigs. The Fury X would win with better performance all around but on top of that, extra high 4k performance. No, now the 980ti is out this changes everything. It has turned AMD's reveal from a total victory to what it is right now. This was all pre-planned, Nvidia's strategy to combat the Fury X.
> 
> Had AMD not pushed with their new cards, Nvidia would still be feeding us the Titan X as the only high end card. And I think we should all be thankful that's not the case. I suspect Nvidia has more up their sleeves, they're just not revealing it. They'll just release something slightly faster than whatever AMD offers and call it a day, if push came to shove.
> 
> I rarely make these kinds of predictions/extrapolations but I think I am right this time.


I disagree on the Titan X point. Nvidia didn't need any competition to push out the 780 three months after the original Titan, and I think a 980 Ti in the $650-700 range was inevitable. However, I do think it's possible Nvidia would have cut the GM200 chip in the 980 Ti down a little more if the Fury X weren't imminent. I was expecting a 2560- or 2688-shader part; the 2816 one we got is better.

As for AMD's and Nvidia's reactions to each other: I get the impression AMD don't take what Nvidia have planned into account as much as Nvidia do for AMD. I mean, when the 7970 rolled out for $550, did AMD really not foresee Nvidia making another big-die chip that could crush it (meaning the mid-sized chip would be its real competition), or did they just not care? More relevantly, did AMD really not know GM200 was out there in the wild this past year? We've all known (even just us on forums, let alone a corporation with ties to TSMC) that Maxwell is an efficient monster and that GM200 would be a thing. We've known for nine months that GM204 alone is pretty beastly, and for three months that GM200 (in the Titan X) is an absolute monster in raw performance and efficiency.

It's debatable what AMD could have done after realizing Maxwell would be such a beastly architecture, since it was probably too late in Fiji's development (the chip apparently taped out last year alongside Tonga) and past the decision-making for the rebrand. But AMD still had plenty of time to adjust pricing, work on drivers, roll out at least a full Tonga chip, etc., and if they didn't anticipate Nvidia being able to slightly cut down GM200 and take a hit on their massive profit margins, I don't know what they were thinking. AMD have known for at least a year (if they've been paying any attention) that this would happen, and if they still don't have everything in place for Fiji software-wise, I question the sanity of their management.


----------



## ladcrooks

Quote:


> Originally Posted by *Darkwizzie*
> 
> I once again go back to the point I made in the past: Nvidia already knew the performance of the Fury X before it launched. They stole the show by having 980ti released before Fury X. The Nvidia fans all ran over there and having bought something they are unlikely to drop it and go for Fury X even if it's faster. And the 980ti ends up faster, but not far faster than it has to be to serve its purpose. It also has more vram than the Fury whereas the 980 by default only has 4. Coincidence? Along with it, the 980 price drop to try to put the Fury X in a harder position, possibly addressing the Fury as well when it comes out in the future. Had Nvidia not come out with the 980TI they would be in trouble and the Fury X would be the bomb. And we couldn't argue and complain about Fury X's vram of only 4 gigs because the 980 has 4 gigs. The Fury X would win with better performance all around but on top of that, extra high 4k performance. No, now the 980ti is out this changes everything. It has turned AMD's reveal from a total victory to what it is right now. This was all pre-planned, Nvidia's strategy to combat the Fury X.
> 
> Had AMD not pushed with their new cards, Nvidia would still be feeding us the Titan X as the only high end card. And I think we should all be thankful that's not the case. I suspect Nvidia has more up their sleeves, they're just not revealing it. They'll just release something slightly faster than whatever AMD offers and call it a day, if push came to shove.
> 
> I rarely make these kinds of predictions/extrapolations but I think I am right this time.


well put


----------



## hawker-gb

Quote:


> Originally Posted by *Alatar*
> 
> When it comes to computer hardware the wise people know that waiting doesn't work unless we're talking about a week or two for some certain hardware release.
> 
> There's always something better coming up soon. If you keep waiting you'll be stuck waiting. The reasonable solution is to just go ahead and buy the part you want when you feel like you need/want it and repeat the process when the previous stuff isn't good enough anymore.


I think the jump from my trusty HD 7850 to a Fury X will be a nice leap.


----------



## tconroy135

Quote:


> Originally Posted by *hawker-gb*
> 
> I think neither of us would get too old in one month time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was 50:50 (980ti:Fury X) before Fury X launch. Now i am close to buy Fury X.


This is a lie; if you're still buying the Fury X after the reviews, you were always going to buy a Fury X.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Darkwizzie*
> 
> I once again go back to the point I made in the past: Nvidia already knew the performance of the Fury X before it launched. They stole the show by having 980ti released before Fury X. The Nvidia fans all ran over there and having bought something they are unlikely to drop it and go for Fury X even if it's faster. And the 980ti ends up faster, but not far faster than it has to be to serve its purpose. It also has more vram than the Fury whereas the 980 by default only has 4. Coincidence? Along with it, the 980 price drop to try to put the Fury X in a harder position, possibly addressing the Fury as well when it comes out in the future. Had Nvidia not come out with the 980TI they would be in trouble and the Fury X would be the bomb. And we couldn't argue and complain about Fury X's vram of only 4 gigs because the 980 has 4 gigs. The Fury X would win with better performance all around but on top of that, extra high 4k performance. No, now the 980ti is out this changes everything. It has turned AMD's reveal from a total victory to what it is right now. This was all pre-planned, Nvidia's strategy to combat the Fury X.
> 
> Had AMD not pushed with their new cards, Nvidia would still be feeding us the Titan X as the only high end card. And I think we should all be thankful that's not the case. I suspect Nvidia has more up their sleeves, they're just not revealing it. They'll just release something slightly faster than whatever AMD offers and call it a day, if push came to shove.
> 
> I rarely make these kinds of predictions/extrapolations but I think I am right this time.


I'm not so sure the 980ti was prompted by prior knowledge of Fury's performance; perhaps the release date was influenced somewhat. I agree that Nvidia probably have something up their sleeve just in case. I also don't think the 780ti would have turned up to knock the OG Titan off its perch if it wasn't for Hawaii.

I think the 980ti either had to have 6GB or 3GB of RAM; isn't that how the 384-bit bus works? Otherwise you end up with partitioned memory that doesn't work as well, like on the 660ti. I thought that was why 256 and 512-bit cards almost always have 2/4/8 GB, and 384-bit cards have 3/6/12.
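
That's basically the arithmetic, yeah. A quick Python sketch of it, assuming one GDDR5 chip per 32-bit channel and the 2/4 Gbit chip densities common in 2015 (clamshell mode, which doubles the chip count, and mixed-density setups like the 660ti's 192-bit/2GB are the exceptions):

```python
# Natural VRAM sizes for a GDDR5 card: one chip per 32-bit channel,
# every chip the same power-of-two density.
def vram_options_gb(bus_width_bits, chip_densities_gbit=(2, 4)):
    chips = bus_width_bits // 32                          # one 32-bit channel per chip
    return [chips * d // 8 for d in chip_densities_gbit]  # Gbit -> GB

print(vram_options_gb(256))  # [2, 4]  e.g. GTX 980 with 4 GB
print(vram_options_gb(384))  # [3, 6]  e.g. 980 Ti with 6 GB
print(vram_options_gb(512))  # [4, 8]  e.g. 390/390X with 8 GB
```

The 660ti case is exactly what happens when you break the one-chip-per-channel pattern: part of the memory ends up on a slower partition.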

Myself being 'only' on 1440p, the fact that the Fury X has only 4GB doesn't worry me; it's more that I don't believe it will overclock as well as even a reference 980ti.

I just checked, and here in Finland 980ti reference cards start at 830 euros and go up to over a thousand for some customs, while a Fury X can be had for 729 euros. That seems like a more sensible price difference than them being the same. Still a bit rich for me right now though; hopefully the Fury non-X will be even better value.


----------



## Casey Ryback

Quote:


> Originally Posted by *47 Knucklehead*
> 
> People are saying "Wait for better drivers", and after 7 years of waiting for AMD to fix their poor performance with DirectX 11, he is suggesting that you'll have to wait 7 more years for AMD to fix issue with DirectX 12. Basically, AMD has a poor history of running on DirectX, so the whole argument of "Wait for DirectX 12" is a joke.


I didn't make my point properly.

We are on the brink of new technology, along with DX12.

I'm not saying wait for AMD to get its drivers right, but wait in general.

Pascal and whatever AMD bring to the table next could make even the 980ti look like an old dinosaur, DX12 or not.

I feel like we're at the end of an era, and a new far superior one is fairly close.

That said, if you feel you need an upgrade, by all means go for it.


----------



## hawker-gb

Quote:


> Originally Posted by *tconroy135*
> 
> This is a lie, if you are still buying the Fury X after the reviews, you were always going to buy a Fury X.


If it was significantly slower than the 980ti, I would buy a 980ti.
It's pretty much similar in performance, so I'm buying the Fury X.
Simple.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Darkwizzie*
> 
> I once again go back to the point I made in the past: Nvidia already knew the performance of the Fury X before it launched. They stole the show by having 980ti released before Fury X.
> 
> ...
> 
> Had AMD not pushed with their new cards, Nvidia would still be feeding us the Titan X as the only high end card. And I think we should all be thankful that's not the case. I suspect Nvidia has more up their sleeves, they're just not revealing it. They'll just release something slightly faster than whatever AMD offers and call it a day, if push came to shove.


Except you are missing something here.

nVidia released the Titan X WAY BEFORE much was known about Fury X. nVidia was planning the Titan X the day after the previous Titan was released.

Everyone knew nVidia was coming out with a Ti card since the whole Maxwell line was introduced. AMD has ZERO to do with that.

The only thing that really changed was that nVidia used a cut down Titan X as the Ti instead of using an upgunned GTX 980 as the Ti.

Once again, now that they got beat AGAIN (G-Sync STILL destroys FreeSync despite all the promises ... Mantle was a flop and DirectX won ... and I won't even get into AMD's CPUs), AMD people are running around with the "but, but, but ... if it wasn't for AMD, nothing new would have happened" line. I'm sorry, that is getting to be BULL. AMD has once again proven they are all hype and no delivery.

Does AMD have a MINOR influence on things? Sure. But you and others blow that influence WAY out of proportion.


----------



## tconroy135

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I'm not so sure that 980ti was prompted by prior knowledge of Furys performance, perhaps the release date was influenced somewhat. I agree that Nvidia probably have something up their sleeve just in case. I also don't think the 780ti would have turned up to knock the OG Titan off it's perch if it wasn't for Hawaii.
> 
> I think the 980ti either had to have 6GB or 3 GB of RAM, isnt that how the 384 bit bus works? Or you end up with partitioned memory that doesnt work as well like the 660ti? I thought that was why 256 or 512 bit cards almost always have 2/4/8 GB, and 384 bit cards have 3/6/12.
> 
> Myself being 'only' on 1440p the fact that Fury X has only 4GB doesnt worry me, it's more the fact that I dont believe it will overclock as well as even a reference 980ti.
> 
> I just checked and here in Finland 980ti ref cards start at 830 euro and go up to over a thousand, and a Fury X can be had for 729 Euro. That seems like a more sensible price difference than them being the same. Still a bit rich for me right now though, hopefully the Fury non X will be even better value


If the Fury X had better performance than it does, NVIDIA could have released the 980TI with the full die. The yield would be lower and thus cut into their profits, but the performance would have been increased.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Casey Ryback*
> 
> I didn't make my point properly.
> 
> We are on the brink of new technology, along with DX12.
> 
> I'm not saying wait for AMD to get it's drivers right, but wait in general.
> 
> Pascal and whatever AMD bring to the table next could make even the 980ti look like an old dinosaur, DX12 or not.
> 
> I feel like we're at the end of an era, and a new far superior one is fairly close.
> 
> That saying if you feel you need an upgrade by all means go for it.


Ah, ok.

Yeah, I am waiting for Pascal.

I got sick of waiting for FreeSync and jumped on board the G-Sync train.

Sometimes you can wait, other times you have to stop waiting and go with what works now.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *tconroy135*
> 
> If the Fury X had better performance than it does, NVIDIA could have released the 980TI with the full die. The yield would be lower and thus cut into their profits, but the performance would have been increased.


Yup. Basically just make a Titan X with 6GB of memory.

nVidia had a ton of options.


----------



## Tivan

Quote:


> Originally Posted by *Alatar*
> 
> When it comes to computer hardware the wise people know that waiting doesn't work unless we're talking about a week or two for some certain hardware release.
> 
> There's always something better coming up soon. If you keep waiting you'll be stuck waiting. The reasonable solution is to just go ahead and buy the part you want when you feel like you need/want it and repeat the process when the previous stuff isn't good enough anymore.


Agreed, though the most cost efficient path is to go for second hand/1 generation older stuff right now. So if paying a premium for novelty anyway, might as well pick whichever novelty promises more lasting value. Which is entirely subjective/speculation.


----------



## BoredErica

Quote:


> Originally Posted by *Serandur*
> 
> I disagree on the Titan X point. Nvidia didn't need any competition to push out the 780 3 months after the original Titan and I think a 980 Ti in the $650-700 range was inevitable. However, I do think it's possible Nvidia would have cut down the GM200 chip in the 980 Ti a little more if the Fury X wasn't imminent. Was expecting a 2560 or 2688 shader part; the 2816 one we got is better.
> 
> About AMD and Nvidia's reactions to each other; I get the impression AMD don't seem to take into account what Nvidia have planned as much as Nvidia do for AMD. I mean, when the 7970 rolled out for $550, did AMD really not foresee Nvidia making another big-die chip that could crush it (meaning the mid-sized could be its competition) or did they just not care? More relevantly, did AMD really not know GM200 was out there in the wild this past year? We've all known, just us on forums let alone a corporation with ties to TSMC, that Maxwell is an efficient monster and that GM200 would be a thing. We've known for 9 months that GM204 alone is pretty beastly and we've known for three months that GM200 (in the Titan X) definitely is an absolute monster in raw performance and efficiency.
> 
> It's debatable what AMD could have done after realizing Maxwell would be such a beastly architecture since it was probably too late into Fiji's development (considering the chip actually taped out last year alongside Tonga, apparently) and past the decision-making for the rebrand, but AMD still had plenty of time to adjust pricing, work on drivers, roll out at least a full Tonga chip, etc. and if they didn't anticipate Nvidia being able to just slightly cut down GM200 and take a hit on their massive profit margins, I don't know what they were thinking. AMD have known for at least a year (if they've been paying any attention) that this would happen and if they still don't have everything put into place for Fiji software-wise after realizing it, I question the sanity of their management.


Yeah, and I think the 390 having 8 gigs of vram, for example, while the Fiji card only has 4 gigs wasn't due to lack of communication within the company; it's just a roadblock they couldn't get past no matter how hard they tried. Maybe they thought they'd have 8GB, but something went south and they could only turn out 4 gigs? I think AMD also knew on some level that whatever Nvidia was going to bring to the table, they just couldn't really counter it effectively. The Fury X came out pretty late, and now Hawker is essentially saying we have to wait another month (which is not a promise but a guess) for whatever secret sauce AMD can put inside the software to make the Fury X better. If there are more serious improvements to be had, I think the late release of the Fury X wasn't an accident (or just because AMD wanted to get rid of old stock); AMD's been working hard to get everything ready and needed that extra time.

About the 780, I didn't think about that, but I think the Titan had a decent lead over the 780. I was thinking along the lines of 780ti-after-Titan vs 980ti-after-Titan X. The 780ti took 8 months to show up after the Titan, but it also had a decent lead over it, while the 980ti only took 3 months but doesn't surpass the Titan X. So, ok.

While I am thinking about vram for Skyrim soon I will have to think about vram for Fallout 4!


----------



## Slink3Slyde

Quote:


> Originally Posted by *tconroy135*
> 
> If they Fury X had better performance than it does NVIDIA could have released the 980TI with the full die. The yield would be lower and thus cut into their profits, but the performance would have been increased.


I was thinking more that if the Fury X had beaten the 980ti significantly it would also likely have beaten the stock Titan X, which would have made it pointless to release a full die. I still think they have something bigger up their sleeve just in case. As you say, there's probably less profit there for them, so now they don't need it this gen.


----------



## Forceman

Quote:


> Originally Posted by *flopper*
> 
> 30% amd benefit from win 10 in project cars.
> thats like a new card.


People keep using pCars as the poster child for DX12 gains, but have you seen AMD performance in that game? 30% improvement just gets them back to where they should be in line with other games. Expecting that kind of jump on a game where they aren't already abysmally behind is kind of a stretch.
Quote:


> Originally Posted by *Ganf*
> 
> HAHAHA! Nope.
> 
> Both companies are supporting different sets of DX12 features. Performance will still vary widely from card to card based upon it's DX12 feature implementation, and how the devs use those features.


But Nvidia is the one with better developer relations. How many Gaming Evolved games are there compared to Gameworks/TWIMTBP? Do you think Gameworks will be taking full advantage of asynchronous compute if Nvidia is weak there?

On a related note, what kind of DX12 feature level do the consoles have?


----------



## ondoy

XSPC also reveals water block for Radeon R9 Fury X






EK Water Blocks Ready with its Radeon R9 Fury X Water Block, Single-slot Capable



not sure if this has been posted here...


----------



## Casey Ryback

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Ah, ok.
> 
> Yeah, I am waiting for Pascal.
> 
> I got sick of waiting for FreeSync and jumped on board the G-Sync train.
> 
> Sometimes you can wait, other times you have to stop waiting and go with what works now.


I'm on the fence with all these cards currently, I almost bought a 970 when it came out, then I almost picked up a cheap sapphire tri-X 290.

I usually go for the mid range type cards hoping for value at 1080p.

It's actually the software more than anything that has turned me away from doing it, if my 7970 crumbles with star citizen (which it no doubt will) I'm pretty sure I'll be straight down the shop buying whatever is available at the time.

I'm not flying a fancy ship at low framerates


----------



## BoredErica

That top water block looks like it's a ton of copper?

Anyways, maybe scratch the Titan X comment I made, but I think at least the rest of the comment rings true.

Quote:


> Originally Posted by *Casey Ryback*
> 
> I'm on the fence with all these cards currently, I almost bought a 970 when it came out, then I almost picked up a cheap sapphire tri-X 290.
> 
> I usually go for the mid range type cards hoping for value at 1080p.
> 
> It's actually the software more than anything that has turned me away from doing it, if my 7970 crumbles with star citizen (which it no doubt will) I'm pretty sure I'll be straight down the shop buying whatever is available at the time.
> 
> I'm not flying a fancy ship at low framerates


YOLO get whatever you want.


----------



## jomama22

Not sure why so many people are surprised at the performance tbh. We're talking a very similar GCN arch and a 45% increase in SPs over the 290x, but with the same ROP count.

It's like growing a foot taller but still sitting in a child's car seat.

I dunno, I said a few times here to expect Fiji to be 5-10% slower than the 980ti, but I was hoping for a $550 release point as that would be the best marketing price.

Drivers will help a bit in certain circumstances with some games (especially those where we don't see much spread between the 290x and Fiji) but I don't expect gains to be anything like the 7970 saw in its first 6 months.

Personally, I'm not so much disappointed as underwhelmed. Had the 290x had the same well-worked hype and announcement that Fiji had, it probably would have been received a bit better, as the 290x really brought high-end competition to nvidia and the then-780ti.

Ah the days when the 290x came out and me and alatar had a fun back and forth with his titan. Can't see me being able to do that with the fury x unfortunately.

If I were in for any new gfx cards, it's hard not to get a 980ti, especially if you plan to put water blocks on them anyway. That's where it comes down for me.


----------



## ZealotKi11er

I know a lot of people like big cards, but having one of these for an ITX build with a custom WB would be nice. I have been playing with an R9 290X @ 1200/1500 @ 1440p; in games where CFX support is not there, I am happy with the performance. An extra 40-50% would make it ideal for me.


----------



## Casey Ryback

Quote:


> Originally Posted by *Darkwizzie*
> 
> YOLO get whatever you want.


I'm too indecisive so don't know what I want lol.

Fury on air for $500 is my hope at this point. Any more than that for 1080p just doesn't seem justifiable.


----------



## blue1512

Quote:


> Originally Posted by *Tivan*
> 
> The problem is, you can't imply that AMD or Nvidia will potentially benefit more or not than each other from a lower overhead API, regardless of how well founded or not your stance is. Because it's not out, and the other side will undoubtedly retort with 'yeah same on this side, DX12 is great for our camp'.
> 
> 'Wait and see' should be the standard answer for this, which is perfectly reasonable.


You can't compare across platforms yet, true. But the fact that AMD sucks at draw calls in DX11 and bosses it in DX12 implies a significant performance boost.


----------



## ZealotKi11er

HD 4870 512MB GDDR5 -> 800 SP
HD 5870 1GB GDDR5 -> 1600 SP

Fury X 4GB HBM1 -> 4096SP
Next Gen GPU 8GB -> 8192SP

The good old days.


----------



## Klocek001

I still consider the 980ti the ultimate dx11 gpu; that's just undeniable. But I'll wait for the first dx12 gaming benches to see if it's worth the price of two new 390s/295x2.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> I still consider 980ti the ultimate dx11 gpu, that's just undeniable. but I'll wait until first dx12 gaming benches to prove if it's worth the price of two new 390s/295x2


Very true. I've also been hearing that DX11 is a bit better in Windows 10 for AMD.


----------



## Klocek001

And, I cannot stress this enough: for dx11 on win8 the 980ti just shreds every other gpu to pieces, including the 295x2, because avg fps isn't what matters the most. But dx12 promises native cf support, which might put the 295x2 in the position the 980ti now holds against the 290x (performance wise, not power efficiency wise, that is).

In the meantime I'll have plenty of time to improve my storage with another SSD (or two); I don't want to hurry a GPU purchase with dx12 coming up soon. That would just be unreasonable in my eyes.


----------



## sugalumps

Quote:


> Originally Posted by *Tivan*
> 
> There's just no real hook to make fun off.


"Overclockers dream?"


----------



## Dhoulmagus

I think it's right where it was honestly expected. Typical AMD, drivers and dx12 will probably give the card a decent boost while the price drops from their initial overpricing to squeeze out a few extra bucks from the people who need it right now. I'm not disappointed because this is what we all knew it would be, that's why myself and quite a few other users who were waiting decided to grab dirt cheap 290s and wait til next year.

It's the first iteration of HBM, and let's not forget that Nvidia is also switching over to this new tech next year, so in the end all of this pays off for the consumer, though AMD does like to gamble its money much further into R&D than they probably should at this point. I think it was a bit crazy to expect these to come out and devastate even Titan X/980Ti performance. GDDR5 is mature; those are fantastically clocked cards on an excellent architecture, while HBM is literally just seeing its first day on the shelves. Unfortunately progress on silicon is becoming quite the struggle. We all knew HBM would only remove the memory bandwidth limitation, though it seems some people really expected a monster GPU that would take down the top cards; we'll have to wait for 2016 for *hopefully* beefier GPUs.
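
For reference, the bandwidth side of that is simple arithmetic; a rough sketch using the published figures (Fury X: four 1024-bit HBM1 stacks at 1 Gbps effective per pin; 980 Ti: 384-bit GDDR5 at 7 Gbps):

```python
# Peak memory bandwidth: bus width in bytes x effective per-pin data rate.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(4096, 1.0))  # Fury X HBM1 -> 512.0 GB/s
print(peak_bandwidth_gbs(384, 7.0))   # 980 Ti GDDR5 -> 336.0 GB/s
```

So HBM1 gets its ~50% bandwidth lead from sheer bus width at a low clock, which is also where the power savings come from.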

If you need to pinch pennies I would highly suggest grabbing 290s and waiting for pascal/HD4xx Radeon for the major upgrades.

I do understand the disappointment, but we have to realize that literally everything was speculation. I'm also not too happy with AMD overall either; ~5 years ago most of us were drooling over HD5xxx, and Phenom II was bang-up performance for an ultra low price. I'll wait until Zen hits the shelf before I jump into the M&A chat though.

I also believe this card would be a major success if it could get down to $499, especially if an 8GB revision can make it to market before Pascal and kill off the low-memory argument. I just want to see it be a stable card that proves HBM is truly the new generation of memory. AMD needs sales revenue, not ultra-enthusiast price tags. Drop the price!


----------



## Mrzev

Quote:


> Originally Posted by *Cool Mike*
> 
> Agreed. At 4k it beats the Ti and I game at 4K. I will receive my fury tomorrow. I also have a EVGA TI and look forward to benchmarking both.


If you're using HDMI, the Fury's HDMI 1.4a is stuck at 30 Hz at 4K (30 FPS), while the Ti's HDMI 2.0 does 60 Hz (60 FPS). If you're using DisplayPort you're fine in either case.
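
That 4K ceiling falls straight out of the TMDS clock limits (340 MHz for HDMI 1.4, 600 MHz for 2.0); a quick check using the standard CTA-861 total timing for UHD (4400x2250 including blanking):

```python
# Can a given HDMI version drive a mode at 8 bpc RGB?
# Limits are the max TMDS pixel clocks of each spec, in MHz.
HDMI_MAX_MHZ = {"1.4": 340, "2.0": 600}

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

def hdmi_supports(version, h_total, v_total, refresh_hz):
    return pixel_clock_mhz(h_total, v_total, refresh_hz) <= HDMI_MAX_MHZ[version]

# UHD 3840x2160 (total timing 4400x2250 with blanking):
print(hdmi_supports("1.4", 4400, 2250, 30))  # True  (297 MHz)
print(hdmi_supports("1.4", 4400, 2250, 60))  # False (594 MHz)
print(hdmi_supports("2.0", 4400, 2250, 60))  # True
```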


----------



## Tivan

Quote:


> Originally Posted by *sugalumps*
> 
> "Overclockers dream?"


Because a simple MSI AB update wouldn't fix the lack of voltage adjustment, right.

I'd expect it to be fun to clock once that's available. Maybe not beating Nvidia, but yeah~

edit: it's just hard to put your finger on one spot of criticism that sticks throughout the whole video. A lot of smaller points are harder to make fun of than one slightly bigger point. So while the video might have criticism as valid as the nvidia version's, it's just less funny. = D


----------



## undeadhunter

I feel sorry for the people who want to defend the "waiting game". The reality is that while you are "waiting", whoever bought a decent product has already enjoyed it for that time, and that's priceless. There is really no profit in waiting months or years on promises or dreams when you can get stuff done now with a proven, working product. It's like real life: people never get stuff done while waiting for "miracles" or opportunities... lol. You've got to get stuff done with what you have at the moment and make the best of it.

In this case nvidia covered their bases with products at different price points and performance levels. Expensive? Overpriced? Yes, but it was available and working while the competition was selling teasers to what we got yesterday (Fury). That's why the Fiji launch was a flop: not so much the performance, but the bad marketing, hype, and terrible PR. The reality is Fury took a while to get here, and I'm sorry fellas, it's nothing revolutionary. Yes, it is a good card and gets the job done, but for the price, and considering you could get a 980 ti weeks ago for the same money, plus it forcing you into a CLC solution on release date, no hdmi 2.0, and less vram, you do the math. The card should have launched at $550-$600 tops. Fiji looks like a test bed for HBM to me.

I love amd, had a 290x for a while, and really had high hopes for this launch, but it's not what it was hyped to be (overclockers dream, yeah right lol). It could have been awesome if priced right, but that's not the case right now. Maybe as prices settle, voltage gets worked out, and some drivers land here and there it will make a bit of difference, but that won't change the bad impression it left on most of us, sadly. Hopefully AMD will get their stuff together for their next launch.


----------



## Alatar

Quote:


> Originally Posted by *Tivan*
> 
> Because a simple MSI AB update wouldn't fix the lack of voltage adjustment, right.
> 
> I'd expect it to be fun to clock once that's available. Maybe not beating Nvidia, but yeah~
> 
> edit: it's just hard to put your finger on one spot of criticizing that sticks, throughout the whole video. A lot of smaller points is harder to make fun off than one slightly bigger point.


I'm assuming you're quoting someone else because I never posted that


----------



## Tivan

Quote:


> Originally Posted by *Alatar*
> 
> I'm assuming you're quoting someone else because I never posted that


Indeed, not sure how that happened! Fixed.

edit:
Quote:


> Originally Posted by *undeadhunter*
> 
> ...


Power to you, if you enjoy your purchases. c:
I'm not sure how any of that turns into a criticism of waiting for prices, performance, and other values to be at the right spot, for each individual, but yeah. I agree that sooner and better product launches are preferable.


----------



## DFroN

Card looks great with the single slot WB


----------



## 21cage12

I think the thread needs to shut down for CLEANING: the future is not looking good if it stays like this.


----------



## CrazyElf

As Blameless and a few others have noted, I think the big problems are:


Overemphasis on FP64 (not needed for games and uses power).
Needed 96 ROPs - they scaled up the SP and TMUs but not the ROPs.
People were expecting a miracle from HBM. Although it has indirectly helped by lowering power consumption, memory bandwidth was not the miracle solution that many were hoping for.
The card is not a total failure - it just needs to be priced at perhaps $550 or $500 USD and then it will be fine. Perhaps if AMD could release an 8GB version with 96 ROPs, it might help things.
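
On the ROP point above, a back-of-envelope pixel fill rate comparison (clocks are the stock figures; the 96-ROP Fiji is purely hypothetical):

```python
# Theoretical pixel fill rate: ROP count x core clock.
def fillrate_gpix_s(rops, core_mhz):
    return rops * core_mhz / 1000

print(fillrate_gpix_s(64, 1050))  # Fury X: 67.2 Gpix/s
print(fillrate_gpix_s(96, 1000))  # 980 Ti at base clock: 96.0 Gpix/s
print(fillrate_gpix_s(96, 1050))  # hypothetical 96-ROP Fiji: 100.8 Gpix/s
```

Despite Fiji's huge shader advantage on paper, its back-end throughput trails a 980 Ti by a wide margin, which lines up with the scaling complaints in this thread.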

I think though there's a far more interesting question - buy now or wait?


We know that 16nm GPUs are probably happening in H2 of 2016, unless there are further issues with the 16nm node.
It will probably be a 300mm^2 chip at first, but still the sheer number of transistors will probably outperform current GPUs substantially, even with the declining marginal benefit per node.
We also know that both sides are planning HBM2 and a few other technologies.
So I think a moderate performance gain is expected. A "big" 16nm chip is probably not happening until 2017 though.

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I think people that are expecting driver improvements like the 7970 are going to be sorely disappointed. There were a lot of optimizations to be had for the 7970s because it was using the entirely new GCN architecture after having used VLIW for quite some time. That gave a lot of headroom in terms of what they had for optimizing for the best performance. Fiji is just another slightly newer, but still based on the same GCN base architecture to begin with. So a VAST majority of optimizations are going to have already been made for GCN. There might be some, but it isn't going to be on the same level. Or at least common sense and history shows this to be true. Kinda why I laugh that people think Kepler over for Nvidia has a ton of optimization still left. That architecture is old. At this point, they've already probably gotten about 98% of what it has to offer. Don't expect older architectures to perform up to par with a newer architecture that still has a lot of optimization headroom to begin with.


Sadly this.

I don't think either side will get much in the way of optimizations from Drivers. GCN is now a very mature architecture, as is Maxwell (been out since the 750Ti).

So I think that we should buy based on what we have now. We may see single digit improvements, but that is about it.

The 4GB VRAM is also a problem for the Fury X - it's just not going to age as well as other AMD cards have. For the 7970, by the time a game needed more than 3GB of VRAM, the core probably couldn't render at a satisfactory frame rate anyway, unless you had maybe 3-4 GPUs (there were a couple of 6GB 7970s and one 8GB 290X). With this chip though, at 4GB, you will run out of VRAM before the core cannot take it anymore.

Quote:


> Originally Posted by *tconroy135*
> 
> If the Fury X had better performance than it does, NVIDIA could have released the 980TI with the full die. The yield would be lower and thus cut into their profits, but the performance would have been increased.


Quote:


> Originally Posted by *47 Knucklehead*
> 
> Yup. Basically just make a Titan X with 6GB of memory.
> 
> nVidia had a ton of options.


This is kind of off-topic, but does anyone anticipate this happening?

I know that Nvidia would not have as much incentive to do this now that AMD has played its hand (the 780Ti, I imagine, was in no small part due to the 290X release), but say, 5-6 months from now as yields improve, would it be possible for Nvidia to release a 3072-core variant of the 980Ti for, say, $700 USD? Yields would have improved on the Titan X, and 28nm is pretty mature anyways.

It would only perform about 3% faster than the 980Ti (about the same performance as Titan X, save where the GPU actually needs the 12GB of VRAM).

Quote:


> Originally Posted by *edo101*
> 
> we've helped them and continued to help them by buying. Vicious cycle.


Basically this:


Less competitive product
Means you lose market share
This leads to less money for R&D
In turn, this leads to even worse products in the future

I am afraid that is happening right now.


----------



## Casey Ryback

Quote:


> Originally Posted by *undeadhunter*
> 
> I Feel sorry for the people who want to defend the "waiting game"


I feel sorry for you thinking you know everybody's situation, I literally don't play any games where my 7970 struggles at 1080p.

I don't plan on buying GTAV, because GTA4 was terribly boring. Witcher doesn't interest me, the list goes on.

The only reason I've even considered upgrading is because of the upgrade bug that bites every now and then.

I'll be playing star citizen when it's released so I can afford to wait, each to their own.

So please don't feel sorry for people who want to wait, or defend the waiting game.

Their choice is about them, not you.

And nice wall of text lol.


----------



## sugalumps

Quote:


> Originally Posted by *Casey Ryback*
> 
> I feel sorry for you thinking you know everybody's situation, I literally don't play any games where my 7970 struggles at 1080p.
> 
> I don't plan on buying GTAV, because GTA4 was terribly boring. Witcher doesn't interest me, the list goes on.
> 
> The only reason I've even considered upgrading is because of the upgrade bug that bites every now and then.
> 
> I'll be playing star citizen when it's released so I can afford to wait, each to their own.
> 
> So please don't feel sorry for people who want to wait, or defend the waiting game.
> 
> Their choice is about them, not you.
> 
> And nice wall of text lol.


I think he meant the waiting game as in people waiting because they think the Fury X is somehow going to get a lot better performance-wise (like AMD are going to unlock some hidden power level) instead of jumping on the Ti, not people that have no interest in buying a card atm and are just waiting altogether.


----------



## Klocek001

Quote:


> Originally Posted by *undeadhunter*
> 
> I Feel sorry for the people who want to defend the "waiting game" reality is while you are "waiting" to get stuff , whoever bought a decent product enjoyed it already for a set time and that it's priceless , there is really no profit to wait months or years on promises or dreams when you can get stuff done now when there is a proven working product. It's like real life, people who never get stuff done while waiting for "miracles" or opportunities... lol. Gotta get stuff done with what you have at the moment and make the best of it, in this case nvidia covered their bases with products on different price points and performance, expensive? overpriced? yes but it was available and working while the competition was selling teasers to what we got yesterday (Fury) and that's why Fiji launch was a flop, not due to performance that much, it was bad marketing, hype and a terrible PR. Reality here is Fury took a while to get here, and im sorry fellas is nothing revolutionary. Yes it is an good card and gets the job done but for the price and considering you could get a 980 ti weeks ago for the same price... the card should have launched at a 550$-600$ tops, on top of it forcing you to go with a clc solution on release date, no hdmi 2.0, with less vram, you do the math. Fiji looks like a test bed for HBM for me.I love amd and had a 290x for a while and really had high hopes for this launch, but it's not what it was hyped to be right now (overclockers dream, yeah right lol) could have been awesome if priced right but that's not the case right now. maybe as prices settle, voltage gets worked and some drivers here and there will make a bit of difference, but that will not change the bad impression it left on most of us sadly. Hopefully AMD will get their stuff together for their next launch


That was an unnecessary and unrealistic generalization. And since you seem to bring some empathy into the discussion, then feel sorry for those who bought a 780 Ti for $600 in early 2014 and sold it for $350 in the fall, because I got my 290 for $240 in the summer and it now seems better than the 780 and close to the 780 Ti in DX11, with better DX12 support as well. It's not like all of us have to upgrade NOW; we do have working rigs that play games too. The 980 Ti isn't the lord saviour for me ATM.
Fury (non-X) will be cheaper, with better VRM cooling, and its 3584 SPs might not make that much of a difference either, since performance doesn't seem to scale that well on the Fury X with 4096 of them.


----------



## Wishmaker

I think AMD should add a new sticker on their Fury X Box :

'Wait for DX12, when Fury X goes 超サイヤ人'







.

PS: Super Saiyan


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> I think he meant the waiting game as in people who are waiting because they think the Fury X is somehow going to get a lot better performance-wise (as if AMD is going to unlock some hidden power level) instead of jumping on the Ti, not people who have no interest in buying a card at the moment and are just holding off altogether.


I do have interest in buying a card at the moment, and would if I saw something I liked, but I'll wait instead, as neither company currently fulfills my needs.









The moral of the story is that if people want to have faith that the AMD cards will get better, let them; the product is already close enough, even without improvements, that there's no reason to judge them for their purchase.

I see people buying weaker or more expensive Nvidia cards over AMD cards at lower price points constantly, but if someone picks a Fury X with some hope of improvements, they get labelled all sorts of things.

Why should people get a ti if they don't want a ti?

Just like why do people pick a $330 970, when you can get an R9 290 for $240?

Why do people pick a 750ti over an AMD 270/270X?

It's personal choice. There's nothing wrong with people buying the Fury X (even though I wouldn't personally); it's up to them.

Claiming they are waiting for performance increases just means they're hoping for a performance bump here or there, or with DX12. Nothing wrong with that.


----------



## undeadhunter

Quote:


> Originally Posted by *sugalumps*
> 
> I think he meant the waiting game as in people who are waiting because they think the Fury X is somehow going to get a lot better performance-wise (as if AMD is going to unlock some hidden power level) instead of jumping on the Ti, not people who have no interest in buying a card at the moment and are just holding off altogether.


You, sir, win! Exactly what I meant.







my grammar fails : /


----------



## Tivan

Quote:


> Originally Posted by *sugalumps*
> 
> I think he meant the waiting game as in people who are waiting because they think the Fury X is somehow going to get a lot better performance-wise (as if AMD is going to unlock some hidden power level) instead of jumping on the Ti, not people who have no interest in buying a card at the moment and are just holding off altogether.


But if you expect the Fury X to get better in performance, and are in the market for a $650 card in the ballpark of a 980 Ti, then why even wait?

As for people waiting for DX12 performance figures, maybe they just want to play DX12 games.


----------



## undeadhunter

Quote:


> Originally Posted by *Casey Ryback*
> 
> I feel sorry for you thinking you know everybody's situation, I literally don't play any games where my 7970 struggles at 1080p.
> 
> I don't plan on buying GTAV, because GTA4 was terribly boring. Witcher doesn't interest me, the list goes on.
> 
> The only reason I've even considered upgrading is because of the upgrade bug that bites every now and then.
> 
> I'll be playing star citizen when it's released so I can afford to wait, each to their own.
> 
> So please don't feel sorry for people who want to wait, or defend the waiting game.
> 
> Their choice is about them, not you.
> 
> And nice wall of text lol.




*I think he meant the waiting game as in people who are waiting because they think the Fury X is somehow going to get a lot better performance-wise (as if AMD is going to unlock some hidden power level) instead of jumping on the Ti, not people who have no interest in buying a card at the moment and are just holding off altogether.*

Suga explained my feelings in a sentence there without the need of wall text!







And yes, you are completely right my friend, the choice is a personal thing for everybody.


----------



## blue1512

Quote:


> Originally Posted by *Tivan*
> 
> But if you expect the FuryX to get better in performance, and are in the game for a card for 650 dollars in the ballpark of a 980ti, then why even wait!
> 
> As for people waiting for DX12 performance figures, maybe they want to play DX12 games.


Because they hate Nvidia's guts? Gimpworks and Physick, for example.

Did you know that Project CARS already has a DX12 patch?

Did you know that DX12 will be a must to save the current console generation? And that those consoles are powered by GCN architecture?


----------



## ZealotKi11er

I knew HBM1 would not be as big of a performance bump, because going from the GTX 780 Ti to the GTX 980 Ti all Nvidia added was delta color compression: same 384-bit bus, same 7GHz memory, yet a 20-30% increase in effective memory bandwidth.
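The bandwidth arithmetic behind that comparison can be sketched in a few lines of Python. The 25% compression savings used below is just the midpoint of the 20-30% range quoted above, not a measured value:

```python
def raw_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Raw memory bandwidth in GB/s: (bus width in bits * per-pin rate in Gb/s) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# GTX 780 Ti / 980 Ti: 384-bit bus, 7 Gb/s GDDR5 -> 336 GB/s raw
gm200 = raw_bandwidth_gbs(384, 7.0)

# Fury X: 4096-bit HBM1 at 1 Gb/s per pin -> 512 GB/s raw
fiji = raw_bandwidth_gbs(4096, 1.0)

# Delta color compression leaves raw bandwidth unchanged but reduces traffic,
# so effective bandwidth is roughly raw * (1 + savings); ~25% assumed here.
effective_gm200 = gm200 * 1.25

print(gm200, fiji, effective_gm200)  # 336.0 512.0 420.0
```

So even with compression, the 980 Ti's effective figure sits well below Fiji's raw number, which is why the comparison above is about bandwidth being less of a bottleneck than expected rather than about who has more of it.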


----------



## Krusher33

I have the upgrade bug too but I can't justify buying anything at the moment when all my games are playing fine right now. Other than the pursuit of performance of course.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Krusher33*
> 
> I have the upgrade bug too but I can't justify buying anything at the moment when all my games are playing fine right now. Other than the pursuit of performance of course.


I think the problem is what comes next that's worth upgrading for.


----------



## sugalumps

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think the problem is what comes next that's worth upgrading for.


A bit premature, I know, but when are both camps' next big GPU updates due? I'd coast by on a 980 if they're both set for something next year, but if it's early 2017 or later, I'd probably upgrade to a Fury/Ti soon after seeing these magical DX12 gains on both sides.


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> A bit premature, I know, but when are both camps' next big GPU updates due? I'd coast by on a 980 if they're both set for something next year, but if it's early 2017 or later, I'd probably upgrade to a Fury/Ti soon after seeing these magical DX12 gains on both sides.


Pascal featuring HBM2 is set to debut in 2016. Not sure which quarter.

http://blogs.nvidia.com/blog/2015/03/17/pascal/

A GTX 980 will do more than coast by lol (unless of course you have a 4K or multi-monitor setup)

AMD is planning 2016 releases too.


----------



## sugalumps

Quote:


> Originally Posted by *Casey Ryback*
> 
> Pascal featuring HBM2 is set to debut in 2016. Not sure which quarter.
> 
> http://blogs.nvidia.com/blog/2015/03/17/pascal/
> 
> A GTX 980 will do more than coast by lol (unless of course you have a 4K or multi-monitor setup)


Nope, just 1440p 60Hz; you're right, it does well. Turn a setting or two down here and there and 60 fps is easy. Not sure why I even want to upgrade; guess it's just for the sake of it, something to do. This hobby is dangerous: the second a new GPU is out, yours starts to feel ancient in your mind


----------



## blue1512

Anyone here enjoy a plot twist?
Quote:


> Just putting this out there. Here's a review that seems quite different from the other ones: https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml This one actually shows the Fury beating the 980 Ti in a lot of tests, and even the Titan X in some. Of course, at lower resolutions Nvidia still wins a lot of the time, but it's not as bad as in some other reviews. Now here's a possible reason why. The driver version used in this review is 15.15-180612a-18565BE, which the reviewer was sent by AMD on June 18th. The press driver on AMD's FTP server is 15.15-150611a-185358E. I think this is probably the reason for the inconsistency.


Somebody is going to get fired at AMD; maybe there's an Nvidia double agent there lol (980 Ti timing, anyone?)


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> Nope, just 1440p 60Hz; you're right, it does well. Turn a setting or two down here and there and 60 fps is easy. Not sure why I even want to upgrade; guess it's just for the sake of it, something to do. This hobby is dangerous: the second a new GPU is out, yours starts to feel ancient in your mind


lol imagine how I feel running a 7970, honestly though it's been such a good card.


----------



## 47 Knucklehead

Quote:


> Originally Posted by *Casey Ryback*
> 
> I'm on the fence with all these cards currently, I almost bought a 970 when it came out, then I almost picked up a cheap sapphire tri-X 290.
> 
> I usually go for the mid range type cards hoping for value at 1080p.
> 
> It's actually the software more than anything that has turned me away from doing it, if my 7970 crumbles with star citizen (which it no doubt will) I'm pretty sure I'll be straight down the shop buying whatever is available at the time.
> 
> I'm not flying a fancy ship at low framerates


If you are still rocking a single 1080p monitor, then honestly, I wouldn't touch the Fury X with a 10-foot pole.

It performs relatively poorly at 1080p; at 1440p and 4K it does better.

Honestly, I'd sit tight on the 7970 until AMD (or Nvidia) comes out with their HBM gen 2 cards (Pascal and whatever AMD is going to call theirs). If you do run into trouble with Star Citizen, pick up a second 7970 for around $120 and Crossfire it.

That's just my 2 cents.


----------



## ladcrooks

I might get this card as it does well at 4K and should be even better for 3440x1440. At the moment I am sitting on the fence: 4K or 3440x1440?

4K is too demanding, but I like the idea of a 40"+ screen. Oh, decisions, decisions.


----------



## Casey Ryback

Yeah, there's no way I'd buy the Fury X, although maybe the air version.

My reasoning was to maybe get a 1440p monitor for Star Citizen and try to really enjoy PC gaming again. I also have an Xbox One but never even turn it on.

My board can run two cards at 8x/8x, but honestly I'm hesitant to run Crossfire. My PSU also isn't good enough, and I've already got another Antec TP 650 sitting around; buying a third unit is not on the cards. The one I have can power any single GPU anyway.

I also figure that if a single card is good enough, I may as well just buy it. I really can't be bothered risking a noisy or faulty second-hand card (plus Crossfire will be loud).

Considering all this, I'm thinking the air-cooled Fury could be a decent buy at the right price, and 4GB of VRAM will be plenty for 1440p too.









edit: wow, Star Citizen isn't having its official release until the end of 2016! Sigh.

Might play the waiting game people despise so much lol. HBM2 card > 1440p > Star Citizen > epic PC gaming?


----------



## sugalumps

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yeah there's no way I'd buy the fury X, although the air version maybe.
> 
> My reasoning was to maybe get a 1440p monitor for star citizen and try to really enjoy pc gaming again, I also have an xbox one but never even turn it on.
> 
> My board can run two cards at 8x/8x but I'm hesitant to run xfire honestly, also my PSU isn't good enough, and I've already got another antec TP 650 sitting around too. Buying a third unit....nope not on the cards. The one I have can power any single gpu anyway.
> 
> I also figure if there's single cards that are good enough I may as well just buy them. Really can't be bothered risking getting a noisy or faulty second hand card (plus xfire will be loud)
> 
> Considering all this I'm thinking for the fury air cooled could be a decent buy for the right price, and 4GB vram will be plenty too for 1440p
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: wow, Star Citizen isn't having its official release until the end of 2016! Sigh.


A bump in res is probably not going to get you back into gaming; it may mean you're burnt out. Happened to me.


----------



## Schmuckley

It looks like the Fury X will be the card for GPUPI for a while.


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> A bump in res is probably not going to get you back into gaming; it may mean you're burnt out. Happened to me.


Nah, I more meant that Star Citizen would get me back into it.

I've played around in it a bit and love it so far. Can't wait for the FPS side, exploration, and all that too.

It's more that I'd treat myself to a new screen and card at the same time, to celebrate a game I'd have waited a long time to play.


----------



## gamervivek

Quote:


> Originally Posted by *Wishmaker*
> 
> I think AMD should add a new sticker on their Fury X Box :
> 
> 'Wait for DX12 when Fury X goes 超サイヤ人
> 
> 
> 
> 
> 
> 
> 
> .
> 
> PS: Super Saiyan


More like Russian.









https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml


----------



## Casey Ryback

Quote:


> Originally Posted by *ladcrooks*
> 
> i might get this card as it does well at 4k and should be even better for 3440x1440 - at the moment I am sitting on the fence 4k or 3440x1440
> 
> 4k is too demanding but i like the idea of a 40+ '', oh decisions, decisions


With a decision like that, you'd have to spend some time in front of an ultrawide to see if you like it.

Honestly, from what I've seen, I prefer full 4K.
Quote:


> Originally Posted by *blue1512*
> 
> Anyone here enjoys a plot twist?
> Somebody is going to get fired at AMD; maybe there's an Nvidia double agent there lol (980 Ti timing, anyone?)


I'd like to believe it, but I don't.


----------



## ladcrooks

Yes, one on air would be nice; I forgot about the water thingy for a minute. Give me air any day.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yeah there's no way I'd buy the fury X, although the air version maybe.
> 
> My reasoning was to maybe get a 1440p monitor for star citizen and try to really enjoy pc gaming again, I also have an xbox one but never even turn it on.
> 
> My board can run two cards at 8x/8x but I'm hesitant to run xfire honestly, also my PSU isn't good enough, and I've already got another antec TP 650 sitting around too. Buying a third unit....nope not on the cards. The one I have can power any single gpu anyway.
> 
> I also figure if there's single cards that are good enough I may as well just buy them. Really can't be bothered risking getting a noisy or faulty second hand card (plus xfire will be loud)
> 
> Considering all this I'm thinking for the fury air could be a decent buy for the right price, and 4GB vram will be plenty too for 1440p
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: wow, Star Citizen isn't having its official release until the end of 2016! Sigh.
> 
> Might play the waiting game people despise so much lol, HBM2 card>1440p>star citizen>epic pc gaming?


I can't play the waiting game anymore. I went for cheap graphics in my two home rigs (a 280X and a 7950) over a year ago to hold me over until I could find something that would be the ultimate upgrade. I usually only like to upgrade once every 3-5 years; before those cards I was using CF 5770s and 5850s







. My 4790K is about to celebrate its first birthday, and it's still driving a budget card and 1080p monitors from 2010







.

I have to say though, it's not just the GPU market that's driving me insane; I've been going nuts over what monitor to buy all year. I was about to pull the trigger on the MG279Q, but something made me cancel the order. The market is too much of a mess right now. I may just have to close my eyes, pick the cards and monitors, deal with it, and hope that by 2020 things are the way they were in 2010. Back then you just picked the best GPU you could afford from a simple lineup; flagship cards were $399, monitors were 1080p or not 1080p, and nobody knew the difference between TN and IPS yet


----------



## Themisseble

About the reviews: some are using the wrong drivers.

https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml

Interesting - new drivers?
The driver version used in this review is 15.15-180612a-18565BE, which the reviewer was sent by AMD on June 18th.
The press driver on AMD's FTP server is 15.15-150611a-185358E.

Fury beats the TITAN X at 4K...


----------



## Tivan

Quote:


> Originally Posted by *Themisseble*
> 
> About reviews.. some are using wrong drivers
> 
> https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml
> 
> interesting - new drivers?
> Driver version used in this review is 15.15-180612a-18565BE which the reviewer was sent by AMD on June 18th.
> The press driver on AMD's FTP server is 15.15-150611a-185358E.
> 
> Fury beats TITAN X at 4K ....


Some reviewers run 15.5 or older drivers, even.

Here's a link to the post where I did a quick write-up of the drivers used for the reviews in the OP.
Quote:


> Originally Posted by *Tivan*
> 
> ...


edit: personally, I'm holding off on any fundamental statement about Fury X performance until AMD announces a new big driver release. That doesn't mean they can't offer improvements after that, but there have been some frame stutter issues with the Fury X in instances where older AMD cards had no problems, so give them at least one driver release to sort out the rough edges c;

edit: it would also be nice if that 15.15 v2 you seem to have found is in fact a big improvement on its own, but yeah, it's definitely a bit of a mess with the drivers either way.


----------



## Casey Ryback

Quote:


> Originally Posted by *Serious_Don*
> 
> Back then it was just pick the best GPU you could afford from a simple lineup, flagship cards were $399, monitors were 1080P or not 1080P and nobody knew the difference between TN and IPS yet


Agreed, it also seems a bit more lucrative now.


----------



## TopicClocker

Damn, I don't even know what to say. I expected better performance from this, at least on par with the 980 Ti, but in the reviews I've seen it barely is. Hopefully drivers will help the card out.

At the moment it isn't looking too good: it's priced close to the 980 Ti (in the UK), it has 2GB less VRAM, and it's getting beaten by the 980 Ti in most benchmarks. It's a tough sell IMO.

Smh.


----------



## sugalumps




----------



## PhRe4k

Quote:


> Originally Posted by *Serious_Don*
> 
> There is too much of a mess in the market right now, I may just have to close my eyes, pick the cards + monitors and just deal with it and hopefully in 2020 things are the way they were in 2010. Back then it was just pick the best GPU you could afford from a simple lineup, flagship cards were $399, monitors were 1080P or not 1080P and nobody knew the difference between TN and IPS yet


This post speaks to me







Well said! I also miss the simple inexpensive days of my 1080p Asus TN monitor, $389 5870, $150 Phenom X4 955.


----------



## Dhoulmagus

Quote:


> Originally Posted by *PhRe4k*
> 
> This post speaks to me
> 
> 
> 
> 
> 
> 
> 
> Well said! I also miss the simple inexpensive days of my 1080p Asus TN monitor, $389 5870, $150 Phenom X4 955.


I'm still using the triple 23" Asus VH236 setup I bought way back then, and I just retired my 955BE to my brother yesterday. Hahaha. We really are all the same here.
I see we joined at the same time too; it really was easy in 2010


----------



## p4inkill3r

Quote:


> Originally Posted by *sugalumps*


I'm sure AMD saw that the blurb was there, and considering nobody expected a "knockout blow" dealt to Nvidia's product, I see nothing wrong with them sharing a favorable review.


----------



## Klocek001

GPUs are getting way too pricey. Back in early 2014 (Q1/Q2), when I was building my current rig, 3200 PLN, the price of a reference 980 Ti in Poland, bought me a new CPU+mobo combo, 8 gigs of new DDR3-2666 RAM, a 290 Tri-X (2 months old), a new 1080p display, and a new 1TB HDD. I know the 980 Ti is fast, but damn, that's the price of my whole rig for one component.


----------



## BigMack70

Quote:


> Originally Posted by *p4inkill3r*
> 
> I see nothing wrong with them sharing a favorable review.


I see something very wrong with sharing a review from IGN, whether it's favorable or not. A GPU review from IGN is about as meaningful as if I put my dog in the driver's seat of my car, told him to drive it for an hour, and then asked him to review the car.


----------



## tconroy135

Quote:


> Originally Posted by *CrazyElf*
> 
> This is kind of off-topic, but does anyone anticipate this happening?
> 
> I know that Nvidia would not have as much incentive to do this now that AMD has played its hand (the 780 Ti, I imagine, was in no small part due to the 290X release), but say, 5-6 months from now as yields improve, would it be possible for Nvidia to release a 3072-core variant of the 980 Ti for, say, $700 USD? Yields would have improved on the Titan X, and 28nm is pretty mature anyway.
> 
> It would only perform about 3% faster than the 980Ti (about the same performance as Titan X, save where the GPU actually needs the 12GB of VRAM).
> Basically this:


I don't see this happening. The only thing that would really make sense is if they allowed EVGA, etc., to do custom modifications to the Titan X, but then NVIDIA would still expect an MSRP of $999.


----------



## PhRe4k

Quote:


> Originally Posted by *Serious_Don*
> 
> I'm still using my triple 23" asus vh236 setup I bought way back then and I just retired my 955BE to my brother yesterday. Hahahah. We really are all the same here.
> I see we joined at the same time too, it really was easy in 2010


Good times! For me that was the peak of PC hardware/gaming experience, it's just not the same anymore. I found that the used and budget markets are far more interesting, I get my parts for ultra cheap. When you worry less about prices and e-peen size, that's more time dedicated to the actual gaming experience









/thread hijack


----------



## tconroy135

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I was thinking more that if the Fury X had beaten the 980 Ti significantly, it would also likely have beaten the stock Titan X, which would have made it pointless to release a full die. I still think they have something bigger up their sleeves just in case. As you say, there's probably less profit there for them, so now they don't need it this gen.


If you put the full die on a custom PCB and ran the cards stock near 1500MHz, I don't think the Fury X in any configuration could have touched it.


----------



## Casey Ryback

Quote:


> Originally Posted by *Klocek001*
> 
> I know 980ti is fast but damn that's the price of my whole rig for one component.


Yep, and it's the part that depreciates fastest.


----------



## Themisseble

Quote:


> Originally Posted by *TopicClocker*
> 
> Damn, I don't even know what to say, I expected better performance from this, like at-least on par with the 980 Ti, but in the reviews I've seen it barely is, hopefully drivers will help the card out.
> 
> At the moment it isn't looking too good, it's priced pretty closely to the 980 Ti (In the UK), it has 2GB less VRAM and is getting beaten by the 980 Ti in most benchmarks, it's a tough sell IMO.
> 
> Smh.


Most of the reviews are failing...
Let's compare them:
- only one of them used the latest driver:
https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml

- Inaccurate power consumption
Example:
TechPowerUp vs Tom's Hardware

- Inaccurate performance (you can see more than a 20% difference)
TechPowerUp vs Tom's Hardware

Which one to trust?
- some show that at 4K it is on par with the GTX 980 Ti, or faster, or slower, or even faster than the TITAN X.

Check this one:
http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/#diagramm-evolve-3840-2160
vs
http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9#.VYwxp7bFyT5

One of them has to be wrong - computerbase.de = not trusted!
But then you see this:
http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/#.VYwx-bbFyT4

GCN 1.3? L2 cache at 2MB... please, no jokes! = review not trusted.

So it's clearly a review of the R9 Fury X, but you can see the OC'd R9 290X well behind the R9 390X?! How?! Please explain that to me!!!

PS: all reviews should be within 5% of each other on performance and power consumption... so until I see it with my own eyes, I don't care what they say.
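For what it's worth, the "more than 20% difference" complaint is easy to quantify. Here's a small Python sketch that computes the relative spread between two sites' results for the same card, game, and resolution; the FPS numbers are hypothetical placeholders, not taken from any actual review:

```python
def pct_diff(a, b):
    """Percentage difference between two results, relative to the smaller value."""
    lo, hi = sorted((a, b))
    return (hi - lo) / lo * 100

# Hypothetical FPS results from two review sites for the same benchmark.
site_a_fps = {"Game X @ 4K": 42.0}
site_b_fps = {"Game X @ 4K": 51.0}

for game in site_a_fps:
    spread = pct_diff(site_a_fps[game], site_b_fps[game])
    print(f"{game}: {spread:.1f}% spread")  # Game X @ 4K: 21.4% spread
```

Measuring against the smaller value is the pessimistic convention; dividing by the larger or the mean gives a smaller-looking spread, which is one more way two people can argue about the same numbers.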


----------



## BigMack70

Quote:


> Originally Posted by *Themisseble*
> 
> most of reviews are failing ...
> Lets just compare all of them
> - just 1 of them used latest driver
> https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml
> 
> - Inaccurate power consumption
> - Inaccurate performance ( you can see more than 20% difference)
> 
> which one to trust
> - some shows that at 4K it is on pair with GTX 980Ti or faster or slower or even faster than TITAN X.


Heeeeeeeeeeeeeeeeeeeeere we go again... it's always the same thing with AMD fanboys... "_the reviews are wrong!!!!!!_"... "_major tech sites are all biased_"... "_Nvidia paid them off_"... "_this review didn't talk exclusively about the one niche scenario where AMD's product is better so it's bad_"


----------



## criminal

Quote:


> Originally Posted by *sugalumps*
> 
> Nope, just 1440p 60Hz; you're right, it does well. Turn a setting or two down here and there and 60 fps is easy. Not sure why I even want to upgrade; guess it's just for the sake of it, something to do. *This hobby is dangerous: the second a new GPU is out, yours starts to feel ancient in your mind*


You got that right. That is the reason I want to upgrade!


----------



## Casey Ryback

Quote:


> Originally Posted by *Themisseble*
> 
> - Inaccurate power consumption
> Example:
> techpowerup vs tomshardware


So wait which one are you saying uses the correct driver? Tom's or techpowerup?

The results seem fairly similar?

The fury X is competitive at 1440p and 4K.


----------



## p4inkill3r

Quote:


> Originally Posted by *BigMack70*
> 
> I see something very wrong with sharing a review from IGN, whether it's favorable or not. A GPU review from IGN is about as meaningful as if I put my dog in the driver's seat of my car, told him to drive it for an hour, and then asked him to review the car.


I don't have Facebook so I'm not sure if that's the only review that AMD has posted.
Is it?


----------



## Blameless

Quote:


> Originally Posted by *Serious_Don*
> 
> I have to say though, it's not just the GPU market that's causing me to go insane, I have been going nuts over what monitor to buy all year. I was about to pull the trigger on the MG279Q but something made me cancel the order. There is too much of a mess in the market right now, I may just have to close my eyes, pick the cards + monitors and just deal with it and hopefully in 2020 things are the way they were in 2010. Back then it was just pick the best GPU you could afford from a simple lineup, flagship cards were $399, monitors were 1080P or not 1080P and nobody knew the difference between TN and IPS yet


I've been paying $300-400 for my primary GPUs since the GeForce 256 DDR in 1999, and there have been $650+ flagship GPUs for sale since the 8800 series in late 2006.

IPS vs. TN was also debated long before 2010, and resolution choices were typically between 1050p, 1200p, and 1600p 16:10 panels, or 900p and 1080p in 16:9.

GPU line ups weren't really simpler either.


----------



## Themisseble

Quote:


> Originally Posted by *Casey Ryback*
> 
> So wait which one are you saying uses the correct driver? Tom's or techpowerup?


Which one is better... I don't know?!
TechPowerUp shows 50W more than Tom's Hardware in gaming!? How is that possible? It isn't. Most people are blind and see what they want to see...

But then you check NVIDIA power consumption:
mostly they show the same numbers!? Umm, what?
Quote:


> Originally Posted by *BigMack70*
> 
> Heeeeeeeeeeeeeeeeeeeeere we go again... it's always the same thing with AMD fanboys... "_the reviews are wrong!!!!!!_"... "_major tech sites are all biased_"... "_Nvidia paid them off_"... "_this review didn't talk exclusively about the one niche scenario where AMD's product is better so it's bad_"


Please read three reviews and then tell me if you think they reviewed the same card!

I have read six reviews, and I basically get the feeling they were reviewing different cards!


----------



## BigMack70

Quote:


> Originally Posted by *Themisseble*
> 
> Please read 3 reviews and then tell me if you think that they have reviewed same card!


I've read at least a dozen major reviews for this card and I have no sympathy for conspiracy theories... particularly when they're the exact same types of conspiracy theories that get trotted out every time AMD releases an underwhelming product.


----------



## Casey Ryback

Quote:


> Originally Posted by *Themisseble*
> 
> which one is better .. I dont know?!
> techpowerup shows 50W more than tomshardware in gaming! ? how is that possible? No it is not possible. Most people are blind and they see what they want to see...
> Please read 3 reviews and then tell me if you think that they have reviewed same card!


Well, can you point out the reviews that conflict and obviously show a large difference? That would prove your point.

I actually hadn't looked at the Tom's review; it looks very good for the Fury X.


----------



## Slink3Slyde

Quote:


> Originally Posted by *tconroy135*
> 
> If you put the full die on a custom PCB and ran the cards stock near 1500MHz, I don't think the Fury X in any configuration could have touched it.


I'll have to disagree; it was impossible to tell exactly how fast the Fury X was going to be once you take into account HBM and other improvements we didn't know about. Hence a lot of people's disappointment that it's only barely matching a stock 980 Ti right now.


----------



## BigMack70

Also, power consumption disparities are not a good way to determine whether reviews are consistent with each other. GPUs have fairly variable power draw, and a 50W variance between two reviews can easily be attributed to different testing methodology. I could easily come up with a test in which one of my Titan X cards measures 250W under load, and just as easily one where it draws closer to 350W.

Reviews are for the most part extremely consistent in reporting how this card performs relative to the 290X/980 Ti/Titan X/etc.
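A toy example of why methodology alone can open a 50W gap: the same power trace yields very different headline numbers depending on whether a site reports the whole-run average or a short peak capture. The sample trace below is made up purely for illustration:

```python
# A fluctuating (made-up) power trace for one card under load, sampled in watts.
samples_w = [240, 310, 255, 345, 260, 330, 250, 340, 245, 325]

avg_power = sum(samples_w) / len(samples_w)  # whole-run average methodology
peak_power = max(samples_w)                  # short peak-capture methodology

print(f"average: {avg_power:.0f} W, peak: {peak_power} W")  # average: 290 W, peak: 345 W
```

Same card, same trace, a 55W gap between the two reported figures; neither number is "wrong", they just measure different things.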


----------



## Themisseble

Quote:


> Originally Posted by *BigMack70*
> 
> I've read at least a dozen major reviews for this card and I have no sympathy for conspiracy theories... particularly when they're the exact same types of conspiracy theories that get trotted out every time AMD releases an underwhelming product.


they you cant read! No offense, but you must be blind.


----------



## Z-Kev

well thats funny


----------



## tconroy135

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I'll have to disagree; it was impossible to tell exactly how fast the Fury X was going to be if you take into account HBM and other improvements we didn't know about. Hence a lot of people's disappointment that it's only barely matching a stock 980 Ti right now.


Yeah, maybe, but that is just ignorance. Fury X is still running the 28nm process and still running the same architecture. If AMD had been able to bring 14/16nm to market for the Fury X then they would have had something...


----------



## BigMack70

Quote:


> Originally Posted by *Themisseble*
> 
> they you cant read!


ohhhhhhhhhhh the irony...


----------



## Tojara

Quote:


> Originally Posted by *Themisseble*
> 
> most of reviews are failing ...
> Lets just compare all of them
> - just 1 of them used latest driver
> https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml
> 
> - Inaccurate power consumption
> Example:
> techpowerup vs tomshardware
> 
> - Inaccurate performance ( you can see more than 20% difference)
> Techpowerup vs Tomshardware
> 
> which one to trust?
> - some shows that at 4K it is on pair with GTX 980Ti or faster or slower or even faster than TITAN X.
> 
> Check this one:
> http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/#diagramm-evolve-3840-2160
> vs
> http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9#.VYwxp7bFyT5
> 
> which means that one of them is wrong - computerbase.de = Not trusted!
> but then you see this:
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/#.VYwx-bbFyT4
> 
> GCn 1.3? L2 cache to 2MB... please no jokes! = review not trusted.
> 
> so clearly review for the R9 Fury X, but you can notice that OC R9 290X is well behind R9 390X?! How?! please explain it to me!!!
> 
> PS: all reviews should be in 5% difference of perf and power consumption... so for me - untill is see it with my own eyes I dont care what they say.


Am I seriously missing something or isn't Computerbase reviewing a Fury X at 4k and Hardocp 390X at 1440p? From what I can tell, the 4k results in GTA and Witcher seem extremely similar (for Fury X and 980 ti), although Far Cry 4 is off, but I'm putting the last one on settings used.


----------



## Casey Ryback

Quote:


> Originally Posted by *BigMack70*
> 
> I've read at least a dozen major reviews for this card and I have no sympathy for conspiracy theories.


Quote:


> Originally Posted by *BigMack70*
> 
> Heeeeeeeeeeeeeeeeeeeeere we go again... it's always the same thing with AMD fanboys... "_the reviews are wrong!!!!!!_"... "_major tech sites are all biased_"... "_Nvidia paid them off_"... "_this review didn't talk exclusively about the one niche scenario where AMD's product is better so it's bad_"


Noted.

Please let this member try to prove their theory I want to see where this is going.

There'll be plenty of time for you to say I told you so, call people fanboys etc etc, don't worry you won't miss out on your fun


----------



## fcman

Quote:


> Originally Posted by *Themisseble*
> 
> Please read 3 reviews and then tell me if you think that they have reviewed same card!
> 
> I have read 6 reviews and basically i have feeling that they have been reviewing different cards!


And that's supposed to make us want to buy the card? The fact that the benches are so all over the place is terrifying. Who would buy this thing knowing you could get one that severely underperforms relative to other copies?


----------



## Themisseble

Quote:


> Originally Posted by *Tojara*
> 
> Am I seriously missing something or isn't Computerbase reviewing a Fury X at 4k and Hardocp 390X at 1440p? From what I can tell, the 4k results in GTA and Witcher seem extremely similar (for Fury X and 980 ti), although Far Cry 4 is off, but I'm putting the last one on settings used.


That wasn't the point.
The point is that computerbase.de is showing a 24% difference between the R9 390X and R9 290X, and the OC'd R9 290X is still slower.
HardOCP is showing that there is no difference clock for clock. So the question is: how can an OC'd R9 290X not compete with the R9 390X, and how can the R9 390X beat it by 24% with only a 5-10% OC?


----------



## Casey Ryback

Quote:


> Originally Posted by *Themisseble*
> 
> they you cant read! No offense, but you must be blind.


As I said, please show us the exact differences in benchmark results that make it clear that they are very different.

You have to use the major websites that have a reputation to uphold.


----------



## Themisseble

Quote:


> Originally Posted by *fcman*
> 
> And that's supposed to make us want to buy the card? The fact that the benches are so all over the place is terrifying. Who would buy this thing knowing you could get one that severely underperforms relative to other copies?


I did not mention that?! All I am saying is that THE INTHERNET GOT LAZY and THEY SHOW NUMBERS WHICH THEY WANT OR WE WANT TO SEE!
Who even bothered using the latest drivers from AMD? Just one of them. Lame!


----------



## criminal

I am excited about the Fury @ $550 to be honest. If the Fury X is ROP limited and is bottle necking the 4096 gcn cores, then a cut to 3584 gcn cores may not hurt the card at all. I would pay $550 for this performance on the off chance that future drivers will bring more performance. The Fury and Fury Nano might just be AMD saving grace.


----------



## BigMack70

Quote:


> Originally Posted by *Themisseble*
> 
> I did not mention that?! All I am saying is that THE INTHERNET GOT LAZY and THEY SHOW NUMBERS WHICH THEY WANT OR WE WANT TO SEE!
> Who even bothered using the latest drivers from AMD? Just one of them. Lame!


I think this proves my point. It's all an _inthernet_ conspiracy, man.


----------



## blue1512

Quote:


> Originally Posted by *BigMack70*
> 
> Also power consumption disparities are not a good way to determine if reviews are being consistent with each other... GPUs have fairly variable power draw and a 50W variance between two reviews can easily be attributed to different testing methodology. I could easily come up with a test in which I could measure one of my Titan X cards drawing 250W of power under load, and I could easily make a test where it would draw closer to 350W under load.
> 
> Reviews are for the most part extremely consistent on reporting how this card performs relative to the 290X/980 Ti/Titan X/etc.


It DOES matter which number you choose to publish. If you want to promote Nvidia, post the 250W number; otherwise, post the 350W number.

Another is the TPU review, in which they tossed GameWorks titles into the mix and suddenly the Fury X became a joke in the overall performance chart. I would take those reviews with a pinch of salt.


----------



## Themisseble

Quote:


> Originally Posted by *Casey Ryback*
> 
> As I said, please show us the exact differences in benchmark results that make it clear that they are very different.
> 
> You have to use the major websites that have a reputation to uphold.


http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html

clearly beating TITAN X

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/20.html
not beating GTX 980Ti


----------



## BigMack70

Quote:


> Originally Posted by *blue1512*
> 
> It DOES matter which number you choose to publish. If you want to promote Nvidia, post the 250W number; otherwise, post the 350W number.
> 
> Another is the TPU review, in which they tossed GameWorks titles into the mix and suddenly the Fury X became a joke in the overall performance chart. I would take those reviews with a pinch of salt.


Or you could, you know, just post the number your testing methodology gets you.

As I said, it's the same conspiracy theories every time. Get some new material, folks... this garbage is getting old...


----------



## Blameless

Quote:


> Originally Posted by *Themisseble*
> 
> techpowerup shows 50W more than tomshardware in gaming! ? how is that possible? No it is not possible


TechPowerUp showed the peak gaming power consumption they recorded; Tom's used the average (because they are measuring with a high-res scope that picks up transients).

I read both reviews, and the power consumption figures look perfectly plausible in both if one reads the methodology.
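For what it's worth, the peak-vs-average distinction is easy to see with a toy sketch (the trace values below are invented for illustration, not real Fury X measurements):

```python
# Same hypothetical power trace, two reporting methods: an averaging setup
# (like Tom's scope-based mean) and a peak-capture setup (like TPU's peak).
# All numbers here are made up for illustration only.
trace_watts = [220, 231, 225, 310, 228, 219, 305, 224, 230, 222]

avg_watts = sum(trace_watts) / len(trace_watts)  # average over the run
peak_watts = max(trace_watts)                    # highest transient captured

print(f"average: {avg_watts:.0f} W, peak: {peak_watts} W")  # average: 241 W, peak: 310 W
```

Both numbers are "correct"; they just answer different questions, which is how two honest reviews can end up 50 W apart.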


----------



## keikei

Quote:


> Originally Posted by *criminal*
> 
> *I am excited about the Fury @ $550 to be honest*. If the Fury X is ROP limited and is bottle necking the 4096 gcn cores, then a cut to 3584 gcn cores may not hurt the card at all. I would pay $550 for this performance on the off chance that future drivers will bring more performance. The Fury and Fury Nano might just be AMD saving grace.


That is the only contention I have with the card, its price. Not performance. To match the price of a competitor's card and not also match their overall performance is pretty ballzy of AMD. I expect Fury X's price to drop after the diehard AMD fans pick up the launch cards because they are the only ones buying them at this price. I'm a fan, just not a blind one.


----------



## Slink3Slyde

Quote:


> Originally Posted by *tconroy135*
> 
> Yeah, maybe, but that is just ignorance. Fury X is still running the 28nm process and still running the same architecture. If AMD had been able to bring 14/16nm to market for the Fury X then they would have had something...


Same architecture with improvements, in tessellation for example that were unknown quantities. Nor did anyone know how much difference HBM would make. Not enough for me as it turns out.


----------



## Tojara

Quote:


> Originally Posted by *Themisseble*
> 
> That wasn't the point.
> The point is that computerbase.de is showing a 24% difference between the R9 390X and R9 290X, and the OC'd R9 290X is still slower.
> HardOCP is showing that there is no difference clock for clock. So the question is: how can an OC'd R9 290X not compete with the R9 390X, and how can the R9 390X beat it by 24% with only a 5-10% OC?


I'm guessing a throttling stock card. If you notice, the 290 is a single percent behind the 290X. The 290 and X gaining significantly on the 970 once overclocked supports that pretty well. At worst, the 290(X) results might even be from old drivers that they just reused.


----------



## Themisseble

Quote:


> Originally Posted by *Blameless*
> 
> Techpowerup showed peak gaming power consumption they recorded, Tom's used the average (because they are measuring with a high res scope that picks up transients).
> 
> I read both reviews and the power consumption figures look perfectly plausible in both if one reads the methodology.


Still, you missed the point...
Tom's Hardware is showing lower power consumption than the GTX 980 Ti. So the R9 Fury X is using less power than the GTX 980 Ti in gaming! (220 W vs 233 W)
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html - GTX 980 Ti power consumption
http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-7.html - Fury X

Now, TechPowerUp is showing a different story. They are saying that the GTX 980 Ti is using less power than the R9 Fury X!

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/29.html


----------



## Themisseble

Quote:


> Originally Posted by *Tojara*
> 
> I'm guessing a throttling stock card. If you notice, the 290 is a single percent behind the 290X. The 290 and X gaining significantly on the 970 once overclocked supports that pretty well. At worst, the 290(X) results might even be from old drivers that they just reused.


+1
So why would they compare different drivers?
Suddenly the R9 290X is on par with the GTX 980?.. They don't want to show that to the average Joe.


----------



## Tojara

Quote:


> Originally Posted by *criminal*
> 
> I am excited about the Fury @ $550 to be honest. If the Fury X is ROP limited and is bottle necking the 4096 gcn cores, then a cut to 3584 gcn cores may not hurt the card at all. I would pay $550 for this performance on the off chance that future drivers will bring more performance. The Fury and Fury Nano might just be AMD saving grace.


Pretty much, AMD flagships have pretty much always been terrible value against the card one step down.
Quote:


> Originally Posted by *keikei*
> 
> That is the only contention I have with the card, its price. Not performance. To match the price of a competitor's card and not also match their overall performance is pretty ballzy of AMD. I expect Fury X's price to drop after the diehard AMD fans pick up the launch cards because they are the only ones buying them at this price. I'm a fan, just not a blind one.


Completely agree, the Fury X should be max $600 with the performance seen.
Quote:


> Originally Posted by *Themisseble*
> 
> +1
> So why would they compare different drivers?
> Suddenly the R9 290X is on par with the GTX 980?.. They don't want to show that to the average Joe.


I did some slight digging, here are the card frequencies. No wonder the 290(X) look bad vs. 970.
http://www.computerbase.de/2015-05/grafikkarten-testverfahren-testsystem/2/
Poor testing methodology, at the very least.


----------



## Themisseble

Quote:


> Originally Posted by *Tojara*
> 
> Pretty much, AMD flagships have pretty much always been terrible value against the card one step down.
> Completely agree, the Fury X should be max $600 with the performance seen.


but you guys are completely wrong- most likely GTX 980Ti prices will drop.
http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-is-considering-a-price-cut-of-high-end-graphics-cards/

Fury X is selling like hot cakes
http://wccftech.com/amds-radeon-r9-fury-sold-out-day-launch/


----------



## Casey Ryback

Quote:


> Originally Posted by *Themisseble*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html
> 
> clearly beating TITAN X
> 
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/20.html
> not beating GTX 980Ti


The differences in those reviews is very interesting, I'll give you that.

That frame variance is also very admirable on the fury card.

Think I'll use the tom's review against the anti-fury brigade here on OCN


----------



## Tojara

Quote:


> Originally Posted by *Themisseble*
> 
> but you guys are completely wrong- most likely GTX 980Ti prices will drop.
> http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-is-considering-a-price-cut-of-high-end-graphics-cards/
> 
> Fury X is selling like hot cakes
> http://wccftech.com/amds-radeon-r9-fury-sold-out-day-launch/


People are dumb. I get that overclocking isn't something everyone considers, but even then the factory OC models of the 980 ti are a fair bit faster.


----------



## VSG

Funny that someone else mentioned that driver thing. I was following up on it myself when I saw Scott Wasson's tweet. It may well be that a lot of reviewers inadvertently used less-optimized drivers, since that was the last version on the FTP server.


----------



## Themisseble

Quote:


> Originally Posted by *Tojara*
> 
> People are dumb. I get that overclocking isn't something everyone considers, but even then the factory OC models of the 980 ti are a fair bit faster.


Not!
People are not dumb for supporting AMD. People are dumb for supporting NVIDIA. I have a few friends who bought an NVIDIA card just because NVIDIA titles run like crap on AMD cards. And this is why I hate NVIDIA... yes, I've had most of their cards, but with actions like these... screw them. The Big Bad Boy must NOT BE FED!!! Please remember that.

Look at this OC review
https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url


----------



## keikei

Quote:


> Originally Posted by *Tojara*
> 
> People are dumb. I get that overclocking isn't something everyone considers, but even then the factory OC models of the 980 ti are a fair bit faster.


I read somewhere (don't kill me because I can't remember the source) that there is a driver in the works to unlock the voltage on the Fury X. I think at that point, if true, 'the gloves are off'.


----------



## Rookie1337

Quote:


> Originally Posted by *Themisseble*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html
> 
> clearly beating TITAN X
> 
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/20.html
> not beating GTX 980Ti


Tom's is using Nvidia cards with 352.9 or 347.25 beta drivers, while TechPowerUp is using the 353.06 WHQL drivers for their Nvidia results. Tom's is on an X99 setup (Haswell-E hex-core CPU) vs. TechPowerUp's Z87 4770K setup; plus, Tom's is using Windows 8.1 Pro and TechPowerUp Windows 7 SP1. So really, this just shows how hard it is to get comparable data, and why it's so easy to pick and choose results claiming either that the Fury X is a failure or that it's something great. I bet even on the same site they make important changes for every new platform they test and forget to update results across the site, because that is too hard to do. At this point I'm finding it hard to trust any site's reviews, and I'm not even factoring in things like the whole "oh, they're being paid to say that". I'm just going to assume failure to keep variables as controlled as possible as the modus operandi.

But your post really helped remind me of just how unreliable reviews are, especially across websites.

On the flip side, am I the only one who actually finds Tom's amount of data (min, max, average) more useful than TechPowerUp's single data point per test?


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> Another is the TPU review, in which they tossed GameWorks titles into the mix and suddenly the Fury X became a joke in the overall performance chart. I would take those reviews with a pinch of salt.


People still play Gameworks titles, don't they? You can't just toss out the results you don't like, what kind of review would that be?
Quote:


> Originally Posted by *keikei*
> 
> I read somewhere (don't kill me because I can't remember the source) that there is a driver in the works to unlock the voltage on the Fury X. I think at that point, if true, 'the gloves are off'.


Should just need an Afterburner update, which will hopefully come soon. Even then, though, I wouldn't expect miracles. Hawaii cards top out around 1200 even with voltage control, so I'd expect something similar with Fury.


----------



## Themisseble

I don't trust reviews, so I agree with you.

Nope - Tom's is using better equipment for power consumption, and I actually think that test is more accurate.


----------



## Klocek001

Quote:


> Originally Posted by *criminal*
> 
> I am excited about the Fury @ $550 to be honest. If the Fury X is ROP limited and is bottle necking the 4096 gcn cores, then a cut to 3584 gcn cores may not hurt the card at all. I would pay $550 for this performance on the off chance that future drivers will bring more performance. The Fury and Fury Nano might just be AMD saving grace.


Yes and no. If you strip the Fury X of SP units it'll basically be Hawaii: same 64 ROPs and a very similar clock speed. HBM is an advantage, but it's not like the 290X has problems with memory transfer; with a 1600 MHz OC you'll likely hit 400+ GB/s. It could be a total misfire or the sweet spot. We'll see.
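The 400+ GB/s figure follows from the standard GDDR5 bandwidth formula (bus width × effective data rate); a quick sketch, assuming Hawaii's 512-bit bus:

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s. GDDR5 is quad-pumped, so the
    effective data rate per pin is 4x the memory clock."""
    effective_gbps_per_pin = 4 * mem_clock_mhz / 1000   # Gbit/s per pin
    return bus_width_bits * effective_gbps_per_pin / 8  # bits -> bytes

print(gddr5_bandwidth_gbs(512, 1250))  # 290X stock (1250 MHz): 320.0 GB/s
print(gddr5_bandwidth_gbs(512, 1600))  # 1600 MHz memory OC: 409.6 GB/s
```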


----------



## gamervivek

Quote:


> Originally Posted by *Themisseble*
> 
> About reviews.. some are using wrong drivers
> 
> https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml
> 
> interesting - new drivers?
> Driver version used in this review is 15.15-180612a-18565BE which the reviewer was sent by AMD on June 18th.
> The press driver on AMD's FTP server is 15.15-150611a-185358E.
> 
> Fury beats TITAN X at 4K ....


Interesting, someone pointed out that they are using win7 and not win8, but then TPU are using win7 as well.
Quote:


> Originally Posted by *Themisseble*
> 
> That wasn't the point.
> The point is that computerbase.de is showing a 24% difference between the R9 390X and R9 290X, and the OC'd R9 290X is still slower.
> HardOCP is showing that there is no difference clock for clock. So the question is: how can an OC'd R9 290X not compete with the R9 390X, and how can the R9 390X beat it by 24% with only a 5-10% OC?


computerbase are using a throttling reference 290X which is why their 290X is that much slower than 390X.


----------



## bossie2000

Quote:


> There, sitting alongside Nvidia's gaming champion, Radeon R9 Fury X now shares the throne. It's not faster, it's not cheaper and it's certainly not any more elegant. The card is just enough to yield a bit of parity. And for the AMD faithful, that's enough to warrant a purchase. We have to wonder if the company stopped just short of the gold, though. More speed, a lower price, some sort of game bundle-it could have gone in several directions, really, to convince enthusiasts that Fury X is the better buy. - See more at: http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-9.html#sthash.5Gnmwnni.dpuf


This!!


----------



## FallenFaux

Quote:


> Originally Posted by *Klocek001*
> 
> Yes and no. If you strip the Fury X of SP units it'll basically be Hawaii: same 64 ROPs and a very similar clock speed. HBM is an advantage, but it's not like the 290X has problems with memory transfer; with a 1600 MHz OC you'll likely hit 400+ GB/s. It could be a total misfire or the sweet spot. We'll see.


It doesn't need as many ROPs because of the improvements from GCN 1.2, though, and if the Fury X truly is ROP limited, then the Fury is going to be pretty close to the Fury X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> People still play Gameworks titles, don't they? You can't just toss out the results you don't like, what kind of review would that be?
> Should just need an Afterburner update, which will hopefully come soon. Even then, though, I wouldn't expect miracles. Hawaii cards top out around 1200 even with voltage control, so I'd expect something similar with Fury.


They should also include all Mantle games running Mantle lol. It's only fair.


----------



## bossie2000

Quote:


> In the end, AMD has plenty to be proud of. By combining a more resource-rich GPU and our first taste of HBM, it successfully leapfrogs the GeForce GTX 780 Ti, which first cast a shadow over Radeon R9 290X, and the GeForce GTX 980 that sat in a class of its own for several months, landing right next to GeForce GTX 980 Ti. - See more at: http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-9.html#sthash.5Gnmwnni.dpuf


And this !

The Fury is a beast after all. New drivers are going to drive it further!


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They should also include all Mantle games running Mantle lol. It's only fair.


They should. That's like one game, right?

Edit: but I think a couple of reviews mentioned that DX11 was faster than Mantle on the Fury, so maybe it's good they don't.


----------



## Casey Ryback

Quote:


> Originally Posted by *Themisseble*
> 
> Nope - Tom's is using better equipment for power consumption, and I actually think that test is more accurate.


Tom's results for power consumption don't really matter, honestly.

They state ~440 W for total system consumption.

AnandTech states 408 W, so they are obviously using different rigs.

This has nothing to do with drivers and your theories anyway, so just forget about power consumption.


----------



## Casey Ryback

Quote:


> Originally Posted by *gamervivek*
> 
> Interesting, someone pointed out that they are using win7 and not win8, but then TPU are using win7 as well.


Odd to me they'd use windows 7.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> They should. That's like one game, right?


BF4, Sniper Elite 3, Civilization: Beyond Earth, BF Hardline, Thief, Dragon Age: Inquisition, Plants vs. Zombies: Garden Warfare 2.


----------



## Woundingchaney

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They should also include all Mantle games running Mantle lol. It's only fair.


If you look at the Tom's article, that is essentially what they did. Many of the titles benched are AMD-preferred titles (Tomb Raider, Thief, etc.). This of course is fine, but basing a review primarily around older AMD titles doesn't make sense when nearly all current titles are Nvidia-centric. I can't see any reason to include benchmarks for games like Thief and Tomb Raider for a card that launched in the last week.

- I'm also wondering if they simply copy and paste benches from their past GPU reviews or if they re-bench for comparisons. Some of the numbers look a bit off.


----------



## Rookie1337

Quote:


> Originally Posted by *Forceman*
> 
> *People still play Gameworks titles, don't they? You can't just toss out the results you don't like, what kind of review would that be?*
> Should just need an Afterburner update, which will hopefully come soon. Even then, though, I wouldn't expect miracles. Hawaii cards top out around 1200 even with voltage control, so I'd expect something similar with Fury.


Well, when a title comes out that punishes one brand but favors another simply because it is designed to reduce or weaken the specific competitor brand, then you have issues. This is the same reason the whole Intel-compiler-harming-AMD-and-VIA-CPU-results thing was an issue. If there is a way to compare a piece of software without the anti-competitive shenanigans, then yes, the software in question should be included. Now, I'm not saying exclude a piece of software because the designers didn't optimize it for a particular brand; I'm saying that if they used something that has the ability (intentionally or not) to say "hey, they're using brand X, but we don't like brand X, so do something like inefficient draw calls or whatever," then we have an issue. PS: I hope that made sense; I haven't had my coffee today.


----------



## Casey Ryback

Quote:


> Originally Posted by *Woundingchaney*
> 
> If you look at the Tom's article, that is essentially what they did. Most of the titles benched are AMD-preferred titles (Tomb Raider, Thief, etc.). This of course is fine, but basing a review primarily around older AMD titles doesn't make sense when nearly all current titles are Nvidia-centric.


I didn't think last light, shadows of mordor and witcher 3 were AMD titles?

Fury X does fine in them.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Woundingchaney*
> 
> If you look at the Tom's article, that is essentially what they did. Most of the titles benched are AMD-preferred titles (Tomb Raider, Thief, etc.). This of course is fine, but basing a review primarily around older AMD titles doesn't make sense when nearly all current titles are Nvidia-centric.


I think Nvidia is smart. Just release GameWorks titles six months before the launch of their flagship GPU so that, no matter what AMD has, games will simply run better on Nvidia hardware. Just look at the last 8 months: all the games are GameWorks. FC4, Dying Light, Witcher 3, "Batman".


----------



## Casey Ryback

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Nvidia is smart. Just release GameWorks titles six months before the launch of their flagship GPU so that, no matter what AMD has, games will simply run better on Nvidia hardware. Just look at the last 8 months: all the games are GameWorks. FC4, Dying Light, Witcher 3, "Batman".


FC4 AMD dominates the benches (iirc)

Dying light is a terrible game so who cares.

Witcher 3 is ok if you like that series......no multiplayer.

Batman.....ROFL.....terrible port and extensive DLC


----------



## SpeedyVT

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Nvidia is smart. Just release GameWorks titles six months before the launch of their flagship GPU so that, no matter what AMD has, games will simply run better on Nvidia hardware. Just look at the last 8 months: all the games are GameWorks. FC4, Dying Light, Witcher 3, "Batman".


I'm pretty sure a lawsuit will come of it: Nvidia securing its position with unfair business practices that the consumer is not made aware of.


----------



## Casey Ryback

Quote:


> Originally Posted by *SpeedyVT*
> 
> I'm pretty sure a lawsuit will come of it: Nvidia securing its position with unfair business practices that the consumer is not made aware of.


Yep I'll be suing them myself just for releasing such terrible games................


----------



## Rookie1337

OK guys, maybe we should do a chart grouping reviews based on software (drivers, including those of the older Nvidia and AMD cards used in comparison, games, OS) and hardware (CPU, RAM, PSU, etc.) and try to aggregate more comparable data. That way we can get more valid results, instead of comparing a 4770K Win7 setup with one set of drivers against a Haswell hex- or octa-core running Win8 with a different set of drivers for all of its GPUs.

That way more variables are controlled and we have better data to compare. Then we aren't picking and choosing our data to support our biases.
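A minimal sketch of that grouping idea in Python (the sites, drivers, and FPS values below are invented placeholders, not actual review data):

```python
from collections import defaultdict

# Hypothetical review entries: (site, OS, driver, average FPS at 4K).
reviews = [
    ("SiteA", "Win7",   "15.15-150611a", 38.2),
    ("SiteB", "Win7",   "15.15-150611a", 39.0),
    ("SiteC", "Win8.1", "15.15-180612a", 42.5),
]

# Group by (OS, driver) so only comparable setups get averaged together.
groups = defaultdict(list)
for site, os_name, driver, fps in reviews:
    groups[(os_name, driver)].append(fps)

for config, fps_list in sorted(groups.items()):
    print(config, round(sum(fps_list) / len(fps_list), 1))
```

Extending the key with CPU, RAM, and game settings would tighten the comparison further; anything left in the same bucket is then at least apples-to-apples.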


----------



## Woundingchaney

Quote:


> Originally Posted by *Casey Ryback*
> 
> I didn't think last light, shadows of mordor and witcher 3 were AMD titles?
> 
> Fury X does fine in them.


I'm not saying that it is performing poorly by any means, but those are all relevant titles. It doesn't make sense to add Thief, Tomb Raider, etc. to the benches beyond them being AMD-geared titles. The majority of the customer base for a card this new doesn't care about titles that are literally years old at this point.

For some reason I always thought the Metro series was AMD-geared, but I'm not sure about that.


----------



## Klocek001

Wait a minute: if the Fury X is truly ROP limited, and it's the bad ROP/shader ratio that makes the 980 Ti come out superior, then wouldn't asynchronous shaders in DX12 allow the ROP bottleneck on the Fury X to improve?


----------



## Casey Ryback

Quote:


> Originally Posted by *Woundingchaney*
> 
> I'm not saying that it is performing poorly by any means, but those are all relevant titles. It doesn't make sense to add Thief, Tomb Raider, etc. to the benches beyond them being AMD-geared titles. The majority of the customer base for a card this new doesn't care about titles that are literally years old at this point.
> 
> For some reason I always thought the Metro series was AMD-geared, but I'm not sure about that.


When we say 'geared for AMD' it actually just means not gimped.

These same titles seem to come up in nvidia reviews too btw..............

Maybe it's because they have an actual benchmarking loop in them?

Metro 2033 was overrated, so I never bought Last Light, and I never played Tomb Raider either, so I wouldn't know. Same goes for Thief; never played it.


----------



## gamervivek

Quote:


> Originally Posted by *FallenFaux*
> 
> It doesn't need as many ROPs because of the improvements from GCN 1.2, though, and if the Fury X truly is ROP limited, then the Fury is going to be pretty close to the Fury X.


It might be ROP limited, but it has parity with, or is even faster than, the 980 Ti at 4K with fewer ROPs, less VRAM, and a lower clock speed. Great job, AMD.


----------



## tconroy135

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Same architecture with improvements, in tessellation for example that were unknown quantities. Nor did anyone know how much difference HBM would make. Not enough for me as it turns out.


I always get confused by people thinking HBM is going to make a major improvement. I'm not sure what game is currently experiencing issues because of memory bandwidth. The enhancement you can expect from HBM when the 14/16nm architectures come out is from what can be done with the GPU thanks to HBM's lower power draw, etc.

I always thought using HBM with the already fleshed-out 28nm process was a bit of a waste. TBH, I think Volta is where the next big thing happens, and by that time AMD will be out of business.


----------



## Rookie1337

Quote:


> Originally Posted by *Rookie1337*
> 
> OK guys, maybe we should do a chart grouping reviews based on software (drivers, including those of the older Nvidia and AMD cards used in comparison, games, OS) and hardware (CPU, RAM, PSU, etc.) and try to aggregate more comparable data. That way we can have more valid results, instead of comparing a 4770K Win7 setup with one set of drivers against a Haswell octa- or hexa-core running Win8 and a different set of drivers for all of its GPUs.
> 
> That way we can have better data to compare and more variables are controlled. Then we aren't picking and choosing our data to support our biases.


Seriously...anyone want to actually help with that idea or should we just continue bickering and cherry picking results?


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Nvidia is smart. Just release GameWorks titles 6 months before the launch of their flagship GPU, so no matter what AMD has, games will simply run better on Nvidia hardware. Just look at the last 8 months. All the games are GameWorks: FC4, Dying Light, Witcher 3, "Batman".


Every FC4 4K test I've seen has the FuryX ahead of the 980 Ti.


----------



## mltms




----------



## rv8000

Quote:


> Originally Posted by *Klocek001*
> 
> wait a minute if fury x is truly ROP limited and it's bad rop/shader ratio on fury x that makes 980ti come out superior then wouldn't asynchronous shaders in dx12 allow the ROP bottleneck on Fury X to improve ?


I don't get why everyone keeps bringing this up along with the pixel fill rate thing. Sites that ran synthetic pixel fill rate tests show the Fury X to have a small lead over the 980 Ti. Now, I'm no expert on ROPs, but if the Fury X or 290X were ROP limited for any reason, wouldn't they scale poorly going to higher resolutions, due to the larger number of pixels that need to be drawn? It doesn't make any sense when both the Fury X and 290X/390X scale better as the resolution increases.
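For what it's worth, here's a back-of-the-envelope sketch of the theoretical peak fill rates being argued about. ROP counts and clocks are the commonly published launch specs, so treat the exact numbers as assumptions; the point is just that raw pixel fill alone would allow thousands of fps at either resolution, which is why a pure ROP limit is hard to show in games:

```python
# Theoretical peak pixel fill rate = ROPs x core clock.
# Specs below are the commonly cited launch numbers (assumptions).
def pixel_fillrate_gps(rops: int, clock_mhz: float) -> float:
    """Peak pixel fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

fury_x   = pixel_fillrate_gps(rops=64, clock_mhz=1050)   # 67.2 GP/s
gtx980ti = pixel_fillrate_gps(rops=96, clock_mhz=1000)   # 96.0 GP/s

# Pixels that must be written per frame at each resolution:
for name, pixels in {"1440p": 2560 * 1440, "4K": 3840 * 2160}.items():
    # Frame rate if raw fill alone were the limit (one write per pixel,
    # no overdraw or blending) -- far above real frame rates, so other
    # limits (shaders, bandwidth, geometry) are usually hit first.
    ceiling = fury_x * 1e9 / pixels
    print(f"{name}: fill-limited ceiling ~{ceiling:,.0f} fps")
```

Of course, real games do overdraw, blending, and multiple render passes per frame, which is exactly why synthetic fill tests and in-game scaling can tell different stories.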


----------



## Klocek001

Quote:


> Originally Posted by *gamervivek*
> 
> It might be ROP limited, but it has parity with, or is even faster than, the 980 Ti at 4K with fewer ROPs, less VRAM, and a lower clock speed. Great job, AMD.


But with 4096 cores. We'll see how DX12 utilizes each card's resources: 96 ROPs with fewer SPs versus 64 ROPs with 4096 SPs.
Quote:


> Originally Posted by *rv8000*
> 
> I don't get why everyone keeps bringing this up along with the pixel fill rate thing. Sites that ran synthetic pixel fill rate tests show the Fury X to have a small lead over the 980 Ti. Now, I'm no expert on ROPs, but if the Fury X or 290X were ROP limited for any reason, wouldn't they scale poorly going to higher resolutions, due to the larger number of pixels that need to be drawn? It doesn't make any sense when both the Fury X and 290X/390X scale better as the resolution increases.


Didn't think of that. You've got a point.


----------



## Woundingchaney

Quote:


> Originally Posted by *Casey Ryback*
> 
> When we say 'geared for AMD' it actually just means not gimped.
> 
> These same titles seem to come up in nvidia reviews too btw..............
> 
> Maybe it's because they have an actual benchmarking loop in them?
> 
> Metro 2033 was overrated, so I never bought Last Light, and I never played Tomb Raider either, so I wouldn't know. Same goes for Thief; never played it.


It's not a matter of "not gimped": all of those titles were Mantle-, TressFX-, etc.-derived, which makes them the equivalent of GameWorks titles. The only problem is that these titles are really no longer relevant to gamers. They are literally years old.

I honestly don't know whether or not they have performance testing built in natively.


----------



## Randomdude

Considering how 28 nm took ~3 years to mature to the point where you have 600mm^2 monster chips that clock well - factor in the time it took between the 680 and 780ti release (mid-range and high-end chip of the same process and arch), and then the time until we got to the fully matured end-of-the-road chip in the name of the GM200 (new arch) - is it a stretch to assume a similar scenario playing out with the coming of the new node?

I see a lot of people saying that Pascal is the next coming, but I suspect it'll actually go the same route the last process traversed. In other words, we'll have the following repeating cycle:

1st step. A mid-range chip marketed as high-end because yields are low, in the name of the GTX X80, using the next iteration of HBM. It will not be a huge chip, but it will likely be extremely power efficient and a bit faster than GM200, perhaps more than a bit, since it'll be the first time nVidia saves die space on the memory controller and can use that space for a bit of extra oomph. This will be the first generation of Pascal. I'd say this will come around H2 2016. All in all decent, but not massive, performance gains. 14/16 nm process node.

2nd step. Next will come Big Pascal. The 14/16 nm process will have matured roughly enough for them to bring volume production to the big brother of the GTX X80 (680 to 780Ti). Let's call it X180 for now. I'd say Big Pascal will come around Q2 2017 and will be the second and last generation of the Pascal architecture. Massive performance gains. Still on the same process node.

3rd step. And finally, Volta (a la Maxwell). The last bits of performance squeezed out of the 14/16 nm node. The refined successor to the previously so-named X180 takes the steering wheel - the best GPU's both companies have to offer on this process are released now. Released around Q2-Q3 2018. Massive performance gains, end of the 14/16 nm process lifespan.

The best time to upgrade in each of these 3-step cycles is, in my opinion, at the 3rd step, or between the 2nd step of one cycle and the next, as you get the most out of your hardware that way.

This was completely and totally out of the blue, just food for thought on whether to upgrade now or wait for some time.


----------



## Blameless

Quote:


> Originally Posted by *Themisseble*
> 
> Still you missed the point...
> Tom's Hardware is showing lower power consumption than the GTX 980 Ti. So the R9 Fury X is using less power than the GTX 980 Ti in gaming! (220 W vs 233 W)
> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html - GTX 980Ti power consumption
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-7.html - Fury X
> 
> Now, techpower up is showing different story. They are saying that GTX 980Ti is using less power than R9 Fury X!
> 
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/29.html


Different cards/architectures behave differently in different titles. Combined with the natural variances from sample to sample this more than explains how one site's tests can have a Fury X at slightly lower average power consumption than a 980Ti while another site, with a different 980Ti, a different Fury X, and a different game, can show the opposite.

There is nothing unusual or unbelievable about these results.


----------



## Wezzor

So all we can do now is wait for Windows 10 to really see how this GPU performs?


----------



## Casey Ryback

Quote:


> Originally Posted by *Woundingchaney*
> 
> Its not a matter of "not gimped"


That line was a joke, but I didn't make that clear. I apologise.


----------



## hamzta09

Quote:


> Originally Posted by *mltms*


All Reference cards?
If so.. Nvidia > Fury with non-Reference.

Will Fury X never leave reference?


----------



## szeged

Quote:


> Originally Posted by *hamzta09*
> 
> All Reference cards?
> If so.. Nvidia > Fury with non-Reference.
> 
> Will Fury X never leave reference?


AMD said they will only allow reference-model Fury X cards, but regular non-X Fury cards can have non-reference designs, I think.


----------



## Ramzinho

Quote:


> Originally Posted by *tconroy135*
> 
> I always get confused by people thinking HBM is going to make a major improvement. I'm not sure which game is currently experiencing issues because of memory bandwidth. The enhancement you can expect from HBM when the 14/16nm architectures come out is from what can be done with the GPU because of HBM's lower power draw, etc.
> 
> I always thought using HBM on the already fleshed-out 28nm process was a bit of a waste. TBH I think Volta is where the next big thing happens, and by that time AMD will be out of business.


Well, maybe that's why a $650 GPU with 1/3 the memory of a 12GB GPU is so close in performance at 4K!


----------



## Ganf

Still think the Fury X will win the latency race in VR, but it's not going to be my primary setup. I'll pick one up used and leave it disabled for my non-VR games. A single Fury X will run pretty much every VR game I have/am getting with ease.

Async shaders are kind of a big deal in the latency department.


----------



## Klocek001

Quote:


> Originally Posted by *Casey Ryback*
> 
> Dying light is a terrible game so who cares.


blasphemy! you seem to dislike most games. why would you even need a gpu then?
Quote:


> Originally Posted by *mltms*


980s are getting owned by the 390X at 4K.


----------



## Alatar

Quote:


> Originally Posted by *rv8000*
> 
> I don't get why everyone keeps bringing this up along with the pixel fill rate thing. Sites that did synthetic tests for pixel fill rate showing the Fury X to have a small leave over the 980ti. Now I im no expert on ROPs but if Fury X or 290X where rop limited for any reason wouldn't they scale poorly going to higher res due to the large amount of pixels that need to be drawn? It doesn't make any sense when both Fury X and 290X/390X scale better as the resolution increases.


Quote:


> Originally Posted by *hamzta09*
> 
> All Reference cards?
> If so.. Nvidia > Fury with non-Reference.
> 
> Will Fury X never leave reference?


There's no reference 390X so that's obviously not reference (and honestly for example all MSI gaming 390X tests should have been done against MSI gaming 980s, not ref 980s).

Either way the video just shows what reviews already showed on average, and as DigitalFoundry puts it in their video description:
Quote:


> All of AMD's 'in-house' benchmarks have come from games running at 4K, and it's easy to see why - it's a real rival for the GTX 980 Ti here, offering better performance than the Nvidia card on four of the nine games tested here. However, on aggregate, the GTX 980 Ti is still faster overall. There's not much in it in many cases, but on Battlefield 4, Nvidia's hardware is almost 20 per cent faster.


Reference 980Ti slightly faster at 4K (best case scenario res for Fury) than Fury X.


----------



## Mrzev

Quote:


> Originally Posted by *Ramzinho*
> 
> Well maybe that's why a 650$ GPU with 1/3 the memory of a 12GB GPU is so close in performance at 4k..!


What were the settings at?



This shows different?


That is with hairworks disabled.


----------



## hamzta09

Quote:


> Originally Posted by *Klocek001*
> 
> blasphemy! you seem to dislike most games. why would you even need a gpu then?
> 980's getting owned by 390x at 4k.



Dying Light is quite a fun game.


----------



## ANN1H1L1ST

Told everyone from the start not to buy into the AMD hype. Nvidia has been better for a while now. AMD really needs to rethink their business or they won't have one.


----------



## Kand

Be honest with me.

Waiting for dx12 games?

In the last 7 years, how many DX11 games actually came out with impressive levels of hardware tessellation?

How many -relevant- games actually use DX11 features, setting them apart from DX9 games?

There are only a handful of games that do this, that are actually worth playing and not just another benchmark title.

With publishers still clinging to the console business scheme, I don't see any relevant DX12 games coming out in the short term. None to fully showcase any hidden potential, if any, that the Fury X has.

Also, isn't AMD stuck on DX12 while Nvidia supports up to 12.1?


----------



## tconroy135

Quote:


> Originally Posted by *Ramzinho*
> 
> Well maybe that's why a 650$ GPU with 1/3 the memory of a 12GB GPU is so close in performance at 4k..!


True enough in certain scenarios, but when the Fury X/Titan X/980 Ti are OCed, the Titan X/980 Ti are going to perform better, even at 4K (the 980 Ti is $650, so you lose the argument there).

And on top of that, because of the limitations of AMD's architecture, even with 4000+ shaders it still can't compete. By compete I mean beat the 980 Ti in many scenarios, not just the odd outlier. Who purchases a GPU based on performance in outlier applications?


----------



## Klocek001

Quote:


> Originally Posted by *Kand*
> 
> Be honest with me.
> 
> Waiting for dx12 games?
> 
> In the last 7 years, how many DX11 games actually came out with impressive levels of hardware tessellation?
> 
> How many -relevant- games actually use DX11 features, setting them apart from DX9 games?
> 
> There are only a handful of games that do this, that are actually worth playing and not just another benchmark title.
> 
> With publishers still clinging to the console business scheme, I don't see any relevant DX12 games coming out in the short term. None to fully showcase any hidden potential, if any, that the Fury X has.
> 
> Also, isn't AMD stuck on DX12 while Nvidia supports up to 12.1?


DX12 is supposed to bring performance gains too, and DX9 to DX11 is a separate thing.


----------



## Tivan

Quote:


> Originally Posted by *Kand*
> 
> Be honest with me.
> 
> Waiting for dx12 games?
> 
> In the last 7 years, how many DX11 games actually came out with impressive levels of hardware tessellation?
> 
> How many -relevant- games actually use DX11 features, setting them apart from DX9 games?
> 
> There are only a handful of games that do this, that are actually worth playing and not just another benchmark title.
> 
> With publishers still clinging to the console business scheme, I don't see any relevant DX12 games coming out in the short term. None to fully showcase any hidden potential, if any, that the Fury X has.
> 
> Also, isn't AMD stuck on DX12 while Nvidia supports up to 12.1?


AMD and Nvidia support different things in DX12: AMD more on the hardware-communication side, Nvidia more on the feature side. I'd expect AMD cards to run solidly in DX12 with any decent engine built on DX12/Vulkan, since what they have a lead on is mostly back-end rather than fancy features.

edit: Not to say the features Nvidia offers aren't going to have a general impact with a good engine. Wait and see if you want to play DX12 games is what I'd recommend.


----------



## NuclearPeace

So now it's wait for new drivers. Wait for the voltage unlock. Wait for Windows 10. Wait for DX12. This is honestly becoming pathetic, especially for something that is supposed to be a flagship. NVIDIA cards might cost a lot, but at least they get their ducks in a row before releasing things.

I'm getting frustrated with AMD and I don't even use any of their products at the moment. The 300 series outside of the Fury lineup is a mess. Who is going to buy a 7850 for $150 these days? Why is full Tonga missing? Why is there 8GB of VRAM on cards that don't need it? They are still well behind NVIDIA on DX11 overhead, which sometimes makes a 750 Ti perform faster than an R9 270X with an i3, and I don't think much is being done about it. There's always much talk about how amazing Vulkan and DX12 are, and how they will basically make GCN super powerful and annihilate Maxwell. But what about right now, when AMD desperately needs the cash? Judging from DX11 adoption, DX12 isn't going to be very popular for a while, and those debt repayments are getting closer by the day.


----------



## Ramzinho

Quote:


> Originally Posted by *tconroy135*
> 
> True enough in certain scenarios, but when the Fury X/Titan X/980 Ti are OCed, the Titan X/980 Ti are going to perform better, even at 4K (the 980 Ti is $650, so you lose the argument there).
> 
> And on top of that, because of the limitations of AMD's architecture, even with 4000+ shaders it still can't compete. By compete I mean beat the 980 Ti in many scenarios, not just the odd outlier. Who purchases a GPU based on performance in outlier applications?


Well, we have seen what AMD drivers can do: the very old architecture people are hating on is now neck and neck with the 980 thanks to some OC and driver optimization. So who knows what drivers will bring us in the future!


----------



## Alatar

Buying a 4GB card while 28nm is on its last legs and banking on it getting better with future software seems somewhat crazy to me.

Maybe that's just me though.


----------



## hawker-gb

My all-time favorite game, which I still play (WITP), needs an Athlon 64 plus some 32MB GPU.









More than 10 years and still going strong. I wonder why new games get forgotten after a few months.


----------



## Kand

How many relevant games currently run mantle?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mltms*


Wow. All I can say is the Fury X needs driver updates, because the 390X is too damn close. Apart from one game, the 390X stops the GTX 980.
Quote:


> Originally Posted by *Alatar*
> 
> Buying a 4GB card while 28nm is on its last legs and banking on it getting better with future software seems somewhat crazy to me.
> 
> Maybe that's just me though.


It's better this way. AMD wants the upgrade cake too. Why should Nvidia be the only one to have its cake and eat it? People want reasons to upgrade.


----------



## Themisseble

Quote:


> Originally Posted by *mltms*


R9 390X rocks


----------



## Tivan

Quote:


> Originally Posted by *Alatar*
> 
> Buying a 4GB card while 28nm is on its last legs and banking on it getting better with future software seems somewhat crazy to me.
> 
> Maybe that's just me though.


Buying a 28nm card with a massive GDDR5 buffer to 'future proof' doesn't sound much better. Chances are it's going to last about as long and perform about as well for the most part, still getting outclassed by dual-GPU solutions.

But yeah, not the most amazing time to buy a GPU.

edit: And HBM is definitely unproven technology at the moment, so banking on it being amazing after a couple of API improvements or driver updates is definitely not a safe bet. But then again, that's no reason not to get it.


----------



## Ha-Nocri

390(x) definitely was tweaked. 390 is as fast as 290x clock-for-clock.


----------



## rv8000

Quote:


> Originally Posted by *Kand*
> 
> How many relevant games currently run mantle?


Relevant is subjective; different people play different games. Someone posted a list of 6 or 7 games a few pages back.

The more I look at synthetics, the only thing the Fury X has a big issue with is tessellation, being almost 33% slower than the 980 Ti. I wish there was a way to find out the exact tessellation factors used in these game engines.


----------



## SpeedyVT

Everyone is making a big deal out of 1 to 3 frames. People should just choose what fits their system build and preferences better.


----------



## Tivan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wow. All I can say is the Fury X needs driver updates, because the 390X is too damn close. Apart from one game, the 390X stops the GTX 980.
> 
> It's better this way. AMD wants the upgrade cake too. Why should Nvidia be the only one to have its cake and eat it? People want reasons to upgrade.


Agreed on both!


----------



## Ha-Nocri

Quote:


> Originally Posted by *rv8000*
> 
> Relevant is subjective, different people play different games. Someone posted a list of 6 or 7 games a few pages back.
> 
> The more and more I look at synthetics, the only thing Fury X has a big issue with is tessellation being almost 33% slower than the 980ti, I wish there was a way to find out the exact number for tessellation factors in these gaming engines.


Almost none. Look for benches of the 285 vs the 280X. The 285 has 2.5x better tessellation performance, yet it usually loses. It will win in The Witcher 3 with HairWorks on.


----------



## Kylar182

Quote:


> Originally Posted by *Ha-Nocri*
> 
> 390(x) definitely was tweaked. 390 is as fast as 290x clock-for-clock.


?? So the newer/far more expensive card is faster than the much older cheaper card? Shhhhhoooockkkeeerrr


----------



## Ha-Nocri

Quote:


> Originally Posted by *Kylar182*
> 
> ?? So the newer/far more expensive card is faster than the much older cheaper card? Shhhhhoooockkkeeerrr


k









I pointed that out because many people think AMD didn't change the 390(X) at all, except to increase core and memory clocks. That's not true.


----------



## Tivan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> k
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I pointed that out because many people think AMD didn't change the 390(X) at all, except to increase core and memory clocks. That's not true.


Yeah they updated drivers as well.


----------



## Klocek001

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Almost none. Look for benches of 285 vs 280x. 285 has 2.5x better tessellation performance, yet it usually loses. It will win in The Witcher 3 with HW on.


Maybe we need benchmarks with tessellation override settings in CCC.


----------



## pengs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> 390(x) definitely was tweaked. 390 is as fast as 290x clock-for-clock.


I think it has a lot to do with drivers and that good ol' reference-cooled 290X. The 390X just wipes everyone's perceptual slate clean.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Tivan*
> 
> Yeah they updated drivers as well.


Tested with those same drivers; my 290 still trails behind the 390s.


----------



## GorillaSceptre

I can't find a single upside in going with the Fury X over the 980Ti.

All the talk of AMD having to optimize drivers for the 4GB of memory, makes it an even easier choice.

Not to mention all the games (and many future titles) that are inevitably going to gimp AMD cards. The whole Game Works situation sucks, but no amount of moaning about it is going to change anything, as a consumer you can go AMD and suffer through that garbage, or choose the brand that has none of those issues.

Regardless of whether or not AMD are the "good guys", Nvidia is delivering where it counts. AMD's hardware competes well enough, but their software side is severely lacking. They need to step it up if they want to remain relevant.


----------



## Asmodian

Quote:


> Originally Posted by *Tivan*
> 
> Yeah they updated drivers as well.


And didn't let them run on the 200s, even though the cards are identical and the drivers work fine.

The plan to convince people they really did update something for the 300 series is working...


----------



## Elmy

The 390X is king in the $425 range and beats the GTX 980 for $100 less.

A no-brainer for most people.


----------



## Ganf

Quote:


> Originally Posted by *Asmodian*
> 
> And didn't let them run on the 200s, even though the cards are identical and the drivers work fine.
> 
> The plan to convince people they really did update something for the 300 series is working...


Nice to hear your educated opinion.

Meanwhile those of us who are using the drivers anyways know what the gap is.


----------



## Blameless

Quote:


> Originally Posted by *Klocek001*
> 
> you seem to dislike most games. why would you even need a gpu then?


I dislike 95%+ of PC games I try. Doesn't mean I don't like many of them, or that there aren't several I play that justify a respectable GPU.
Quote:


> Originally Posted by *rv8000*
> 
> Sites that did synthetic tests for pixel fill rate showing the Fury X to have a small lead over the 980ti.


Raw fill rate tests are memory bandwidth as well as ROP intensive, and synthetic tests seem to be more easily compressible than games. If they are multitexturing tests, TMUs and memory bandwidth (both of which Fury is quite strong in) will probably be the limiting factor before ROPs.
Quote:


> Originally Posted by *rv8000*
> 
> Now I im no expert on ROPs but if Fury X or 290X where rop limited for any reason wouldn't they scale poorly going to higher res due to the large amount of pixels that need to be drawn? It doesn't make any sense when both Fury X and 290X/390X scale better as the resolution increases.


Depends on what limit is touched upon first.

The color compression that both Maxwell2 and GCN 1.2+ have is going to make it very difficult to isolate the ROPs in tests.
Quote:


> Originally Posted by *Kand*
> 
> Also, isnt AMD stuck on dx12 and Nvidia supports up to 12.1?


12.1 is a revision, not a feature level.

AMD (Hawaii, Tonga, probably Fiji) is "stuck" on feature level 12_0. NVIDIA (Maxwell2) can do feature level 12_1.

NVIDIA is "stuck" on resource binding tier 2 (and no asynchronous shaders). All AMD GCN parts can handle resource binding tier 3.

There are currently _no_ GPU architectures available that support all DX12 features.
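The capability split described above can be summarized in a small sketch. This is not a real D3D12 API, just an illustrative table built from the post's own claims (feature level, resource binding tier, async shaders); the field names are made up for the example:

```python
# Simplified DX12 capability table, mirroring the claims in the post.
# Field names and structure are illustrative assumptions, not the D3D12 API.
CAPS = {
    "AMD GCN (Hawaii/Tonga/Fiji)": {
        "feature_level": "12_0",   # rendering feature level
        "binding_tier": 3,         # resource binding tier
        "async_shaders": True,
    },
    "NVIDIA Maxwell 2": {
        "feature_level": "12_1",
        "binding_tier": 2,
        "async_shaders": False,
    },
}

def supports_all_dx12(caps: dict) -> bool:
    """True only if a part maxes out every capability listed here."""
    return (caps["feature_level"] == "12_1"
            and caps["binding_tier"] == 3
            and caps["async_shaders"])

# Neither architecture ticks every box, matching the post's conclusion.
assert not any(supports_all_dx12(c) for c in CAPS.values())
```

Each vendor is ahead on a different axis, which is why "who supports DX12 better" has no single answer.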
Quote:


> Originally Posted by *Kand*
> 
> How many relevant games currently run mantle?


Depends on what you consider relevant.

I've never even used Mantle outside of one-off tests, because I don't play any Mantle games.


----------



## Alatar

The 390Xs look abnormally good in reviews because they're non ref cards being compared against reference cards. As a general rule of thumb reference cards of all kinds throttle easily these days and non ref cards are pushed much higher out of the box

Same happened with the 970 when the review cards looked extremely close to the 980.

290X and 390X silicon is identical. So far the OCing potential looks identical. 390X doesn't compete any better than custom 290X and it shouldn't be compared to the ref 980 any more than custom 290Xs were.

ref. vs ref and custom vs custom. Only exceptions are if there are big price difference between custom and ref or if custom/ref models don't exist.


----------



## pengs

Quote:


> Originally Posted by *Asmodian*
> 
> And didn't let them run on the 200s, even though the cards are identical and the drivers work fine.
> 
> The plan to convince people they really did update something for the 300 series is working...


They did: they got rid of the thermal throttling.







I mean, that's what it comes down to. Most of these guys grab a reference-cooled, reference-clocked 290X at 1000/1250, never change the fan curve, and just let it do its thing. Get rid of the thermal throttling, clock it to 1100/1500, and it's a different story.


----------



## Slink3Slyde

Quote:


> Originally Posted by *tconroy135*
> 
> I always get confused with people thinking HBM is going to make a major improvement. Im not sure what game is currently experiencing issue because of memory bandwidth. The enhancement you can expect from HBM when the 14/16nm architectures come out is from what can be done to the GPU because of the lower power, etc. from HBM.
> 
> I always thought using HBM with the already fleshed out 28nm process was a bit of a waste. TBH I think Volta is where the next big thing and by that time AMD will be out of business


HBM was only one of the things that could possibly have been improved, to be fair, and it must help at 4K, because that's where the Fury looks best compared to the Nvidia cards. I don't think it was unreasonable or stupid to hope for more, though.

But anyway, it didn't happen: very good performance, but they priced it the same as a card that mostly performs better at stock, most likely overclocks better, and has more VRAM.


----------



## sugalumps

Quote:


> Originally Posted by *p4inkill3r*
> 
> I'm sure AMD saw that that blurb was there, and considering nobody expected a "knockout blow" dealt to nvidia's product, I see nothing wrong with them sharing a favorable review.


From the people whose slogans are "Never Settle", "The competition is green with envy", and "Don't just upgrade, revolutionize", I am sure this was a mistake.


----------



## Ganf

Quote:


> Originally Posted by *Alatar*
> 
> The 390Xs look abnormally good in reviews because they're non ref cards being compared against reference cards. As a general rule of thumb reference cards of all kinds throttle easily these days and non ref cards are pushed much higher out of the box
> 
> Same happened with the 970 when the review cards looked extremely close to the 980.
> 
> 290X and 390X silicon is identical. So far the OCing potential looks identical. 390X doesn't compete any better than custom 290X and it shouldn't be compared to the ref 980 any more than custom 290Xs were.
> 
> ref. vs ref and custom vs custom. Only exceptions are if there are big price difference between custom and ref or if custom/ref models don't exist.


Meh. People have been comparing aftermarket 780ti's/980's to the reference 290x for 2 years now. Tit for Tat.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Alatar*
> 
> The 390Xs look abnormally good in reviews because they're non ref cards being compared against reference cards. As a general rule of thumb reference cards of all kinds throttle easily these days and non ref cards are pushed much higher out of the box
> 
> Same happened with the 970 when the review cards looked extremely close to the 980.
> 
> 290X and 390X silicon is identical. So far the OCing potential looks identical. 390X doesn't compete any better than custom 290X and it shouldn't be compared to the ref 980 any more than custom 290Xs were.
> 
> ref. vs ref and custom vs custom. Only exceptions are if there are big price difference between custom and ref or if custom/ref models don't exist.


Not true. We tested here on OCN. FireStrike only for now, though, but the 390 is just faster than the 290, even when the 290 is clocked higher. It tells the story that many reviews showed. What exactly AMD did, I don't know.


----------



## Blameless

I'll happily pit my non-reference 290X using the 15.15 drivers against a similarly clocked 390X in any freely available benchmark.


----------



## mltms

Quote:


> Originally Posted by *Alatar*
> 
> The 390Xs look abnormally good in reviews because they're non ref cards being compared against reference cards. As a general rule of thumb reference cards of all kinds throttle easily these days and non ref cards are pushed much higher out of the box
> 
> Same happened with the 970 when the review cards looked extremely close to the 980.
> 
> 290X and 390X silicon is identical. So far the OCing potential looks identical. 390X doesn't compete any better than custom 290X and it shouldn't be compared to the ref 980 any more than custom 290Xs were.
> 
> ref. vs ref and custom vs custom. Only exceptions are if there are big price difference between custom and ref or if custom/ref models don't exist.


The AnandTech test tells another story, boosting to 1200 MHz:

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alatar*
> 
> The 390Xs look abnormally good in reviews because they're non ref cards being compared against reference cards. As a general rule of thumb reference cards of all kinds throttle easily these days and non ref cards are pushed much higher out of the box
> 
> Same happened with the 970 when the review cards looked extremely close to the 980.
> 
> 290X and 390X silicon is identical. So far the OCing potential looks identical. 390X doesn't compete any better than custom 290X and it shouldn't be compared to the ref 980 any more than custom 290Xs were.
> 
> ref. vs ref and custom vs custom. Only exceptions are if there are big price difference between custom and ref or if custom/ref models don't exist.


Still, it never loses to the GTX 980, except in Tomb Raider, which is probably a CPU limit because the fps are 100+. What AMD has been able to do with Hawaii is truly amazing.


----------



## sugalumps

When the 390X is winning in reviews, they're reliable. When the Fury X is getting beaten, it's biased, incorrect reviewers?


----------



## rv8000

Quote:


> Originally Posted by *Blameless*
> 
> I dislike 95%+ of PC games I try. Doesn't mean I don't like many of them, or that there aren't several I play that justify a respectable GPU.
> Raw fill rate tests are memory bandwidth as well as ROP intensive, and synthetic tests seem to be more easily compressible than games. If they are multitexturing tests, TMUs and memory bandwidth (both of which Fury is quite strong in) will probably be the limiting factor before ROPs.
> Depends on what limit is touched upon first.
> 
> The color compression that both Maxwell2 and GCN 1.2+ have is going to make it very difficult to isolate the ROPs in tests.
> 12.1 is a revision, not a feature level.
> 
> AMD (Hawaii, Tonga, probably Fiji) is "stuck" on feature level 12. NVIDIA (Maxwell2) can do feature level 12_1.
> 
> NVIDIA is "stuck" on resource binding tier 2 (and no asynchronous shaders). All AMD GCN parts can handle resource binding level 3.
> 
> There are currently _no_ GPU architectures available that support all DX12 features.
> Depends on what you consider relevant.
> 
> I've never even used Mantel outside of one off tests because I don't play any Mantle games.


This is why I find it hard to believe the Fury X is ROP-limited, and I'm sure AMD's engineers know far better than all of us.

It's very strange: compared to a 290X, the Fury X shows a 50% higher texel fillrate and a whopping 140% higher pixel fillrate in synthetics. I know synthetics are a best-case scenario and games often won't match them, but from a hardware standpoint and from the synthetic figures the Fury X should be leagues ahead of a 290X. In some cases that's true; in others it sits right alongside a 290X/390X and miles away from a 980 Ti/Titan X, *which really should make people wonder why some games show a minor advantage over the 980 Ti, some are even, and some lose by huge deficits*.
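The paper-spec arithmetic behind this comparison can be sketched quickly. The clocks and unit counts below are the commonly published launch specs (290X: 1000MHz, 64 ROPs, 176 TMUs; Fury X: 1050MHz, 64 ROPs, 256 TMUs). Note that on paper the pixel fillrate barely moves, since both chips have 64 ROPs, so the much larger pixel-fill gaps some synthetics report presumably come from measured tests where delta color compression and HBM bandwidth help Fiji:

```python
# Theoretical fillrates from published launch specs (assumed values,
# not measurements): clock in MHz, results in Gpixels/s and Gtexels/s.
def fillrates(clock_mhz, rops, tmus):
    pixel = clock_mhz * rops / 1000.0   # Gpixels/s
    texel = clock_mhz * tmus / 1000.0   # Gtexels/s
    return pixel, texel

r9_290x = fillrates(1000, 64, 176)   # (64.0, 176.0)
fury_x  = fillrates(1050, 64, 256)   # (67.2, 268.8)

print(f"pixel: +{fury_x[0] / r9_290x[0] - 1:.0%}")   # on paper only ~+5%
print(f"texel: +{fury_x[1] / r9_290x[1] - 1:.0%}")   # ~+53%
```

Measured synthetic fill numbers can land well above these paper figures for Fiji, which would be consistent with the compression point quoted above.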


----------



## ambientblue

Quote:


> Originally Posted by *i7monkey*
> 
> This launch was entirely ruined by AMD's marketing team and the decision to charge $649 for it. Period.
> 
> Hype and MSRP is what ruined this card, because it's an otherwise decent card. Fury X should cost $499.


Exactly, there's no need for AMD to follow Nvidia into the ultra-high-end price range. Their R9 290X was priced much better at launch than the GTX 780 Ti.


----------



## Alatar

Quote:


> Originally Posted by *Ganf*
> 
> Meh. People have been comparing aftermarket 780ti's/980's to the reference 290x for 2 years now. Tit for Tat.


No, it's just that normal reviewing practice is to compare whatever non ref card you're reviewing to all reference cards. Non ref 290X reviews were also mainly against reference cards.

Makes the AIBs happy but does the consumer a disservice.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not true. We tested here on OCN. FireStrike only for now tho, but 390 is just faster than 290, even if 290 is clocked higher. It tells the story that many reviews showed. What exactly AMD did I don't know.


Proof? Has someone who has both cards on the same setup shown differences outside the range of the small changes you could see from a different brand of memory chips?
Quote:


> Originally Posted by *mltms*
> 
> anandtech test saying another story boosting to 1200mhz
> 
> http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22


This has absolutely nothing to do with what I said.

I said that non ref. vs. ref. is a mostly pointless comparison. No matter what the cards are.


----------



## Blameless

Quote:


> Originally Posted by *rv8000*
> 
> I'm sure AMD's engineers know far better than all of us.


I'm sure they do, but they may not have the same performance priorities.

I could have told AMD I thought a bunch of narrow, high-clocked cores were a bad idea five years before Bulldozer... and I would have been right, but not because I know better than AMD's engineers; just their management.


----------



## Ganf

Quote:


> Originally Posted by *Alatar*
> 
> No, it's just that normal reviewing practice is to compare whatever non ref card you're reviewing to all reference cards. Non ref 290X reviews were also mainly against reference cards.
> 
> Makes the AIB happy but does a disservice to the consumer.


That disservice to the consumer has been doing a good job of running AMD's GPU market into the ground, so it can go back to work letting them regain some ground in the middle of the market for all I care.


----------



## Alatar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Still it never loses to GTX980 only in TR which is probably CPU limit because fps are 100+. What AMD has been able to do with Hawaii is truly amazing.


The 980 and 290X were always within 10% of each other, and non-reference cards are frequently 10% faster than their reference versions, so I can't really see anything amazing about it.

Compare an MSI Gaming 980 vs. an MSI Gaming 390X or something like that, and if it still matches/beats it, then I'll say it's amazing.


----------



## ANN1H1L1ST

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Still it never loses to GTX980 only in TR which is probably CPU limit because fps are 100+. What AMD has been able to do with Hawaii is truly amazing.


It never loses to a 980? What reviews are you looking at? Guru3D has the custom MSI 390X losing to the reference (4GB GDDR5) 980 90% of the time...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alatar*
> 
> 980 and 290X were always within 10% of each other and non reference cards are frequently 10% faster than their reference versions so I can't really see anything amazing about it.
> 
> Compare MSI gaming 980 vs. MSI gaming 390X or something like that and if it still matches/beats the thing then I'll say it's amazing.


Can't really say. What is the typical boost of a reference GTX 980? I'm sure they used a 1050/1500 390X in that review, or else it would be stupid.


----------



## hamzta09

Quote:


> Originally Posted by *Klocek001*
> 
> dx12 is supposed to bring performance gains too. and dx9 to dx11 is a separate thing.


They said DX11 was supposed to bring performance gains too.


----------



## dmasteR




----------



## pengs

Quote:


> Originally Posted by *Alatar*
> 
> 980 and 290X were always within 10% of each other and non reference cards are frequently 10% faster than their reference versions so I can't really see anything amazing about it.
> 
> Compare MSI gaming 980 vs. MSI gaming 390X or something like that and if it still matches/beats the thing then I'll say it's amazing.


We're talking about the Digital Foundry Fury X video, yeah? He doesn't state which 390X it is, so if he has any integrity he's probably using the reference-clocked 1050/1500 390X, a reference 980, and updated drivers.


----------



## Ganf

Quote:


> Originally Posted by *dmasteR*


Sweet Jeebus. Newegg is only buying 100 units at a time of each brand?

NOW who is creating the artificial scarcity?

What are they thinking....


----------



## dir_d

Quote:


> Originally Posted by *Ganf*
> 
> Sweet Jeebus. Newegg is only buying 100 units at a time of each brand?
> 
> NOW who is creating the artificial scarcity?
> 
> What are they thinking....


A flop


----------



## Forceman

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not true. We tested here on OCN. FireStrike only for now tho, but 390 is just faster than 290, even if 290 is clocked higher. It tells the story that many reviews showed. *What exactly AMD did I don't know*.


They changed the memory timings.
Quote:


> Originally Posted by *Ganf*
> 
> Sweet Jeebus. Newegg is only buying 100 units at a time of each brand?
> 
> NOW who is creating the artificial scarcity?
> 
> What are they thinking....


Only buying 100 or only being allowed to buy 100? They may be under allocation from AMD or the AIBs for all we know.


----------



## Thoth420

I got an XFX one from TigerDirect yesterday pretty easily. Everywhere else was out of stock all day. I got confirmation it shipped today as well, so hopefully it arrives Monday.


----------



## Blameless

Quote:


> Originally Posted by *rv8000*
> 
> This is why I find it hard to believe Fury X is ROP limited


Hawaii seems to be ROP limited. I benched one of Stilt's 290X mining BIOSes (better memory tables/performance, but disables half the ROPs to save power/heat while hashing) on my 290X and it cut the frame rate of most GPU limited benchmarks I tried nearly in half. If there was not an ROP limitation, the hit would have been much smaller.

Even with better color compression, I have no doubts that Fury (or even Hawaii) would benefit from more ROPs.
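The near-halving described above is what a simple bottleneck model would predict: achieved frame rate is gated by whichever pipeline stage runs out of throughput first, so cutting the ROP rate in half only halves fps in workloads that were ROP-bound to begin with. A rough sketch (all fps figures hypothetical, for illustration only):

```python
# Toy bottleneck model: each stage is given the max fps it could sustain
# on its own; the achieved fps is the minimum across all stages.
def achieved_fps(stage_limits):
    return min(stage_limits.values())

rop_bound = {"shader": 95.0, "texture": 120.0, "rop": 80.0}
print(achieved_fps(rop_bound))                      # 80.0
print(achieved_fps({**rop_bound, "rop": 40.0}))     # 40.0: halving ROPs halves fps

shader_bound = {"shader": 60.0, "texture": 120.0, "rop": 150.0}
print(achieved_fps(shader_bound))                   # 60.0
print(achieved_fps({**shader_bound, "rop": 75.0}))  # still 60.0: ROP cut is invisible
```

The same model explains why a chip with a big shader/bandwidth uplift but unchanged ROP count can look brilliant in some games and no better than its predecessor in others.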


----------



## Forceman

I don't know the full impact of the ROP count, but here's what Tech Report said about it:
Quote:


> In other respects, including peak triangle throughput for rasterization and pixel fill rates, Fiji is simply no more capable in theory than Hawaii. As a result, Fiji offers a very different mix of resources than its predecessor. There's tons more shader and computing power on tap, and the Fury X can access memory via its texturing units and HBM interfaces at much higher rates than the R9 290X.
> 
> In situations where a game's performance is limited primarily by shader effects processing, texturing, or memory bandwidth, the Fury X should easily outpace the 290X. *On the other hand, if gaming performance is gated by any sort of ROP throughput - including raw pixel-pushing power, blending rates for multisampled anti-aliasing, or effects based on depth and stencil like shadowing - the Fury X has little to offer beyond the R9 290X. The same is true for geometry throughput*.


----------



## tconroy135

Quote:


> Originally Posted by *Slink3Slyde*
> 
> HBM was only one of the things that could possibly have been improved to be fair, and it must help at 4k because thats where the Fury looks best compared to the Nvidia cards. I dont think it was unreasonable or stupid to hope for more though.
> 
> But anyway it didnt happen, very good performance, but they priced it the same as a card that performs mostly better at stock, most likely overclocks better, and has more VRAM.


Quote:


> Originally Posted by *Ramzinho*
> 
> well we have seen what AMD Drivers do.. the very old architecture people are hating is now neck to neck with the 980 due to some OC and driver optimization. so who knows what drivers will bring us in the future !


Yeah, I had a couple of 7970s in the past and was amazed at the performance gain after what was, I think, two years post-release. The problem is that, as a consumer, how can you spend money based on that concept? Specifically with the Fury X, though, because the architecture remains mostly unchanged, I would not expect that kind of improvement.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> They changed the memory timings.
> Only buying 100 or only being allowed to buy 100? They may be under allocation from AMD or the AIBs for all we know.


Can you apply those memory timings with a flash?


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you apply those memory timing with a flash?


People have flashed the 390 BIOS on a 290, but I think success was kind of mixed. There's a thread somewhere.


----------



## Blameless

Memory timings on most 290X BIOSes are pretty solid at 1250 and 1500MHz. 1600MHz memory is generally slower than 1500MHz on all of my stock BIOSes, when the parts can clock that high.


----------



## iLeakStuff

I think I have to pass on the Fury X.

Eurogamer did an extensive test of the Fury X at 4K, 1440p and 1080p, with everything pretty much maxed out. And they did overclocking.
*Also notice the overclock scores on each test*. Notice how much better the GTX 980 Ti overclocks compared to the Fury X. It's not even funny.
*
4K resolution.*
The 980 Ti and Fury X are pretty close, with the 980 Ti doing extremely well in Battlefield 4.


*1440p resolution*
A pretty big win for the 980 Ti over the Fury X when both are not overclocked. Even with HairWorks off in Witcher 3 it takes a huge lead. Nvidia takes a bigger lead in BF4 than at 4K and is 10-20% faster in other games where they were even at 4K. This is pretty shocking to me since I wanted the Fury X, and I will definitely be playing at 1440p and 144Hz with my next build. The overclocked 980 Ti is so far ahead. Just wow.


1080p resolution
I didn't want to include the results, but it's similar to 1440p; the 980 Ti's lead is maybe even bigger than at 1440p.

Man, how disappointed I am with the Fury X compared to the competition. The only way I'm considering it is if it gets a price reduction down to $500, because at $650 the GTX 980 Ti is a much better package for the same money.


----------



## Kaltenbrunner

AMD has put me in a bad position, I'm giving up on the hassle of r9 290 CF, so I need a super powerful single card.

The 980ti is basically the same price as the fury X, but really does do a bit better. I'd rather support AMD, but ^^^^


----------



## iLeakStuff

I want to support the underdog as well, but I have to be honest and get the best my hard-earned money can buy, which undoubtedly seems to be the 980 Ti. Maybe AMD will go under faster and merge with another company like Samsung to get the ship into shape again. Or, my fantasy: have ATI break out and start their own company to put Nvidia to shame.
Imagine AMD/Samsung vs. Intel, concentrating only on CPUs and using all their funds to kick Intel's behind, with ATI only doing graphics cards and using their expertise to hurt Nvidia.

That would be a dream scenario, I think.


----------



## aDyerSituation

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> AMD has put me in a bad position, I'm giving up on the hassle of r9 290 CF, so I need a super powerful single card.
> 
> The 980ti is basically the same price as the fury X, but really does do a bit better. I'd rather support AMD, but ^^^^


Similar position. I am going to wait and see this driver fiasco get sorted out first.
http://www.overclock.net/t/1562097/reddit-fury-x-possibly-reviewed-with-wrong-drivers#post_24091221


----------



## harney

Quote:


> Originally Posted by *ladcrooks*
> 
> i might get this card as it does well at 4k and should be even better for 3440x1440 - at the moment I am sitting on the fence 4k or 3440x1440
> 
> 4k is too demanding but i like the idea of a 40+ '', oh decisions, decisions



Go with 3440x1440 ultrawide: great for gaming, excellent for movies in 2.35:1 and wider aspect ratios. I had my doubts at first until I tried it; now I'll never look back. I recommend the Dell curved one.


----------



## harney

Quote:


> Originally Posted by *blue1512*
> 
> Anyone here enjoys a plot twist?
> Somebody is going to be fired at AMD, maybe there is an nVidia double agent there lol (980Ti timing anyone?)


Interesting


----------



## ElectroGeek007

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> AMD has put me in a bad position, I'm giving up on the hassle of r9 290 CF, so I need a super powerful single card.
> 
> The 980ti is basically the same price as the fury X, but really does do a bit better. I'd rather support AMD, but ^^^^


I have the exact same situation.


----------



## dubldwn

Quote:


> Originally Posted by *iLeakStuff*
> 
> I think I have to pass on Fury X.
> *1440p resolution*
> A pretty big win for 980Ti over Fury X when both are not overclocked. Even with Hairworks off on Witcher 3 it takes a huge lead. Nvidia takes a bigger lead in BF4 compared to 4K and is 10-20% faster in other games where they was even in 4K. This is pretty shocking to me since I wanted Fury X, and I will def be playing 1440p and 144Hz with my next build. Overclocked 980Ti is so much ahead. Just wow


Too bad they didn't overclock the 980. Would've been interesting. Mine does [email protected] all day.


----------



## ANN1H1L1ST

Quote:


> Originally Posted by *iLeakStuff*
> 
> I want to support the underdog as well but I have to be honest and get the best my hard earned money can buy which undoubtly seems to be 980Ti. Maybe AMD will go under faster and merge with other companies like Samsung to get the ship in to shape again. Or my fantasy, have ATI break out and start their own company to put Nvidia to shame.
> Imagine AMD/Samsung vs Intel only concentrating on CPUs and using all their funds to kick Intel`s behind. ATI only doing graphic cards and using their expertise to hurt Nvidia.
> 
> That would be a dream scenario I think


Except Nvidia have better experts than AMD/ATI...
Quote:


> Originally Posted by *dubldwn*
> 
> Too bad they didn't overclock the 980. Would've been interesting. Mine does [email protected] all day.


My 980's OC like beasts as well. 980 is still an amazing card.


----------



## KuuFA

Quote:


> Originally Posted by *ANN1H1L1ST*
> 
> Except Nvidia have better experts than AMD/ATI...
> My 980's Maxwell OC's like beasts as well. 980 is still an amazing card Maxwell Is an Amazing architecture.


Kinda Fixed?


----------



## ANN1H1L1ST

Quote:


> Originally Posted by *KuuFA*
> 
> Kinda Fixed?


Haha this is true as well. My 970 was also awesome. Maxwell is amazing in general.

I do not like AMD at all and would never run their products in one of my builds, but I was hoping they would knock one out of the park with the Fury X just to force Nvidia to stay on their toes. Too much to ask I guess.


----------



## dmasteR

Quote:


> Originally Posted by *Ganf*
> 
> Sweet Jeebus. Newegg is only buying 100 units at a time of each brand?
> 
> NOW who is creating the artificial scarcity?
> 
> What are they thinking....


I feel like Newegg may only be allowed to buy 100 at a time.


----------



## Orivaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> I think I have to pass on Fury X.
> 
> Eurgamer did an extensive test on Fury X on 4K, 1440p and 1080p. Everything pretty much maxed out. And they did overclocking.
> *Also notice the overclock scores on each test*. Notice how much better GTX 980Ti is on overclocking compared to Fury X. Its not even funny
> *
> 4K resolution.*
> 980Ti and Fury X are pretty close. 980Ti doing extremely well on Battlefield 4.
> 
> 
> *1440p resolution*
> A pretty big win for 980Ti over Fury X when both are not overclocked. Even with Hairworks off on Witcher 3 it takes a huge lead. Nvidia takes a bigger lead in BF4 compared to 4K and is 10-20% faster in other games where they was even in 4K. This is pretty shocking to me since I wanted Fury X, and I will def be playing 1440p and 144Hz with my next build. Overclocked 980Ti is so much ahead. Just wow
> 
> 
> 1080p resolution
> Didnt want to include the results but its similar to 1440p, 980ti maybe even faster than 1440p.
> 
> Man how dissappointed I am over Fury X compared to the competition. The only way I`m considering it is if it get a price reduction down to $500. Because at $650 the GTX 980Ti is a much better package which cost the same.


Do we have unlocked voltage yet?


----------



## dubldwn

What's Fury X stock load voltage anyway? 1.22? Will be interesting to see what the 1.3 crowd gets. That's pretty high for this process, though.


----------



## harney

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I can't find a single upside in going with the Fury X over the 980Ti.
> 
> All the talk of AMD having to optimize drivers for the 4GB of memory, makes it an even easier choice.
> 
> Not to mention all the games (and many future titles) that are inevitably going to gimp AMD cards. The whole Game Works situation sucks, but no amount of moaning about it is going to change anything, as a consumer you can go AMD and suffer through that garbage, or choose the brand that has none of those issues.
> 
> Regardless of whether or not AMD are the "good guys", Nvidia is delivering where it counts. AMD's hardware competes well enough, but their software side is severely lacking. They need to step it up if they want to remain relevant.


My thoughts exactly... I was on 1200p with a 970, a perfect setup, but I upgraded to 3440x1440 and now need to upgrade the GPU side of things, so I have been waiting a few months to see what the Fury X would bring to the table.

Now, I expected better from the Fury X, but now that the dust is settling it's still a good card. Hell, looking at the Digital Foundry video, the 390X is owning the 980; go AMD, and I really do wish the best for AMD, for all our sakes. Regarding the Fury: yes, maybe it will get better with voltage unlocks, drivers, etc., but I ain't got time for that. So my options are the Fury X, in the hope it matures, or the 980 Ti, and at this moment I am swinging over to the Ti. Great OC etc., just a shame about the price: £500+ for the board I want, ouch.
I never spent over £300 on any GFX card, but then I was on 1080p or 1200p so there was no need to; I'm a bang-for-buck kind of guy. Being on 1440p I have no option. But gee, these GFX prices are silly, they really are: £500+ for a graphics card, and we're just talking one component here...

I would love to support AMD, I really would, what with Nvidia's dirty tricks. Hell, I am still chasing money over the bumpgate crap back in 2010, never mind the crap I had to deal with over the 970 RAM-gate issue.

So unfortunately it's looking like the Ti for me, as I cannot wait any longer.


----------



## harney

Quote:


> Originally Posted by *Ganf*
> 
> Sweet Jeebus. Newegg is only buying 100 units at a time of each brand?
> 
> NOW who is creating the artificial scarcity?
> 
> What are they thinking....


I think every retailer was only given 100, so the 30,000-unit figure that was floating about could have been true... hence the reason for the delay. I thought it was a driver-optimization delay, but it turns out it was a stock-supply issue. Interesting...


----------



## PhRe4k

As an AMD fan, I'm a bit underwhelmed but I mostly blame AMD's crap marketing for getting our hopes up







I think a $50 price drop at minimum would put the Fury X in a decent spot, but $100 off would be perfect. As always, future driver updates should improve things for AMD. Not that I bother with $500+ GPUs; I'm far more content picking up last year's high end for cheap and saving the rest for games and other upgrades.


----------



## curlyp

Hello everyone - I purchased the R9 Fury X yesterday from my local retailer (they only received two in their morning shipment).

Wow... I am not impressed at all with this card.

My benchmarks are not the greatest as I am using the Fire Strike demo; however, it beats my GTX 970 Windforce by under 2k points. Looks like I will return the card tomorrow and order the GTX 980 Ti OC Hybrid edition. The benchmarks released by AMD had the Fury X beating the 980 Ti. Apparently, other reviews/benchmarks are showing this card hardly beating the GTX 980. I think this card was hyped up and is not living up to it.

Please correct me if I am wrong. I have only had the card for a little over 24 hours.

Check out the 3DMark benchmark below. The funny thing is, the card is not even recognized by the software!


----------



## iLeakStuff

Quote:


> Originally Posted by *curlyp*
> 
> Hello Everyone - I purchased the R9 Fury X yesterday from my local retailer (they only received two during their morning shipment).
> 
> Wow...I am not impressed at all with this card.
> 
> My benchmarks are not the greatest as I am using the firestrike demo; however, it beats my GTX 970 Windforce by under 2k. Looks like I will return the card tomorrow and order the GTX 980TI OC Hybrid edition. The benchmarks released by AMD had the Fury X beating the 980 TI. Apparently, other reviews/benchmarks are showing this card is hardly beating the GTX 980. I think this card was hyped up and it is not living up to it.
> 
> Please correct me if I am work. I have only card the R9 for a little over 24 hours.
> 
> Check out the 3D Mark Benchmark below. The funny thing is, the G-card is not even recognized by the software!


Get the 980 Ti Hybrid. Sure, it costs $100 more, but it has stellar cooling and noise levels, plus the performance should be a good deal higher than the Fury X's.
Should be well worth the money, I think.

That's what I plan to do as well, unless I put water on everything. Then I'd just get a 980 Ti for $650 and use an EK waterblock.

GamersNexus did a review of it:
http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks/Page-2

edit: and JayzTwoCents
https://www.youtube.com/watch?v=qtRqmzRMar8


----------



## MadRabbit

Quote:


> Originally Posted by *PhRe4k*
> 
> As an AMD fan, I'm a bit underwhelmed but I mostly blame AMD's crap marketing for getting our hopes up
> 
> 
> 
> 
> 
> 
> 
> I think a $50 price drop minimum would put Fury X in a decent spot, but $100 off would be perfect. As always, future driver updates should improve things for AMD. Not that I bother with $500+ GPU's, I'm far more content picking up last years high end for cheap and saving the rest for games and other upgrades


I'm with you on this one. While drivers will eventually fix some of the problems, at the moment they need to take a hit on the cooler.


----------



## PhRe4k

Quote:


> Originally Posted by *curlyp*
> 
> Hello Everyone - I purchased the R9 Fury X yesterday from my local retailer (they only received two during their morning shipment).
> 
> Wow...I am not impressed at all with this card.
> 
> My benchmarks are not the greatest as I am using the firestrike demo; however, it beats my GTX 970 Windforce by under 2k. Looks like I will return the card tomorrow and order the GTX 980TI OC Hybrid edition. The benchmarks released by AMD had the Fury X beating the 980 TI. Apparently, other reviews/benchmarks are showing this card is hardly beating the GTX 980. I think this card was hyped up and it is not living up to it.
> 
> Please correct me if I am work. I have only card the R9 for a little over 24 hours.
> 
> Check out the 3D Mark Benchmark below. The funny thing is, the G-card is not even recognized by the software!


How does it run games?


----------



## Ganf

Quote:


> Originally Posted by *harney*
> 
> I think every retailer where only given a 100 so the 30'000 thing that was floating about could have been true ....hence the reason for the delay i thought it was a driver optimization delay but turns out it was a stock supply issue interesting .....


States the order was placed on the 23rd, which means it's their second shipment. I don't know exactly what Newegg's policy is but when I order in bulk and my suppliers don't have it, I don't cut my order down, I tell them to send me what they've got, backorder the rest and get it to me ASAP. If you don't do that, you end up perpetually short on your own stock because everyone else is piling in their backorders ahead of you.

Just silly.


----------



## undeadhunter

Quote:


> Originally Posted by *Casey Ryback*
> 
> I feel sorry for you thinking you know everybody's situation, I literally don't play any games where my 7970 struggles at 1080p.
> 
> I don't plan on buying GTAV, because GTA4 was terribly boring. Witcher doesn't interest me, the list goes on.
> 
> The only reason I've even considered upgrading is because of the upgrade bug that bites every now and then.
> 
> I'll be playing star citizen when it's released so I can afford to wait, each to their own.
> 
> So please don't feel sorry for people who want to wait, or defend the waiting game.
> 
> Their choice is about them, not you.
> 
> And nice wall of text lol.


Quote:


> Originally Posted by *Thoth420*
> 
> I got an xfx one from tiger direct yesterday pretty easily. Everywhere else was oos all day. I got confirmation it shipped today as well so hopefully Monday.


Tiger had some drama with their Fury X: first they listed them at $549 by mistake... then they did not honor the price and canceled everyone who got it. Right after, they popped up a message saying the cards will ship within 7 to 20 days, lol. Guess you got lucky!


----------



## Alastair

You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.

Seriously, guys, get off your high horses. For starters, AMD has a budget probably less than half of Nvidia's. On a day-one launch, in most titles at 4K, the market it is AIMING for, it performs on par with or slightly worse than the 980 Ti. They pulled that off with less than half the R&D?

Secondly, I never heard "TITAN KILLER" mentioned anywhere in AMD's PR or advertising. It's only the silly kids out there who expected a Titan killer.

Thirdly, yes, the card does not overclock at the moment. I imagine that to squeeze as much efficiency as possible out of the chip, AMD dialed down the voltage as much as they could without instability. When things get unlocked, I imagine the picture will be far prettier.

Fourthly, AMD has just presented us with ANOTHER brand-new tech, to go alongside multi-core, integrated memory controllers, GDDR5, and a whole host of other great technology they have brought to the table. If it weren't for AMD partnering with Hynix and bringing HBM to the table, you could bet your top dollar that Pascal would have had GDDR5 for MAXIMUM MILKING POTENTIAL.

Honestly, it seems AMD can do no good in people's eyes, and you guys seriously need to grow up. Its performance scales better with resolution. The card doesn't seem to be falling on its face from the 4GB limitation. Where it matters, the card trades blows with the Ti/TX. And yet all you kids are whinging and whining like a bunch of four-year-olds who didn't get their ice cream.

AMD has done really well for a day-one launch on BRAND NEW tech. I will repeat: it is able to trade blows with the Ti/TX where it matters, at 1440p and above. So I really don't see what the problem is.

Honestly, I expected more from Overclock.net. I seem to use this site less and less every day because it's just the same kids every day. And I expected more from the enthusiast community as a whole. You guys should be giving AMD a standing ovation, but you're all so blinded by your wild expectations (not due to AMD PR) that you're too blind to ACTUALLY see the achievements and the progress being made. It's a few percent slower than the Ti at worst. So it doesn't get EXACTLY 50% more performance per watt vs. Hawaii, but rather 44%. That is still a MIRACULOUS achievement. They squeezed something like 40% more under the hood but barely went up in power consumption. How is that not awesome? It's like going from a V6 to a V8 in your car: getting 150 more horses while your consumption only goes up half a liter per 100km. If only cars could be so awesome (they are awesome, but that's OT).

But heck, it's AMD and they didn't SLAY the Titan by 50%, so let's boycott them for actually delivering a really good product. Please try squishing your Ti into a small form factor as easily as one of these or an air-cooled Fury, haul it to a LAN, and tell me how that goes for you. Ti performance in a little over six inches! What the hell is wrong with you people? Are you stupid? Dumb? Blind? I dunno. It's not AMD that has disappointed you, but rather YOU who have disappointed the community with your unrestrained expectations and blind fanboy hatred.

Honestly, I am disappointed in this community. I am ashamed to be a PC gamer right now, ashamed to be an enthusiast right now, because I actually have to be compared to ungrateful kids like yourselves. Grow up, people. Seriously.

And please, crucify me for actually bringing logic to the forefront of this discussion, because it seems these days common sense is neither common nor sensible. And for being a fanboy. Whatever floats your boat and your over-enlarged egos compensating for a lack in other areas; I really don't care.

MIC DROP.


----------



## hamzta09

Quote:


> Originally Posted by *PhRe4k*
> 
> How does it run games?


3 miles an hour.


----------



## rt123

Quote:


> Originally Posted by *curlyp*
> 
> Hello Everyone - I purchased the R9 Fury X yesterday from my local retailer (they only received two during their morning shipment).
> 
> Wow...I am not impressed at all with this card.
> 
> My benchmarks are not the greatest as I am using the Fire Strike demo; however, it beats my GTX 970 Windforce by under 2k. Looks like I will return the card tomorrow and order the GTX 980 Ti OC Hybrid edition. The benchmarks released by AMD had the Fury X beating the 980 Ti. Apparently, other reviews/benchmarks are showing this card is hardly beating the GTX 980. I think this card was hyped up and it is not living up to it.
> 
> Please correct me if I am wrong. I have only had the R9 for a little over 24 hours.
> 
> Check out the 3D Mark Benchmark below. The funny thing is, the G-card is not even recognized by the software!


If you are gonna use Fire Strike to judge a GPU's performance, you are better off getting Nvidia.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> *[snip]*
> 
> MIC DROP.


People wanted the Fury X to be cheaper, faster, a better overclocker, and cooler-running, or else it would be a failure.


----------



## Exilon

Quote:


> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> MIC DROP.


You think you sound cool, but you really don't.


----------



## Kane2207

Quote:


> Originally Posted by *ZealotKi11er*
> 
> People wanted Fury X to be cheaper, faster, overclock more, run cooler or else it would have been a failure.


Not unheard of from AMD, to be fair; the 7970 was a beast that ticked all those boxes, bar being cheaper, at the time of launch.

In fairness, the 290X dropped Titan-level performance on the table for half the price.

I think after all the hype, press slides and reviewers' guides, we just expected it to be a bit better than it is, unfortunately.


----------



## nakano2k1

Quote:


> Originally Posted by *Exilon*
> 
> You think you sound cool, but you really don't.


Context FTW


----------



## Robin Nio

Quote:


> Originally Posted by *ZealotKi11er*
> 
> People wanted Fury X to be cheaper, faster, overclock more, run cooler or else it would have been a failure.


The amount of hype created around AMD's new cards and HBM got people even more worked up and expecting it to destroy Nvidia's 980 Ti and Titan X results.


----------



## th3illusiveman

Quote:


> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> *[snip]*
> 
> MIC DROP.


It's priced the same as a 980 Ti while offering (slightly) lower performance, and that's the main issue. A lower price point would have helped. Better yet, custom air-cooled versions for $550, but I guess we will see that on July 14th when the Fury (non-X) launches.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> *[snip]*
> 
> MIC DROP.












We have ourselves a real AMD Badass here! Watch out now; you, along with the world, disappoint.

Don't feel so butt hurt that AMD opened their mouths, and then fell completely short of their claims... again.

EDIT:

You, and the people like you, are actually the problem. You are the White Knights on a crusade to apologize for AMD, instead of holding them accountable to their claims. You want to help? Hold them to their word.


----------



## Alastair

Quote:


> Originally Posted by *Exilon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> MIC DROP.
> 
> 
> 
> You think you sound cool, but you really don't.
Click to expand...

Who said I wanted to sound cool? I made a point. And since you're too blind to see the point or to have any meaningful response, I'll just add you to the list of members of OCN, and of this community as a whole, who disappoint me. Maturity and intelligence in this world seem to be lacking more and more every day. I wish all the stupid people would stop reproducing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Robin Nio*
> 
> The amount of hype that was created around AMDs new cards and hbm made people even more hyped and expected it to destroy the result of Nvidias 980ti and Titan X.


But how can you destroy the Destroyer?


----------



## Alastair

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> *[snip]*
> 
> MIC DROP.
> 
> 
> We have ourselves a real AMD Badass here! Watch out now; you, along with the world, disappoint.
> 
> Don't feel so butt hurt that AMD opened their mouths, and then fell completely short of their claims.......again.
> 
> EDIT:
> 
> You, and the people like you, are actually the problem. You are the White Knights on a crusade to apologize for AMD, instead of holding them accountable to their claims. You want to help? Hold them to their word.
Click to expand...

Where did they fail to deliver? I don't see it. In certain titles at 4K they fall behind the 980 Ti, and in some they fall right in between the two (980 Ti and TX). They brought a brand-new technology to the table. They get 40-44% more performance per watt than their old cards. What's not to like? Where did they fail? I don't see it. Please, ENLIGHTEN me.

So they get beaten at 1080p. Yeah, OK, I'll give you that. I was a bit disappointed by that, since I still run 1080p. But then again, last time I checked, these cards were not aimed at the 1080p market.


----------



## davidelite10

Quote:


> Originally Posted by *Alastair*
> 
> Where did they fail to deliver? I don't see it. In certain titles at 4K they fall behind the 980 Ti, and in some they fall right in between the two. They brought a brand-new technology to the table. They get 40-44% more performance per watt than their old cards. What's not to like? Where did they fail? I don't see it. Please, ENLIGHTEN me.


Here's just one of many, an 'overclocker's dream'.


----------



## Tivan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> 
> We have ourselves a real AMD Badass here! Watch out now; you, along with the world, disappoint.
> 
> Don't feel so butt hurt that AMD opened their mouths, and then fell completely short of their claims.......again.
> 
> EDIT:
> 
> You, and the people like you, are actually the problem. You are the White Knights on a crusade to apologize for AMD, instead of holding them accountable to their claims. You want to help? Hold them to their word.


While his choice of words was overly harsh, I tend to agree with him. Give it some time and enjoy the show as AMD mess around with drivers; it's their first new flagship in two years. That beats crying about it being 10% slower than a 980 Ti or whatever.

Anyway, I for my part will enjoy hearing back about the Fury X, and would encourage you to do the same c:


----------



## iLeakStuff

Quote:


> Originally Posted by *Alastair*
> 
> Secondly. I never heard mention of "TITAN KILLER" anywhere mentioned in AMD's PR or advertising.


----------



## Kaltenbrunner

In most of the games I own or want, the Fury X matches the 980 Ti, except in GTAV, BF3, and C3, where it loses big. A lot of that is down to the devs and maybe just the way the cards are used, but you know how it is.

I can't believe I'll pay so much for a single card; this market is insane, and so am I.

I wish the 980 Ti would get more expensive and AMD would drop their price; then I'd get the Fury X and feel better about it... but I'll probably get the 980 Ti at the current price... and then sell it in 1-2 years and do it all over again.

I sure am disappointed with the failure of R9 290 CrossFire and 7950 CF. I had 6950 CF and that seemed to work way better. A lot of it is the games I played, but that's it: I'm sick of CF not working right and ruining games.


----------



## MapRef41N93W

Quote:


> Originally Posted by *PostalTwinkie*
> 
> We have ourselves a real AMD Badass here! Watch out now; you, along with the world, disappoint.
> 
> Don't feel so butt hurt that AMD opened their mouths, and then fell completely short of their claims.......again.
> 
> EDIT:
> 
> You, and the people like you, are actually the problem. You are the White Knights on a crusade to apologize for AMD, instead of holding them accountable to their claims. You want to help? Hold them to their word.


You are talking about the guy who goes thread to thread starting flame wars proclaiming that the FX-8350 is faster than the i7-4790K. What do you expect, really?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> In most of the games I own or want, the Fury X matches the 980 Ti, except in GTAV, BF3, and C3, where it loses big. A lot of that is down to the devs and maybe just the way the cards are used, but you know how it is.
> 
> I can't believe I'll pay so much for a single card; this market is insane, and so am I.
> 
> I wish the 980 Ti would get more expensive and AMD would drop their price; then I'd get the Fury X and feel better about it... but I'll probably get the 980 Ti at the current price... and then sell it in 1-2 years and do it all over again.


Or wait for the air-cooled $550 Fury.


----------



## Alastair

Quote:


> Originally Posted by *davidelite10*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> where did they fail to deliver? I don't see it? In certain titles at 4k they they fall behind 980ti. And at some they fall right in between the two. They brought a brand new technology to the table. They get 40-44% more performance per watt than their old cards. What's not to like where did they fail. I don't see it. Please. ENLIGHTEN me.
> 
> 
> 
> Here's just one of many, an 'overclocker's dream'.
Click to expand...

Read the text wall, if you'd bothered. No wait, I'll put it on a platter for you.

Yes, I too was disappointed when I saw the overclocking results.

But then I gave it some thought, and once you have some context to go on, it makes sense.

Firstly, and the biggest point one needs to realise: the voltage is currently locked, meaning you can't change it.

Secondly, I imagine that in order to get as much power-efficiency gain as possible, AMD set the stock volts as low as they could without making the card unstable.

Wait till AMD unlocks voltage control in the drivers, or until someone hacks it, and then make your judgments. It's only day two.


----------



## Alastair

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PostalTwinkie*
> 
> 
> We have ourselves a real AMD Badass here! Watch out now; you, along with the world, disappoint.
> 
> Don't feel so butt hurt that AMD opened their mouths, and then fell completely short of their claims.......again.
> 
> EDIT:
> 
> You, and the people like you, are actually the problem. You are the White Knights on a crusade to apologize for AMD, instead of holding them accountable to their claims. You want to help? Hold them to their word.
> 
> 
> 
> You are talking about the guy who goes thread to thread starting flame wars proclaiming that the FX-8350 is faster than the i7-4790k. What do you expect really.
Click to expand...

Sorry, where have I started a flame war? Where have I EVER stated that 4770K is better than FX?

Link it.


----------



## iinversion

Quote:


> Originally Posted by *Alastair*
> 
> sorry where have I started a flame war? *Where have I EVER stated that 4770K is better than FX*?
> 
> Link it.


Exactly.


----------



## davidelite10

Quote:


> Originally Posted by *Alastair*
> 
> Read the text wall if you bothered. No wait I'll put it on a platter for you.
> 
> Yes, I too was disappointed when I saw the overclocking results.
> 
> But then I gave it some thought. And once you have some context to go on it makes sense.
> 
> Firstly and the biggest point one needs to realise is. The voltage is currently locked. Meaning you can't change it.
> 
> Secondly. I imagine in order to get as much power efficiency gains as they possibly could AMD put the stock volts as low as they could without having an unstable card.
> 
> Wait till AMD unlocks the voltage controllers in the drivers, or until someone hacks it and then make your judgments. It's only day two.


Yup, play the waiting game just like I did back in the day.

Here's another, their 'benches'

Oh, and it only nips at the heels of a stock 980 Ti without boost.
Look at the ones which are easily OC'd to 1400MHz for benches.

After everything AMD was spouting and hyping, I sincerely thought they were going to pull it off; now I sincerely hope their entire leadership and direction gets changed back to how ATI was, along with their CPU side.


----------



## harney

Quote:


> Originally Posted by *Alastair*
> 
> You kids all disappoint me. I actually expected more from a bunch of adults. But I guess I'm just cynical. The world disappoints me these days.
> 
> *[snip]*
> 
> MIC DROP.


Alastair has a fair point in what he is saying, maybe a little strong in the choice of words, but yes, AMD have done great bringing new tech to the table. It's just their price that's a little strong, that's all; it really needs to be $100 lower than the Ti.


----------



## glenn37216

So you guys think the normal Fury card that is released on July 14th will have better overclocking ability? There are going to be custom versions from AIB partners, so should we be expecting some good clock tweaks?
I believe if the Fury X just had a little more oomph in performance it would be more appealing, but as it stands now I'm starting to think AMD should just split and sell off its GPU dept to Samsung. Unless they drop the price of the Fury X to match the GTX 980, I foresee bankruptcy in AMD's near future. (Nvidia is talking about dropping the price of their 980 Ti by $50 or more by this time next month.)


----------



## Alastair

Quote:


> Originally Posted by *iinversion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> sorry where have I started a flame war? *Where have I EVER stated that 4770K is better than FX*?
> 
> Link it.
> 
> 
> 
> Exactly.

I think I know what you're trying to say, but I hope that you'll back me up on this. We may not always see eye to eye. But where have I claimed FX is 100% better than X or Y? I defend the FX, yes, guilty as charged. But I haven't ever stated that it is outright better than an i7. I have praised the FX on its merits and how well it performs when COMPARED to an i5 and/or i7. It wins in areas and yes, it certainly does lose in some areas for sure. But I have never outright claimed FX is better than an i7.


----------



## ZealotKi11er

Quote:


> Originally Posted by *davidelite10*
> 
> Yup, play the waiting game just like I did back in the day.
> 
> Here's another, their 'benches'
> 
> Oh and it only nips the heels of a stock 980ti without boost.
> Look at the ones which are easily OCed to 1400mhz for benches.
> 
> After everything AMD was spouting and hyping I sincerely thought they were going to pull it around, now I sincerely hope they're going to have their entire supervision and direction changed back to how ATI was. Along with their CPU side.


The thing is, most people who talk about these GPUs are not even buying one. Most of us are just playing a game comparing GPUs. If you want to compare a GTX 980 Ti with an unlocked OC vs. a Fury X with a handicapped OC, that is fine.


----------



## Tivan

We won't be gaining much from talking about how the FuryX with its current drivers and lack of voltage control doesn't overclock well and doesn't beat the 980ti.

Because we know that already. = D

edit: I guess I'm off to wait for threads with news about the FuryX because this thread isn't really about that, it seems.


----------



## Alastair

Quote:


> Originally Posted by *harney*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> sorry where have I started a flame war? Where have I EVER stated that 4770K is better than FX?
> 
> Link it.
> 
> 
> 
> Alastair has a fair point to what he is saying maybe a little strong in the choice of words but yes AMD have done great bringing new tech to the table its just there price that's a little strong that's all it really needs to be 100 lower than the Ti

Yes, I was strong with the words. But seriously, if you looked at the benches and then saw how people were carrying on around page 25 or so, your hope for the community as a whole gets crushed like a bug underfoot. And I want to see better for the community as a whole. I feel we are better than this.

Edit: also I see my Autocorrect is bugging out. If you guys see an error just let me know


----------



## Alastair

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> sorry where have I started a flame war? Where have I EVER stated that 4770K is better than FX?
> 
> Link it.
> 
> 
> 
> Every "AMD vs Intel" thread in the past year on this board. It's a shame that most of your posts end up getting deleted by mods though.

I have never outright said FX is better than an i7. I defended FX on its merits. But if you bother to read my posts properly, I always try to include where Intel is better than AMD. A few of my posts might have been deleted, but not as many as you like to insinuate. Like I said, post me some proof on the matter. If you have the proof, PM me, 'cause it's OT here. But if you don't have proof to back up your claims, then yeah, it's just words, dude.


----------



## davidelite10

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The think is most people that talk about these GPUs are not even buying one. Most of us are just playing a game comparing GPUs. If you want to compare GTX980 Ti OC unlocked vs Fury X handicapped OC that i fine.




I was going to if the performance was right, but it doesn't have HDMI 2.0, so it's not great for 4K, which is what I wanted it for.

Currently I might sit on my 2x GTX 780s for 1440p till Pascal.

Also, I'm comparing the current capabilities of both sides; I've been burned way too many times playing the waiting game with AMD, and a few times with Nvidia as well.


----------



## Rei86

Quote:


> Originally Posted by *error-id10t*
> 
> lol are you kidding me, this forum was full of people who couldn't even deal with the fact that another TI was going to come, they were so against it there were threads ruined because of their nay-saying. So I guess that says a lot about the type of people we have here...
> 
> This Fury X bashing is just the same but different smell. No bench I've seen shows any 4GB vRAM problem existing but somehow that's a major flaw. Everyone knows AMD have "trouble" with their drivers but they've written off the whole generation because it's "only" throwing punches with day 1 release drivers compared to months from the other side.


Yup, more like the asshats were arguing that you couldn't call it a GTX 980 Ti because the GTX 980 was a GM204 chip and there was NO WAY nVidia would EVER call the next thing a GTX 980 Ti. NEVER EVER, BECAUSE the GTX 980 is already out with a full GM204 chip and they could NEVER EVER call it a Ti. EVER. EVER.....

But the rest of us knew that with a flagship card like the Titan X, they had to have a cut-down card in the works to sell off the chips that couldn't make the cut.

Quote:


> Originally Posted by *SpeedyVT*
> 
> I can't believe people would still defend NVidia after the Batman fiasco.


Who are you going to blame? nVidia, or RockSteady, who handed this game off to a garbage developer and never even saw or touched the PC version till it was released? Who?

Quote:


> Originally Posted by *i7monkey*
> 
> This launch was entirely ruined by AMD's marketing team and the decision to charge $649 for it. Period.
> 
> Hype and MSRP is what ruined this card, because it's an otherwise decent card. Fury X should cost $499.


Again, this card is not performing so far below the GTX 980 Ti that it should be cheaper than $649.99. AGAIN, this is AMD's halo card and it's stable at its price point. Why won't people like you get this?


----------



## GorillaSceptre

Here's the "poor AMD" argument again...

They are asking for $650. It's not like they are a charity; they're a business.

If you want to "save" the industry by supporting an inferior product, in a bid to bring healthy competition and prosperity for all, then go ahead.
I'll use my hard-earned money to selfishly destroy the industry by going Nvidia









The Fury X obviously isn't a bad card; in fact I think it's to its detriment that it performs so close to the Ti, and at the same price.

The Fury may get close in raw performance, but AMD are still behind Nvidia in nearly every other aspect. If it offered 20% over the competition I would have gone for it, but now it comes down to which brand has proven over and over again to offer a better experience, and IMO it's Nvidia.


----------



## Alastair

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> sorry where have I started a flame war? Where have I EVER stated that 4770K is better than FX?
> 
> Link it.
> 
> 
> 
> Every "AMD vs Intel" thread in the past year on this board. It's a shame that most of your posts end up getting deleted by mods though.

I will also add: have you EVER in your life on OCN been in the Intel subforum? Please. Give me that. But in every "I'm doing an AMD build" thread out there, at least 20 Intel guys come in and start bashing the thread till it gets locked. So please, stop with the stories, or back it up with some proof.


----------



## glenn37216

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you want to compare GTX980 Ti OC unlocked vs Fury X handicapped OC that i fine.


The Fury X is watercooled... to be fairer, shouldn't we be comparing the Fury that's released next month (aircooled) to the standard reference 980 Ti? I'm confused about the fairness of the hardware comparison here.

I thought the Fury X, with all its glory, was supposed to be an overclocking beast. Why do AMD's benches compare it to a reference air-cooled 980 Ti? I think the reason they put this card out first is that expectations for the air-cooled version aren't that great.


----------



## SpeedyVT

Quote:


> Originally Posted by *Rei86*
> 
> Who are you going to blame? nVidia or RockSteady who handed this game off to a garbage developer and never even seen or touched the PC version till it was released? Who?


http://www.reddit.com/r/pcmasterrace/comments/3b40vu/nvidia_allegedly_sped_up_their_batman_arkham/

NVidia knew, oooh they knew.


----------



## iinversion

Quote:


> Originally Posted by *Alastair*
> 
> I think I know what your trying to say. But I hope that you'll back me up in this. We may not see always eye to eye. But where have I claimed FX is 100% better than X or Y. I defend the FX yes guilty as charged. But I haven't ever stated that it is outright better than I7. I have praised the FX on its merits and how well it performs when COMPARED to I5 and or 7. It wins in areas and yes it certainly does loose in some areas for sure. *But I have never outright claimed FX is better than I7*.


Maybe not, but FX is always your recommendation, even when the Intel alternative is better for the price/type of workload someone wants. I have never seen you suggest an Intel part to anyone. It is clear you are biased towards AMD, and that bias isn't helping you here.

Fury X might improve with unlocked voltage and it might improve with drivers. Both of those are uncertain, and not everyone wants to take a chance on uncertainty. Right now the 980 Ti is most definitely the better buy.


----------



## Rei86

Quote:


> Originally Posted by *glenn37216*
> 
> The fury x is watercooled .. on a more fair note shouldn't we be comparing the fury (aircooled) to the standard 980 ti reference? -confused on the fairness of hardware comparison here.
> 
> I thought the fury x with all its glory was supposed to be an overclocked beast. Why does amd's benches compare it to a reference air cooled 980ti ? I think the reason why they put this card out first is that the expectations of the air cooled version isn't that great.


No.

The Fury X is the full shebang; the Fury is a cut-down Fury X. Sure, if you want, you can look at the Fury X as the Titan X and the Fury as the GTX 980 Ti... However.

Since the Fury X is not priced at $999.99, almost all of us will compare it to its closest competitor in the same price bracket, the GTX 980 Ti.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> Fury X might improve with unlocked voltage and it might improve with drivers. Both of those are uncertain and not everyone wants to take a chance on uncertainty. Right now the 980 Ti is most definitely the better buy.


Ding ding ding! We have a winner!


----------



## Rei86

Quote:


> Originally Posted by *SpeedyVT*
> 
> http://www.reddit.com/r/pcmasterrace/comments/3b40vu/nvidia_allegedly_sped_up_their_batman_arkham/
> 
> NVidia knew, oooh they knew.


Allegedly. Give me facts.


----------



## SpeedyVT

Quote:


> Originally Posted by *Rei86*
> 
> Allegedly. Give me facts.


Allegedly, that's fact enough if you watch the video and hear the speed up.


----------



## Tivan

Quote:


> Originally Posted by *GorillaSceptre*
> 
> now it comes down to which brand has proven over and over again to offer a better experience, and imo it's Nvidia.


To me, it's AMD, but yeah, this is subjective and depends on what you value in the brand. But good approach!
I see AMD's been stepping up their PR a bit lately as well, so maybe they'll get more people than just me to appreciate their brand c:


----------



## Rei86

Quote:


> Originally Posted by *SpeedyVT*
> 
> Allegedly, that's fact enough if you watch the video and hear the speed up.


So if they "knew," you wanted nVidia to tell Warner Brothers, RockSteady, and Iron Galaxy Studios, right before the launch, to shove it and go back to the drawing board?


----------



## Orivaa

Quote:


> Originally Posted by *Alastair*
> 
> Read the text wall if you bothered. No wait I'll put it on a platter for you.
> 
> Yes I too was disappointed when I saw the overclocking results.
> 
> But then I gave it some thought. And once you have some context to go on it makes sense.
> 
> Firstly and the biggest point one needs to realise is. The voltage is currently locked. Meaning you can't change it.
> 
> Secondly. I imagine in order to get as much power efficiency gains as they possibly could AMD put the stock volts as low as they could without having an unstable card.
> 
> Wait till AMD unlocks the voltage controllers in the drivers, or until someone hacks it and then make your judgments. It's only day two.


This is from a few pages back, so I do not know if it has been addressed already, but voltage control is not something AMD needs to do. They have never had voltage control natively; it's 3rd-party programs like MSI Afterburner that need to add it.


----------



## SpeedyVT

Quote:


> Originally Posted by *Rei86*
> 
> So if they "knew," you wanted nVidia to tell Warner Brothers, RockSteady and Iron Galaxy Studios right before the launch. To shove it and go back to the drawing boards?


Most games get pushed back from their official PC release due to bugs like this. It would've been wiser to make the console versions a timed exclusive and release Batman for PC in the fall of 2015.

I blame them all, including NVidia. Gameworks is the reason it doesn't run on PC. Batman uses nearly all of the Gameworks features, which proves that you can't run them all simultaneously without a godly PC.


----------



## th3illusiveman

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Here's more of the poor AMD argument again..
> 
> They are asking for $650.. It's not like they are a charity, they're a business.
> 
> If you want to "save" the industry by supporting an inferior product, in a bid to bring healthy competition and prosperity for all, then go ahead.
> I'll use my hard earned money to selfishly destroy the industry, by going Nvidia
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Fury X obviously isn't a bad card, in fact i think it's to it's detriment that it performs so close to the Ti, and at the same price.
> 
> The Fury may get close in raw performance, but AMD are still behind Nvidia in nearly every other aspect. If it offered 20% over the competition i would of gone for it, but now it comes down to which brand has proven over and over again to offer a better experience, and imo it's Nvidia.


lol at 20% perf over a 980 Ti. I have no idea how people come up with these expectations. You were destined to be disappointed with unrealistic expectations like that.


----------



## glenn37216

Quote:


> Originally Posted by *Rei86*
> 
> No.
> 
> The Fury X is the full shebang. The Fury is a cut rate Fury X. Sure if you want to look at it the Fury X to the Titan X as the Fury is the GTX 980Ti... However.
> 
> And since the Fury X is not priced at 999.99, almost all of us will compare it to its closest competitor the GTX 980Ti in the same price bracket.


I still don't understand the fairness of the direct comparison in AMD's press-release benches: a watercooled Fury X vs. a reference, AIR-cooled 980 Ti? I can understand if it's about the starting price of the two cards, but I paid $650.00 for my Gigabyte G1 980 Ti, which bests the Fury X by 20 fps or more in every major title out, and it was cheaper than the Fury X at launch.

In all fairness, a truer benchmark comparison would be an AIR-COOLED Fury vs. any aftermarket AIR-COOLED 980 Ti. Then you would have a good comparison of price vs. performance.

But now I can see why they didn't. Their card just can't keep up.


----------



## Rei86

Quote:


> Originally Posted by *SpeedyVT*
> 
> Most games get pushed back from their official PC release due to bugs like this. It would've been wiser to make consoles a time exclusive and release Batman for PC in the fall of 2015.
> 
> I blame them all including NVidia.


That's not up to nVidia.

That's up to you and your opinion on who you want to hold accountable for the mess that's BAK on the PC.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Tivan*
> 
> To me, it's AMD, but yeah, this is subjective and depends what you value in the brand. But good approach!
> I see AMD's been stepping up their PR lately a bit as well, so maybe they'll get more people than just me to appreciate their brand c:


Yeah, in some aspects it's subjective, in others it's not.

There are countless threads of people with AMD cards going mad over gimped performance in an "Nvidia" title. Even though it's not fair that AMD performance suffers, it is one of the risks of going with AMD.

With these products competing so closely, you have to look at other aspects besides performance. If they significantly outperformed Nvidia I would look past the drawbacks, but in this case, Nvidia has them beaten on both fronts, unfortunately.


----------



## SpeedyVT

Quote:


> Originally Posted by *Rei86*
> 
> That's not up to nVidia.
> 
> That's up to you and your opinion on who you want to hold accountable for the mess that's BAK on the PC.


NVidia can be held accountable for its poor software. Heck, even NVidia's own drivers have suffered lately because of it.

I hold AMD accountable for its slow driver releases.

Realistically, all businesses associated with the release of a product need to be held accountable. There is no single fall guy.


----------



## GorillaSceptre

Quote:


> Originally Posted by *th3illusiveman*
> 
> lol at 20% perf over a 980 Ti. I have no idea how people come up with these expectations. You were destined to be disappointed with unrealistic expectations like that.


So me expecting a card that's been hyped up for over 6 months, with a massive die, HBM, and a CLC cooler, to offer 20% over a reference air-cooled 980 Ti is lol-worthy? Okay then.


----------



## DSgamer64

I am thinking of waiting to see how the dual Fury X handles things before I decide to jump back on the AMD ship again. My R9 290 can last a few more months, it gives me time to put away extra cash just in case. Either way, as it stands the choices are going to be a 980 Ti, Fury X or Fury X2, it all depends on what the dual card does and whether Nvidia chooses to drop the prices on the 980 Ti (though that is pretty wishful thinking).


----------



## Kane2207

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Here's more of the poor AMD argument again..
> 
> They are asking for $650.. It's not like they are a charity, they're a business.
> 
> If you want to "save" the industry by supporting an inferior product, in a bid to bring healthy competition and prosperity for all, then go ahead.
> I'll use my hard earned money to selfishly destroy the industry, by going Nvidia
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Fury X obviously isn't a bad card, in fact i think it's to it's detriment that it performs so close to the Ti, and at the same price.
> 
> The Fury may get close in raw performance, but AMD are still behind Nvidia in nearly every other aspect. If it offered 20% over the competition i would of gone for it, but now it comes down to which brand has proven over and over again to offer a better experience, and imo it's Nvidia.


It's one of the more bizarre arguments on OCN.

If people like supporting an underdog then fair enough, but personally, if I wanted to be charitable I'd give my money to some kid in a war-torn country to feed/educate/clothe them.

Giving my hard-earned cash to a billion-dollar corporation, with execs on 6-7 figure salaries who are incapable of getting all their ducks in a row, just happens to be right at the bottom of my list of charitable organisations, lol.


----------



## wiak

Quote:


> Originally Posted by *Alatar*
> 
> My personal opinion of this card after looking at the performance numbers and considering the memory amount is that it should be priced at $599 and then it'd be at the perfect spot for the average high end buyer.
> 
> Only real exception being SFF where even higher pricing would be a non issue.


Well, the water cooling by itself costs around 80 bucks...


----------



## dmasteR

Quote:


> Originally Posted by *wiak*
> 
> well the water cooling by its self cost around 80 bucks..


AMD is purchasing these in huge quantities. It would not cost them $80 per unit.


----------



## cbarros82

AMD can keep their water cooling. I'll take a Fury X with an EK block.


----------



## mav451

So there's a Reddit thread talking about multiple revisions of the 15.15 driver: June 15th, 17th, and 20th.
Guess this story isn't over, haha.









https://www.reddit.com/r/buildapc/comments/3b30bt/discussionfury_x_possibly_reviewed_with_incorrect/


----------



## Ganf

Quote:


> Originally Posted by *mav451*
> 
> So there's a Reddit thread talking about multiple revisions to the 15.15 driver: June 15, 17, 20th.
> Guess this story isn't over haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.reddit.com/r/buildapc/comments/3b30bt/discussionfury_x_possibly_reviewed_with_incorrect/


I keep telling you guys that AMD's software department is being slaved and beaten like a bunch of mangy dogs, they've put out so many driver changes across so many different platforms over the last 3 months they're making Nvidia look like outright slobs.


----------



## SpeedyVT

Quote:


> Originally Posted by *Ganf*
> 
> I keep telling you guys that AMD's software department is being slaved and beaten like a bunch of mangy dogs, they've put out so many driver changes across so many different platforms over the last 3 months they're making Nvidia look like outright slobs.


I think NVidia achieved that themselves with just gameworks.


----------



## kingduqc

Quote:


> Originally Posted by *mav451*
> 
> So there's a Reddit thread talking about multiple revisions to the 15.15 driver: June 15, 17, 20th.
> Guess this story isn't over haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.reddit.com/r/buildapc/comments/3b30bt/discussionfury_x_possibly_reviewed_with_incorrect/


http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_980_ti_g1_gaming_soc_review,36.html
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,37.html

Stock at 4K, the 980 Ti G1 is:
7% faster in Hitman
20% faster in BioShock
10% faster in Tomb Raider
28% faster in Metro Last Light
20% faster in Thief
9% faster in BF: Hardline
17% faster in GTA V
32% faster in Witcher 3

Once overclocked, the 980 Ti G1 is:
22% faster in BioShock Infinite
13% faster in Hitman
21% faster in Tomb Raider

There are no magic drivers.
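Percentages like those above are just ratios of the reviewers' average frame rates; a trivial sketch (the FPS values below are made up for illustration, not Guru3D's data):

```python
# Percent by which card A outpaces card B, derived from average FPS.
def pct_faster(fps_a: float, fps_b: float) -> float:
    return (fps_a / fps_b - 1) * 100

# e.g. 64 FPS vs. 50 FPS
print(f"{pct_faster(64, 50):.0f}% faster")  # -> 28% faster
```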


----------



## SpeedyVT

Quote:


> Originally Posted by *kingduqc*
> 
> http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_980_ti_g1_gaming_soc_review,36.html
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,37.html
> 
> Stock in 4k the 980ti g1 is:
> 7% faster in hitman
> 20% faster in bioshock
> 10% faster in tomb raider
> 28% faster in metro last light
> 20% faster in theif
> 9% faster in bf: hardline
> 17% faster in gta V
> 32% faster in witcher 3
> 
> Once overclock the 980ti g1 is :
> 22% faster in bioshock infinite
> 13% faster in hitman
> 21% faster in tomb raider
> 
> There is no magic drivers.


Guru3D, lol... They are massive net trolls. Just ask yourself why all of the benchmarks are so wildly different from each other, some better, some worse. Guru3D is too consistent for their numbers to even be real benchmarks.


----------



## Ha-Nocri

http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,26.html

Reference vs. reference, it's about a tie. Fury wins in more games, but the 980 Ti wins by bigger margins.


----------



## Rei86

Quote:


> Originally Posted by *SpeedyVT*
> 
> If NVidia can't be held accountable for it's poor software. Heck even NVidia drivers have suffered lately because of it.
> 
> I hold AMD accountable for it's slow driver releases.
> 
> Realistically all businesses associated with the release of a product need to be held accountable. There is no fall guy.


And you're being rational and realistic about this?

Do you know how nVidia GameWorks works?

Also, again, that's your opinion, but BAK was published by Warner Brothers, developed by their in-house crew Rock Steady, with the PC version sent off to Iron Galaxy Studios.


----------



## Kaltenbrunner

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Or wait for Air Cooler $550 Fury.


but that won't have enough power for 1440p, let alone at 96 Hz; I like eye candy too much


----------



## BigMack70

Quote:


> Originally Posted by *SpeedyVT*
> 
> Guru3D lol... They are massive net trolls. Just ask yourself why are all of the benchmarks so widely different from each other, some better some worse. Guru3D is too consistent to even be real benchmarks.


Well, it seems all real discussion left this thread a while back, and most of the remaining folks are just the classic AMD conspiracy-theory trolls who think that basically any website that says something they don't like is incompetent/corrupt/bought by Nvidia.

Time for me to unsub... you guys have fun down there in the fanboy sewers


----------



## ambientblue

People were expecting more from the Fury X because of last gen. Despite the heat, the R9 290X came out and handily beat the 780 and nearly reached 780 Ti levels at stock speed, the most important factor being that it cost much less than a 780 Ti. The Fury X is in a similar position this gen, but it doesn't have the benefit of a much lower price this time. On top of that, it's already cooled with a closed-loop cooler, meaning less OC headroom than last gen.


----------



## rdr09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,26.html
> 
> Reference its about tie. Flurry wins in more games, but 980ti wins with bigger margin.


Could be the difference in drivers; they used 15.15 for the one you posted.

I noticed they never updated the driver for the 980 Ti.


----------



## jamaican voodoo

Quote:


> Originally Posted by *hawker-gb*
> 
> If it was significantly slower than 980ti i would buy 980ti.
> It is pretty much similar in performance so i would buy Fury X.
> Simple.


I'm getting one myself by the end of the month. Personally, I think DX12 will be the game changer with this card; my manly intuition tells me so.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> but that won't have enough power for 1440p let alone at 96Hz, I like eye candy too much


I want to move on from my 2 x 290s to go back to a single GPU. I think 1 x Fury is enough for 1440p. My R9 290X OCed is almost enough, getting 40-50 FPS in games.


----------



## Pawelr98

Looking at prices:
On Newegg the 980 Ti is $650, while at Microcenter it's $670.
I'm going to purchase from Microcenter.

Both cards will require some modding/customization.
The Fury X needs reverse-side VRM cooling plus a connection to the main loop.
The 980 Ti would require a complete watercooling solution (full-cover block, or universal waterblock + radiators with a fan).

Both cards are close in pricing and performance, but modding the Fury X will be cheaper (I already have radiators and tubing).
It all comes down to what will be available within the next 2 weeks (my stay in the US). If both are available, I will go with the Fury X due to its smaller size (easier to fit into luggage) and cheaper cooling mod.

I think that even lowering the price to $625 would make the Fury X more competitive.


----------



## sugalumps

Quote:


> Originally Posted by *Pawelr98*
> 
> Looking at prices.
> On newegg 980TI is 650$ while on microcenter it's 670$.
> I'm going to purchase from microcenter.
> 
> Both cards will require some modding/customization.
> Fury X is reverse side VRM's cooling + connection to main loop.
> 980TI would require complete watercooling solution(fullcover block or universal waterblock + radiators with a fan).
> 
> Both cards are close in pricing and perfromance but modding Fury X will be cheaper(I already have radiators and tubing).
> All comes down what will be available within those next 2 weeks (my stay in US). If both will be available then I will go with Fury X due to smaller size (easier to fit into luggage) and cheaper cooling mod.
> 
> *I think that even lowering the price to 625$ would make Fury X more competitive*.


Nah, $25 more for 50% more VRAM is still the better deal.


----------



## Ganf

Quote:


> Originally Posted by *kingduqc*
> 
> There is no magic drivers.


Tell that to Kepler owners who're watching AMD's middle tier creep up on Nvidia's former flagships. Some of them seem to be believing in the magic right about now.


----------



## Unkzilla

Not that I would buy an AMD product anyway, but given I have a Catleap Q270 (dual-link DVI only) and a Bravia UHD TV, I couldn't even connect this card to either of them and get the correct output.

A bit baffling, since AMD is obviously trying to get market share back; surely you would cater for these connections.


----------



## Kaltenbrunner

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I'm getting one myself by the end of month, personally i think DX12 will my the game change with this card my manly intuition tells me so.


I expect it will take me a while to sell one R9 290 locally, so I'll be waiting till late summer, I'd say. Hope AMD makes price or driver moves by then.


----------



## Kaltenbrunner

So, good work from AMD either way. What improvements did they make, anyway, to get more out of 28nm compared to the 290X? Wish I knew/had learned a ton about the technical side of all this.


----------



## jamaican voodoo

Guys, we've still got half a year to go; things with this card can go either way. I almost regretted my purchase of 7970s when the GTX 680 launched, but I kept feeling things would change, and they did with those wonder drivers. Many of you think that AMD doesn't know what they are doing, but I can tell this card was designed to perform in Win 10 and DX12 titles. Why do you think AMD is promoting DX12 so hard? Haha, come on guys, the Xbox One has GCN in it. Do you really think Maxwell will benefit as much? You must be joking if you do.

I can't wait to see your faces when the Fury X is kicking Maxwell's butt in the upcoming Fable Legends and Deus Ex DX12 titles. DRAW CALLS MATTER; WE CAN SEE THAT WITH DX11 AND MAXWELL.


----------



## Sashimi

Have a laugh guys. AMD R9 Furry X:


----------



## jamaican voodoo

Quote:


> Originally Posted by *Sashimi*
> 
> Have a laugh guys. AMD R9 Furry X:


HAHAHA!!! good one







:thumb:


----------



## Kinaesthetic

To debunk the driver discrepancy rumors:

Straight from AMDMatt over at Overclockers:

Link: http://forums.overclockers.co.uk/showthread.php?p=28230087#post28230087

Quote:


> Yep, no.
> 
> For reference on what driver strings mean by the way:
> 15.15-150611a-185358E (This is the real driver we provided)
> 15.15 - is the branch
> 150611 - is the date of the build, YY/MM/DD
> 185358 - is the build request from our system to create this driver based off the information above
> 
> In this case the review is suggesting a driver dated from June 12th 2018 and the build request that has a letter instead of a number for its last digit so it's either a lot of typos or someone being misleading on purpose.


Can we just enjoy this darn card now instead of trying to nitpick every single itsy bitsy thing that tries to support either fanboy side's opinion?
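For what it's worth, the string format AMDMatt describes is mechanical enough to check with a few lines of code. Here's a rough sketch (my own illustration, not an AMD tool; `parse_driver_string` is a made-up name):

```python
from datetime import datetime

def parse_driver_string(s):
    """Split a driver string like '15.15-150611a-185358E' into branch,
    build date, and build request, per the breakdown quoted above."""
    branch, date_part, build = s.split("-")
    # The date field is YYMMDD; a trailing letter (e.g. 'a') is a revision suffix.
    build_date = datetime.strptime(date_part[:6], "%y%m%d").date()
    return branch, build_date, build

print(parse_driver_string("15.15-150611a-185358E"))
# ('15.15', datetime.date(2015, 6, 11), '185358E')
```

A driver string claiming a 2018 build date, checked this way, would immediately stand out as bogus in mid-2015.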


----------



## Sashimi

Quote:


> Originally Posted by *jamaican voodoo*
> 
> HAHAHA!!! good one
> 
> 
> 
> 
> 
> 
> 
> :thumb:


Cheers.

Review conclusion: super cuddly lol.


----------



## Ganf

Quote:


> Originally Posted by *Kinaesthetic*
> 
> To debunk the driver discrepancy rumors:
> 
> Straight from AMDMatt over at Overclockers:
> 
> Link: http://forums.overclockers.co.uk/showthread.php?p=28230087#post28230087
> 
> Can we just enjoy this darn card now instead of trying to nitpick every single itsy bitsy thing that tries to support either fanboy side's opinion?


Almost.

Hexus has always been a review site I couldn't find fault with.

http://hexus.net/tech/reviews/graphics/84170-amd-radeon-r9-fury-x-4gb/

And they sure as hell stay away from the politics and drama that other review sites seem to thrive on.

That's two outlying reviews that put the Fury X on par with the 980 Ti.

Edit: They also don't have a single borked test showing the Fury X tanking for no reason. If anyone can explain to me why they can get the card to run fine on GTA V while nobody else can I'll politely bow out.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> I'm on the fence with all these cards currently, I almost bought a 970 when it came out, then I almost picked up a cheap sapphire tri-X 290.
> 
> I usually go for the mid range type cards hoping for value at 1080p.
> 
> It's actually the software more than anything that has turned me away from doing it, if my 7970 crumbles with star citizen (which it no doubt will) I'm pretty sure I'll be straight down the shop buying whatever is available at the time.
> 
> I'm not flying a fancy ship at low framerates


I am in the same boat as you; I get indecisive about this stuff because I wait too long, see the new tech, and tell myself I am going to wait.

What I am trying to do is find a dang EK water block for this MSI 7970 Lightning... I thought I had one but it was already sold...

I might end up getting a used 290X with a WB already attached, idk...


----------



## Fifth Horseman

I noticed a few places have complained about the thermal performance of the radiator fan. Can anyone confirm whether it will be removable or an integral part of the card? I bet a nice Noctua or Corsair fan would do better.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> Edit: They also don't have a single borked test showing the Fury X tanking for no reason. If anyone can explain to me why they can get the card to run fine on GTA V while nobody else can I'll politely bow out.


FXAA instead of MSAA? That puts more emphasis on shaders and less on ROPs maybe. Tech Report got similar results and they used FXAA also.


----------



## Bartouille

Quote:


> Originally Posted by *Fifth Horseman*
> 
> I noticed a few places have complained about the thermal performance of the radiator fan, can anyone confirm if it will be removable or a integral part of the card, cause I bet a nice Noctua fan or Corsair would do better.


This fan blows away any Noctua or Corsair fan. I'm pretty sure you can replace the fan if you wish tho.


----------



## Fifth Horseman

Quote:


> Originally Posted by *Bartouille*
> 
> This fan blows away any Noctua or Corsair fan. I'm pretty sure you can replace the fan if you wish tho.


I would have to question that, considering a few of the reviews reported inconsistencies in the fan RPMs and ramp-up time. Is that a software issue or a fan issue? I'm not sure.


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> FXAA instead of MSAA? That puts more emphasis on shaders and less on ROPs.


http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V

You mean like that?

MSAA and ROPs don't explain that; that's just using an outlier as your main sample.
Quote:


> Originally Posted by *Fifth Horseman*
> 
> I would have to question that, considering a few of the reviews reported inconsistencies in the fan speed rpms and ramp up time, is that a software issue or a fan issue, i'm not sure.


That's a software issue, and indicative of unfinished drivers, of all things.

As with any GPU, you can control the fan profile to be as aggressive or as quiet as you like, and the GT's are quiet.


----------



## fatmario

I am sure the AMD Fury X will shine over time; it will definitely get performance boosts in future driver updates, just like the AMD 7970 did, where huge performance improvements were made over the past 3.5 years.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V
> 
> You mean like that?


That's the only one I saw using 4x MSAA. Everyone else was 2X or FXAA at 4K (of the 5 or 6 I checked). That average is so low because the frame rate tanked in the middle (and they had stuttering). Not sure those things are a coincidence.

Techpowerup is the outlier, they had something like 30 FPS with FXAA (although the 980 Ti also had crappy frame rates, so they've got something else going on, maybe Win 7?)


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> That's the only one I saw using 4x MSAA. Everyone else was 2X or FXAA at 4K (of the 5 or 6 I checked). That average is so low because the frame rate tanked in the middle (and they had stuttering). Not sure those things are a coincidence.
> 
> Techpowerup is the outlier, they had something like 30 FPS with FXAA.


There's a 290x in the same graph. Why didn't it tank? Same ROPS, same VRAM.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> There's a 290x in the same graph. Why didn't it tank? Same ROPS, same VRAM.


Magic?


----------



## Aaron_Henderson

How come all of a sudden 4K benches are the only ones that matter? I see a ton of people with 1080p/1440p displays saying the Fury X is at least as fast as, or faster than, the 980 Ti. And please don't respond with any of this "fanboy" garbage...


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> Magic?


Yeah, there's been a whole lot of magic in how people have been cherry-picking their graphs and arguments for the last 1800 posts.

I stayed out of it because I can't work and post in a fast-moving thread like that and get anything done, but there are a lot of reviews on this card that are just screwed up.

Edit: And it's AMD's fault.
Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


Because not a whole lot of people buy a $650 GPU, or two, or four, to use on a $150 monitor. Simplest explanation.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> I stayed out of it because I can't work and post in a fast moving thread like that and get anything done, but there are a lot of reviews on this card that are just screwed up.


Simple solution...don't work.


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> Simple solution...don't work.


Then what's the point of commenting on the new cards if I can't buy them?

WIC doesn't scan at Newegg, son.


----------



## azanimefan

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


Because if you're playing at 1080p, an R9 280X is enough.

If you really want to overkill 1080p, get yourself a card made for 1440p like a GTX 970 or R9 290X.

You don't spend $700 on a GPU to game on a $120 1080p monitor.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *Ganf*
> 
> Because not a whole lot of people buy a $650 GPU, or two, or four, to use on a $150 monitor. Simplest explanation.


Makes sense, I suppose


----------



## dmasteR

Quote:


> Originally Posted by *Ganf*
> 
> Yeah, there's been a whole lot of magic in how people have been cherry picking their graphs and arguments for the last 1800 posts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I stayed out of it because I can't work and post in a fast moving thread like that and get anything done, but there are a lot of reviews on this card that are just screwed up.
> 
> Edit: And it's AMD's fault.
> Because not a whole lot of people buy a $650 GPU, or two, or four, to use on a $150 monitor. Simplest explanation.


People also want to buy a $650 GPU for 1080p 144fps like myself :]

Playing games at 60fps just feels awful.


----------



## Ganf

Quote:


> Originally Posted by *dmasteR*
> 
> People also want to buy a $650 GPU for 1080p 144fps like myself :]


Then you're better off with the 980ti until DX12 takes over.


----------



## 12Cores

Any word on them unlocking voltage control on this card, the 7970/290x reference cards shipped with unlocked voltage?


----------



## hamzta09

Quote:


> Originally Posted by *azanimefan*
> 
> because if you're playing in 1080p an r9-280x is enough.
> 
> if you really want to overkill 1080p get yourself a card made for 1440p like a gtx970 or r9-290x.
> 
> you don't spend $700 on a gpu to game on a $120 1080p monitor.


I love how everyone assumes 1080p means <60 fps.
1080p gaming today is all about 120-144Hz, meaning 120-144 fps.

Thus a 280X is farrrrrrrrrrrrrrrrrrrrr from enough.
Heck, a 980 Ti isn't even enough.

Why anyone wants to play on a 60Hz monitor with that amount of ghosting, motion blur and whatnot is beyond me.
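As a point of reference for the refresh-rate debate (my own back-of-the-envelope numbers, not from the post): the per-frame time budget is just the reciprocal of the refresh rate, which is why sustaining 144 fps is so much harder than 60.

```python
# Frame-time budget: a GPU must finish each frame in 1000 ms / refresh rate
# to sustain that rate without drops.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 120 Hz -> 8.3 ms per frame
# 144 Hz -> 6.9 ms per frame
```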


----------



## curlyp

Quote:


> Originally Posted by *iLeakStuff*
> 
> Get the 980 Ti Hybrid. Sure it cost $100 more, but it got stellar cooling and noise, plus the performance should be a good deal higher than Fury X.
> Should be well worth the money I think
> 
> Thats what I plan to do as well, unless I do water on everything. Then I just get a 980Ti for $650 and use EK Waterblock
> 
> Gamernexus did a review on it
> http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks/Page-2
> 
> edit: and Jayz2cents
> https://www.youtube.com/watch?v=qtRqmzRMar8


Thanks for your suggestion.

I would really like to give Radeon's new series of cards a try. Based on your opinion (and others'), what do you think about 2x Crossfire MSI 390X Gaming cards? Would they outperform the R9 Fury X?


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> I love how everyone assumes 1080p means <60fps.
> 1080p gaming today is all about 120-144hz meaning 120-144fps.
> 
> Thus 280X is farrrrrrrrrrrrrrrrrrrrr from enough.
> Heck a 980 Ti isnt even enough.
> 
> Why anyone wants to play on a 60hz monitor with that amount of ghosting, motionblur and what not.. is beyond me.


I play RPG's and space sims. Motion blur and ghosting are not my bogeymen, I just want more screen real estate.


----------



## iinversion

Quote:


> Originally Posted by *curlyp*
> 
> Thanks for your suggestion.
> 
> I would really like to give Radeon's new series of cards a try. Based on your opinion (and others), what do you think about 2x Crossfire MSI 390X Gaming cards? Would they out perform the R9 Fury X?


The 300 series is not new at all. They are the same as the R9 290X 8GB versions, just with slightly higher clocks and a new sticker. If you want to go that route, get two of those instead and save $50 per card:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202144&cm_re=sapphire_r9_290x_8gb-_-14-202-144-_-Product

However, a single card solution should always be preferred over any dual card setup.


----------



## Casey Ryback

Quote:


> Originally Posted by *Klocek001*
> 
> blasphemy! you seem to dislike most games. why would you even need a gpu then?


lol, just because I don't like the small 1% of games that get benchmarked doesn't mean I don't like gaming.

My friends and I finished Dying Light pretty quickly (under 20 hours iirc), did most of the side quests in that time too, and after that the replay factor was nonexistent.

Graphics were nice but it just didn't have the gameplay for us.

Just for you I'll list some games I've put hours into........

Arma 3
BF3/BF4
COH/COH2
Payday 2
Rust
Space engineers
Starbound
Streetfighter 4

You do know that the biggest games in the world, like CSGO, Dota, etc., are never part of benchmarks, right?


----------



## BinaryDemon

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


Because when you spend $650, you want to know how the card is going to perform in the future as well as the present. 4K is the future. OK, some people are running 4K now, but most aren't maxing games out smoothly at 4K yet.


----------



## keikei

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


All those resolutions matter, but 4K, being the hardest setting to run, does get more attention.


----------



## DividebyZERO

Quote:


> Originally Posted by *BinaryDemon*
> 
> Because when you spend $650, you want to know how the card is going to perform in the future as well as the present. 4K is the future.. ok some people are running 4k now, but most people aren't maxing out smoothly at 4k yet.


Quote:


> Originally Posted by *keikei*
> 
> All those resolutions matter, but 4k being the harder setting to run it does get more attention.


don't forget VSR/DSR also.


----------



## Casey Ryback

Quote:


> Originally Posted by *GorillaSceptre*
> 
> The Fury may get close in raw performance, but AMD are still behind Nvidia in nearly every other aspect. If it offered 20% over the competition i would of gone for it, but now it comes down to which brand has proven over and over again to offer a better experience, and imo it's Nvidia.


How often does a card from either side offer 20% more performance for the same price, though? Practically never.

You're just being unrealistic.

It would've been nice if the Fury was $599 vs the 980 Ti's $649, but it wasn't to be.


----------



## Casey Ryback

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


Because the Fury was designed for high resolutions. It is aimed at 4K, and also does well at 1440p.

If you bought it over the ti for 1080p then it would be pretty foolish from a price/performance perspective.


----------



## Pawelr98

Quote:


> Originally Posted by *sugalumps*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pawelr98*
> 
> Looking at prices.
> On newegg 980TI is 650$ while on microcenter it's 670$.
> I'm going to purchase from microcenter.
> 
> Both cards will require some modding/customization.
> Fury X is reverse side VRM's cooling + connection to main loop.
> 980TI would require complete watercooling solution(fullcover block or universal waterblock + radiators with a fan).
> 
> Both cards are close in pricing and perfromance but modding Fury X will be cheaper(I already have radiators and tubing).
> All comes down what will be available within those next 2 weeks (my stay in US). If both will be available then I will go with Fury X due to smaller size (easier to fit into luggage) and cheaper cooling mod.
> 
> *I think that even lowering the price to 625$ would make Fury X more competitive*.
> 
> 
> 
> Nah $25 for 50% more vram is still a better deal.

I game at 2560x1080, so it's not like I need that additional VRAM that much. I will use supersampling in some games though (like Arma 3).
I appreciate the smaller size (easier to put in luggage) and stock AIO cooler (easier to mod into a full-loop watercooling setup) more.

But as I said before, it all depends on what shows up in the store before I leave the US.

On the other hand, does any review have a chart of GPU clocks? I'm wondering about OCP (the power limit).
For example, I managed to get clock slowdowns on the 880MHz BIOS on an HD 6990 with its 450W OCP; a +20% power limit got rid of it, but I'm still curious whether any review has info about that.


----------



## hamzta09

Quote:


> Originally Posted by *Ganf*
> 
> I play RPG's and space sims. Motion blur and ghosting are not my bogeymen, I just want more screen real estate.


There's still motion blur/ghosting when you pan the camera.


----------



## Majin SSJ Eric

I think the most disappointing thing about this card so far is the overclocking, or lack thereof. Now, it must be said that we really don't know how it will OC in the future; we are just one day out from release, and it is certainly conceivable that over the next month or so we will get voltage control. But I was definitely hoping for something in the 1300-1400MHz range. Had we gotten such overclockability, I don't doubt the Fury X would have been on more equal footing with GM200.

At stock the card hangs just fine with the 980 Ti, but we all know what a monster overclocker the Nvidia card is, so there is no doubt Fury will get destroyed in the hands of the typical OCN member, and thus the disappointment. I guess given GCN's history it would be kind of silly to expect 1400MHz out of Fury, but AMD was throwing around words like "overclocker's dream," so it's not entirely our fault for getting our hopes up. Anyway, this is NOT a fail of a card by any means, but it has to play second fiddle to Nvidia... AGAIN.


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> Because the fury was designed for high resolutions. It is aimed at 4K , and also does well at 1440p.
> 
> If you bought it over the ti for 1080p then it would be pretty foolish from a price/performance perspective.


What's with this bollocks on OCN? Are there no gamers here?

Why would it be foolish at 1080p?


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> Whats with this bollocks on OCN? Are there no gamers here?
> 
> Why would it be foolish at 1080p?


Err, because the Fury does terribly at 1080p.

Why wouldn't you buy the Ti instead?

What bollocks are you carrying on about? I think you need to read my statement again, or I need to reword it for you.

The 980 Ti is a no-brainer for high fps at 1080p.

The fury is still viable for 1440p/4K.

edited


----------



## Blackops_2

Are there any OCing results with voltage control yet?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Casey Ryback*
> 
> err because the fury does terribly at 1080p.
> 
> Why wouldn't you buy the ti instead?
> 
> What bolloks are you carrying on about..........think you need to read my statement again, or I need to reword it for you.
> 
> The 980ti is a better card for 1080p......................


Everyone has been saying for years that AMD does badly at 1080p, and now 1440p as well. It's all that DX11 CPU overhead and nothing more; DX12 will be the fix once and for all. I too would recommend GeForce for 1080p gaming. I have a projector that runs 1280x800 @ 120Hz, and it's pathetic how the R9 290X performs there compared to 1440p: it's about 1/4 the pixels, but I only get 2x the fps at best.
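That "1/4 the pixels" figure checks out roughly; a quick back-of-the-envelope calculation (mine, not from the post):

```python
# Pixel counts for the two resolutions mentioned above.
px_projector = 1280 * 800   # 1,024,000 pixels
px_1440p = 2560 * 1440      # 3,686,400 pixels
print(f"ratio = {px_projector / px_1440p:.3f}")  # ratio = 0.278, i.e. roughly 1/4
```

So with ~3.6x fewer pixels to draw, getting only ~2x the fps does suggest something other than pixel throughput (e.g. the CPU) is the bottleneck at low resolutions.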


----------



## gamervivek

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> How come all of a sudden 4K are the only benches that matter? I see a ton of people with 1080P/1440P displays saying the Fury X is at least as fast, or faster, than the 980 Ti? And please don't respond with any of this "fanboy" garbage...


It seems to be the opposite of when AMD would do better at lower resolutions and then drop off at higher resolutions.

And apparently downsampling is a way of life now with Nvidia's introduction of DSR

(but of course AMD couldn't get reviewers to test for that).


----------



## Aaron_Henderson

I mean, hopefully these cards sell well enough... it sounds like they might sell OK. We'll see; maybe I can get hold of a used one in about 6 months or so for a good price.

I wonder if we'll see some decent aftermarket air coolers for the people not wanting the stock watercooling? With the card as small as it is, it might be tough to air cool without a giant heatsink hanging over the end of the card. I just can't force myself to be OK with installing the stock Fury X water cooling setup into my case... that's really one of the biggest hurdles for me in finding interest in these cards.

Anyway, it's not like I'm lined up and ready with $650+ in hand... it's just more than I would want to spend on a new GPU. So it's either wait for the used market, which, due to supply, I am not sure will even exist, or just grab another 290X for cheap and be happy until the next set of cards is out. I think I am over my initial disappointment, but it's still tough to see a reason to buy the Fury X... I might sit on my single 290X for a bit longer and see what plays out over the next few months.


----------



## Casey Ryback

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Wonder if we'll see some decent aftermarket air coolers for the people not wanting the stock WCing? With the card as small as it is, it might be tough to air cool without a giant heatsink hanging over the end of the card. I just can't force myself to be OK with installing the stock Fury X water cooling setup into my case...that's really one of the biggest hurdles for me really finding interest in these cards. Anyway, it's not like I am one lined up and ready with $650+ in hand...it's just more than I would want to spend on a new GPU.


Yep, we'll see AIB coolers on the next Fury card, set for release later next month. There's also the Fury Nano.

The cards won't struggle to be cooled compared to any current GPUs like the 290X/980 Ti etc.

It's a bit expensive, but look at it this way: consider it their expensive, ridiculously priced flagship, much like the Titan in many regards, not really justifiable to the majority of the market.

I think you'll find the air-cooled Fury may be more reasonably priced: for one thing, they can drop that expensive cooling solution, and for another it'll be a cut-down chip.


----------



## Boomstick727

Quote:


> Originally Posted by *Casey Ryback*
> 
> err because the fury does terribly at 1080p.


It's not terrible. You can always run VSR and use higher resolution on a 1080P screen for some extra image quality.

My experience with Fury X @ 1080P so far:

*The Witcher 3 1080P @ Ultra with Hairworks Off*

*Fury X @ stock
5820 K @ stock*
_
Using Vsync, I wanted to see if the card could maintain 60 FPS. For the most part it's solid, but it randomly hitches then goes back to 60 FPS; memory usage is under 2GB. I can't replicate it in the same area, it seems random. Could be a driver issue? Overall it runs nice and the game looks great. I have sharpening on low, as high looks horrible to me.

Temps @ 50C, cooler is quiet.

Pics in spoiler, can see temps and memory usage etc._


Spoiler: Warning: Spoiler!



















*Crysis 3 1080P @ Very High SMAA X 1*

Runs really nice some drops below 60 some highs but average around 60FPS, with V-Sync on it's nice. Memory usage around 1400MB.

*Highs 80+
Average 60
Lows 50*
_
*Fury X @ 1120Mhz
5820K @ Stock*_

Pics below, see temps vram usage.


Spoiler: Warning: Spoiler!



*Settings:
*



*V Sync ON*






*V Sync OFF*
















*Far Cry 4 @ 1080P Ultra.*

_This game runs awesome, day and night over the 290X. All the settings up, looks great. Temps peaked at 55C. Memory usage around 3600MB._
*
Fury X @ stock
5820 K @ stock*

*Lows 60 FPS
Average 75FPS
Highs 90FPS+*

Pics with stuff:


Spoiler: Warning: Spoiler!









*Yay Elephant !*



*Sadly he died horrifically after we fell down a cliff R.I.P*


----------



## sugalumps

Well, if people's predictions for DX12 gains are as accurate as their predictions of the Fury "destroying" the Ti, then I am not too hopeful.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Casey Ryback*
> 
> Yep we'll see AIB coolers on the next fury card set for release later next month. There's also the fury nano.
> 
> The cards won't struggle to be cooled compared to any current gpus like the 290X/980ti etc.
> 
> It's a bit expensive, and look at it this way, consider it their expensive, ridiculously priced flagship,
> 
> Much like the titan card in many regards, not really justifiable to a majority of the market.
> 
> I think you'll find the air cooled fury may be more reasonably priced as for one thing they can drop that expensive cooling solution, second thing is it'll be a cut down chip.


The Fury will be $50 more than the GTX 980. Surely it will be faster, so it should be an easy choice, one would think.
Quote:


> Originally Posted by *sugalumps*
> 
> Well if the peoples predictions for dx12 gains are the same as their predictions for the fury "destroying" the ti then I am not to hopefull


AMD is just really bad with DX11 CPU overhead. The moment a game becomes CPU-limited, AMD cards can't output any more fps; you can see it in games like TR. That's why, as you increase resolution, AMD = Nvidia and sometimes better, but at 1080p Nvidia walks all over them. If you take the Fury X and GTX 980 Ti, you will probably find the GTX 980 Ti ~20% faster at 1080p, and 15% of that is probably because of DX11.


----------



## Casey Ryback

Quote:


> Originally Posted by *Boomstick727*
> 
> It's not terrible. You can always run VSR and use higher resolution on a 1080P screen for some extra image quality.


Good point. And yes terrible was an exaggeration.

According to benchmarks the fury does fall behind the 980ti at 1080p, and the 980ti has the OC headroom.

Me personally I don't think the fury is a bad buy at all. Congrats dude those results look good


----------



## Casey Ryback

Quote:


> Originally Posted by *sugalumps*
> 
> Well if the peoples predictions for dx12 gains are the same as their predictions for the fury "destroying" the ti then I am not to hopefull


Are you replying to someone..........?

Or did you just see some pages without anti-Fury posts and feel the need to share your negative and pessimistic views?

Shoo fly


----------



## Bartouille

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think the most disappointing thing about this card so far is the overclocking, or lack thereof. Now it must be said that we really don't know how it will OC in the future as we are just one day out from release and it is certainly conceivable that over the next month or so that we will get voltage control but I was definitely hoping for something in the 1300-1400MHz range. Had we gotten such overclockability I don't doubt that Fury X would have been on more equal footing with GM200. At stock the card hangs just fine with 980Ti but we all know what a monster overclocker the Nvidia card is so there is no doubt Fury will get destroyed in the hands of the typical OCN member and thus the disappointment. I guess given GCN's history it would be kind of silly to expect 1400MHz out of Fury but AMD was throwing words around like "overlcoker's dream" so its not entirely our fault for getting our hopes up. Anyway, this is NOT a fail of a card by any means but it has to play second fiddle to Nvidia... AGAIN.


This will likely OC around the same as Hawaii with voltage control, so probably 1.2GHz at most with a reasonable +100mV. This card is already running cool, so there isn't much more to squeeze out at lower temps (not talking about LN2 stuff, obviously, lol). Anyway, I think if voltage control were such a game changer, AMD would have had it working on release day, but they didn't.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bartouille*
> 
> This will likely oc around the same as hawaii with voltage control. So probably 1.2ghz at most with a reasonable +100mv. This card is already running cool so there isn't that much more to squeeze with lower temp (not talking about ln2 stuff obviously lol). Anyway, I think if voltage control was such a game changer AMD would have had that working on release day, but they didn't.


The only thing we don't know 100% (or I have missed it) is how cool the VRMs run. They play a big part in overclocking.


----------



## Casey Ryback

Quote:


> Originally Posted by *Bartouille*
> 
> This will likely oc around the same as hawaii with voltage control. So probably 1.2ghz at most with a reasonable +100mv. This card is already running cool so there isn't that much more to squeeze with lower temp (not talking about ln2 stuff obviously lol). Anyway, I think if voltage control was such a game changer AMD would have had that working on release day, but they didn't.


Yeah, I'd say they will top out at 1250MHz or so, although we can't be sure.

Hopefully the air version comes out at 900MHz, for a lot cheaper, with voltage control, and therefore has a heap of headroom.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only think we dont know 100% or i have missed how cool the VRM run. They play a big part on overclocking.


The VRMs run quite hot, but that's because the Fury X is in a concealed box; the air-cooled Fury will be much better for VRM temps.


----------



## Kinaesthetic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only think we dont know 100% or i have missed how cool the VRM run. They play a big part on overclocking.


I think about 10 or so pages ago it was shown that the VRMs, some of which are on the back, run as hot as 103°C at stock voltage. And since the backplate doesn't have thermal pads making contact with them, it is actually trapping heat on the back of the card.


----------



## p4inkill3r

Quote:


> Originally Posted by *Boomstick727*
> 
> 
> 
> It's not terrible. You can always run VSR and use higher resolution on a 1080P screen for some extra image quality.
>
> *[full benchmark results snipped]*


Great information, thanks!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I think about 10 or so pages ago, it was shown that the VRMs, some of which are on the back are running as hot as 103oC on stock voltage. And since the backplate doesn't have thermal pads making contact with them, it is actually insulating heat on the back of the card.


Then could that be a reason for the poor stock OC? I really hate how reviewers these days don't fully explore the card. I don't care about FPS; there are a million reviews for that.


----------



## Kaltenbrunner

1 million reviews


----------



## curlyp

Quote:


> Originally Posted by *rt123*
> 
> If you are gonna use FireStrike to judge a GPUs performance, you are better of getting Nvidia.


Why do you say that?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I think about 10 or so pages ago, it was shown that the VRMs, some of which are on the back are running as hot as 103oC on stock voltage. And since the backplate doesn't have thermal pads making contact with them, it is actually insulating heat on the back of the card.


I thought the vrms were actively cooled by the aio?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I thought the vrms were actively cooled by the aio?


Yeah, that is what it looks like in the picture, but apparently they still run hot. I hope they really are cooled by the AIO, because 100°C VRMs will not let you OC at all.


----------



## Redwoodz

Asus 27" 1440p Freesync IPS 4ms 144Hz displayport http://www.asus.com/us/Monitors/MG279Q/overview/

DX12 and FuryX-the greenteam dropping prices.

If you game on a tv...you're doing it wrong.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Redwoodz*
> 
> Asus 27" 1440p Freesync IPS 4ms 144Hz displayport http://www.asus.com/us/Monitors/MG279Q/overview/
> 
> DX12 and FuryX-the greenteam dropping prices.


That screen is amazing. Sow it on person. [sic]


----------



## Ganf

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I thought the vrms were actively cooled by the aio?


They are, and then it seems the other half of them are on the back of the PCB and don't get cooled nearly as well. A really sad oversight. Nothing a modder can't fix in 5 minutes, but we shouldn't have to be fixing brand-new cards.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Then could that be a reason for poor stock OC? I really hate how reviewers these days don't fully explore the card. I dont care about fps. There are 1 million reviews for that.




"After removing the backplate, we can observe that the VRMs reach a little over 100°C. Since this plate does not participate in the cooling, the temperature is probably a little higher when it is actually in place.

While these components are designed for such temperatures, the R9 Fury X cooling system shows its limits at this level, and we would not count on large overclocking potential with a big vmod."

Translated


----------



## Kinaesthetic

Quote:


> Originally Posted by *Redwoodz*
> 
> Asus 27" 1440p Freesync IPS 4ms 144Hz displayport http://www.asus.com/us/Monitors/MG279Q/overview/
> 
> DX12 and FuryX-the greenteam dropping prices.
> 
> If you game on a tv...you're doing it wrong.


I hate to be the prick, but you conveniently left out the 35-90Hz VRR window, which games will frequently fall out of unless the only thing you ever play is AAA titles.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah that is what it looks like in the picture but apparently they still run hot. I hope thats the case because 100C VRMs will not let you OC at all.
> 
> Some are on the back and are being insulated by the backplate which doesn't make contact with them:


----------



## NuclearPeace

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I thought the vrms were actively cooled by the aio?


There is a copper tube, part of the return run from the radiator to the pump/coldplate housing, that is supposed to make contact with and cool the VRMs. Either it's not making proper contact or six phases just isn't enough. I suspect the former.

100°C+ VRM temperatures at stock is bad. High-quality MOSFETs are rated to around 120°C before things start to get hairy (a positive feedback loop of more heat > worsening performance > more heat, and so on), and nearby capacitors lose lifespan quickly as heat builds up.
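The feedback loop described above can be sketched numerically. Every number below is a made-up illustration, not a Fury X measurement; the point is only that the heat balance either settles at a stable operating point or runs away once the feedback gain passes 1:

```python
# Hypothetical illustration of VRM thermal runaway; none of these values
# are measured Fury X figures.
def settle_temp(t_amb, r_th, i_rms, rds_on_25c, tc_per_c, iters=200):
    """Iterate the heat balance T = T_amb + R_th * I^2 * Rds_on(T).

    Rds_on rises roughly linearly with temperature (tc_per_c per degree C),
    which is the positive feedback: more heat -> higher on-resistance ->
    more conduction loss -> more heat.
    """
    t = t_amb
    for _ in range(iters):
        rds = rds_on_25c * (1.0 + tc_per_c * (t - 25.0))
        t_new = t_amb + r_th * i_rms ** 2 * rds
        if abs(t_new - t) < 1e-6:   # converged to a stable operating point
            return t_new
        t = t_new
    return float("inf")             # never settled: thermal runaway

# Modest load: settles in the mid-80s C.
print(settle_temp(50.0, 8.0, 30.0, 0.004, 0.004))
# Crank current and thermal resistance until the loop gain exceeds 1:
print(settle_temp(25.0, 20.0, 60.0, 0.004, 0.004))  # inf
```

Whether the loop converges depends on the gain `r_th * i_rms**2 * rds_on_25c * tc_per_c`; below 1 it settles, above 1 each pass adds more heat than the last.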


----------



## harney

Said it before: ouch. That has to be the reason for the low stock speed... you would have thought adding the VRMs to the water loop would have been a better option.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kinaesthetic*
> 
> I hate to be the prick, but you conveniently left out 35-90Hz for the VRR window. Of which games will frequently run out of that refresh rate window unless the only thing you ever play are AAA games.


What was AMD thinking? This means we need an active plate even with a custom water block. More money if you go custom.


----------



## hamzta09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That screen is amazing. Sow it on person.


Shouldn't sow on people.
Quote:


> Originally Posted by *Redwoodz*
> 
> Asus 27" 1440p Freesync IPS 4ms 144Hz displayport http://www.asus.com/us/Monitors/MG279Q/overview/
> 
> DX12 and FuryX-the greenteam dropping prices.
> 
> If you game on a tv...you're doing it wrong.


Greenteam so far haven't dropped any prices. Not in the EU, at least.

Also, I'd rather get the Acer Z35.

As long as IPS is trash in terms of glow/clouding, I will stay away from them.


----------



## Pawelr98

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kinaesthetic*
> 
> I think about 10 or so pages ago, it was shown that the VRMs, some of which are on the back are running as hot as 103oC on stock voltage. And since the backplate doesn't have thermal pads making contact with them, it is actually insulating heat on the back of the card.
> 
> 
> 
> I thought the vrms were actively cooled by the aio?

Some of them are cooled by the AIO while others are not.

The ones behind the backplate are not directly cooled (the PCB takes the heat and transfers it to the cooled VRMs). The ones on the same side as the GPU are cooled by a copper pipe with coolant flowing through it.
So if I purchase the Fury X, mod #1 is going to be a VRM cooling modification (remove the backplate and add heatsinks to those hot VRMs).


----------



## NuclearPeace

Quote:


> Originally Posted by *harney*
> 
> 
> 
> 
> 
> said it before OUCH as to be the reason for low stock speed....you would have thought adding the vrms to the water loop would have been a better option


God, those VRMs are scorching. People who are considering hefty overclocks should consider a full-cover water block...

Edit: And also a backplate that actually does something... or it's time to MacGyver some thermal interface material between the VRMs and the backplate.


----------



## Dhoulmagus

Quote:


> Originally Posted by *NuclearPeace*
> 
> God, those VRMs are scorching. People who are considering some hefty overclocks should consider a full cover water block...
> 
> Edit: And also a backplate that does something... or time to MacGyver force some thermal interfacing between the VRM and the backplate.


That's too hot for my tastes. I would definitely grab the EKWB block for this; figure the full coverage on a better pump/rad setup would cool the sucker off, not to mention knocking it down to a single PCI slot. And it would look way cooler!


----------



## The Stilt

There are no components requiring cooling on the back side of the card.
The backplate most likely only makes the temperatures worse, as the heat is trapped between the PCB and the backplate.
Air is one of the best insulators there is.









The backplate obviously only exists for the looks.
Using a solid, dummy backplate is idiotic. At least they could have made perforations matching the pattern of the front rubber material.


----------



## curlyp

Quote:


> Originally Posted by *iinversion*
> 
> The 300 series is not new at all. They are the same thing as the R9 290X 8GB versions just with a slightly higher clock and rebranded with a new sticker. If you want to go that route then get 2 of those instead and save $50 per card
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202144&cm_re=sapphire_r9_290x_8gb-_-14-202-144-_-Product
> 
> However, a single card solution should always be preferred over any dual card setup.


Thanks for the suggestion. I didn't realize it was basically the same as the 290X. Why would people even purchase the 390X when they could purchase the 290X for cheaper, like you said? It probably falls back to the name/title... people want the latest and greatest... haha, I know I do.

I've been looking up some benchmarks and I see the 295X2 outperforming the 390Xs, 980 Ti, Fury X, and Titan X. Any reason why I shouldn't just pick that up? My thought is that it would semi "future-proof" me for the next couple of years with 4K monitors.

This is my lack of knowledge showing (so please don't beat me up too badly!), but will cards that only support DX11 receive a driver/firmware update for DX12?

The two monitors I am deciding between for an upgrade are:

Samsung U28E590D - new 4K FreeSync monitor
LG 34UC97 - new curved widescreen monitor

My only reservation is that not all games support widescreen monitors, so sometimes you are stuck with the black side bars.


----------



## rv8000

Quote:


> Originally Posted by *Pawelr98*
> 
> Some of them are cooled by AIO while others are not.
> 
> The ones behind the backplate are not directly cooled(PCB takes the heat and transfers it to the cooled VRM's). The ones on the same side as the gpu are cooled by cooper pipe(with coolant flowing through it).
> So if I purchase the Fury X then mod#1 is going to be VRM cooling modification(remove backplate and add radiators on those hot vrm's).


From the looks of that naked PCB shot, there are no VRM MOSFETs on the rear side of the card, and that's the part of the VRM section that actually gets hot. I think some of these thermal imaging pics are (a) taken under extreme loads such as FurMark and (b) affected by fan profile/driver issues or poor contact with the pipe.

I guess it's even possible the pump is moving water too quickly, and with the extreme heat VRMs can generate, having such a small-surface-area pipe could negatively affect the VRM temps.


----------



## harney

What is the point of having the card on water when you have the VRMs dumping that amount of heat into your case? I'm not trying to bash AMD here, but it just seems crazy that they didn't design the VRMs into the water loop.

Yes, I understand that's under extreme thrashing, but if people want to overclock it's not going to be far off.


----------



## ZealotKi11er

Quote:


> Originally Posted by *harney*
> 
> What is the point of having the card on water when you have the vrms dumping that amount of heat in your case i am not trying to bash amd here...
> but just seems crazy why they did not design the vrms into the water loop
> 
> Yes i understand that's under extreme thrashing but still if peeps want to over clock then its not going to be far off


No idea. The reviews look good, since core temps are the important aspect.


----------



## Blameless

Quote:


> Originally Posted by *The Stilt*
> 
> There are no components requiring cooling on the back side of the card.
> The backplate most likely only makes the temperatures worse as the heat is trapped between the PCB and the backplate.


Cooling the backside of the card itself, especially in the vicinity of the VRMs, will help VRM temperatures.
Quote:


> Originally Posted by *The Stilt*
> 
> Air is one of the best insulators there is


Which is why I would replace it with something else, allowing the backplate to become a heatsink, or simply remove the backplate and epoxy a bunch of dedicated sinks to the back of the card, then mount a fan over it.
Quote:


> Originally Posted by *harney*
> 
> What is the point of having the card on water when you have the vrms dumping that amount of heat in your case i am not trying to bash amd here...
> but just seems crazy why they did not design the vrms into the water loop


The total heat coming off the PCB isn't that much. It's hitting relatively high temperatures because it's mostly uncooled.

The VRMs are attached to the loop by the copper piping that runs over them. To do better they would have needed to add a fan to the card or used a full cover block, which would have significantly increased cost of the cooling solution.
Quote:


> Originally Posted by *rv8000*
> 
> From the looks of that naked pcb shot, there are no vrm mosfets on the rear side of the card and thats the part of the vrm section that actually gets hot.


The thermal resistance from the top of a MOSFET contacting the plate on the front of the card is probably about the same as the thermal resistance through the whole PCB. The PCB is layers of fiberglass (or something similar) and copper, and most of the underside of the MOSFET is soldered directly to it.

Cooling the back of the card is almost as good as cooling the tops of the FETs.
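That series-resistance argument can be roughed out with a quick estimate. Every layer value here is an assumed round number for illustration, not a measurement of this card; thermal resistances in series simply add, like electrical resistors:

```python
# Rough series thermal-resistance comparison; all layer values are
# assumed round numbers, not measurements of the Fury X PCB.
def series_r_th(*layers):
    """Sum t/(k*A) per layer: thickness [m], conductivity [W/(m*K)], area [m^2]."""
    return sum(t / (k * a) for t, k, a in layers)

AREA = 1.0e-4  # ~1 cm^2 footprint under one power stage (assumed)

# Path 1: through the whole PCB to the back side. Copper planes raise the
# stack's effective conductivity well above bare FR-4 (~0.3 W/(m*K)).
pcb_path = series_r_th((1.6e-3, 3.0, AREA))          # ~5.3 K/W

# Path 2: out the top of the package through a pad to the front plate.
top_path = series_r_th((0.5e-3, 0.9, AREA),          # mold compound
                       (0.3e-3, 3.0, AREA))          # thermal pad
# ~6.6 K/W: the same ballpark, which is the quoted post's point.
```

With numbers in this range the two escape paths are comparable, so a cooled backplate really could do almost as much as top-side contact.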


----------



## harney

Quote:


> Originally Posted by *Blameless*
> 
> The total heat coming off the PCB isn't that much. It's hitting relatively high temperatures because it's mostly uncooled.
> 
> The VRMs are attached to the loop by the copper piping that runs over them. To do better they would have needed to add a fan to the card or used a full cover block, which would have significantly increased cost of the cooling solution.


Fair point


----------



## rv8000

Quote:


> Originally Posted by *Blameless*
> 
> Cooling the backside of the card itself, especially in the vicinity of the VRMs, will help VRM temperatures.
> Which is why I would replace it with something else, allowing the backplate to become a heatsink, or simply remove the backplate and epoxy a bunch of dedicated sinks to the back of the card, then mount a fan over it.
> The total heat coming off the PCB isn't that much. It's hitting relatively high temperatures because it's mostly uncooled.
> 
> The VRMs are attached to the loop by the copper piping that runs over them. To do better they would have needed to add a fan to the card or used a full cover block, which would have significantly increased cost of the cooling solution.
> The thermal resistance to the top of a mosfet contacting the plate on the front of the card is probably about the same as the thermal resistance through the whole PCB. The PCB is layers of fiberglass (or something similar) and copper, and most of the underside of the mosfet is soldered directly to it.
> 
> Cooling the back of the card is almost as good as cooling the top of of the fets.


True, but these crazy temps all seem to have been taken during FurMark loads or insane stress tests, similar to IBT for CPUs. Even with a moderate voltage bump I doubt the VRMs are going to approach their 130°C limit. Under a sustained gaming load with a voltage bump of, say, 100 mV or more, I suspect they'd definitely be around 100°C though.

Once I get my card I'll definitely be doing some snooping around the VRM section to see what kind of thermal transfer aid is there, as well as checking if pump speeds can be adjusted (isn't that what the micro-USB slot on the shroud is for anyway? Just realized it's the BIOS switch and not a micro-USB port







). Now if only I had my own thermal gun lying around


----------



## infranoia

Quote:


> Originally Posted by *curlyp*
> 
> Why would people even purchase the 390X< when they could purchase the 290X for cheaper like you said?


HDMI 2.0. At least on the MSI.

Of course, to get above 30Hz at 4K on Hawaii, you're playing Sins of a Solar Empire or some other very old games.

Scratch that. MSI has modified their product details to clarify it's also HDMI 1.4a only. Original source:

http://www.overclock.net/t/1560814/ocuk-amd-radeon-r9-fury-series-doesnt-have-hdmi-2-0-port-limited-to-1-4/400_100#post_24074102

So, it really is just a GlobalFoundries 290X.


----------



## Blameless

Quote:


> Originally Posted by *rv8000*
> 
> True, but these crazy temps seem to all be taken during furmark loads or insane stress tests similar to IBT for cpus. Even with a moderate voltage bump I doubt the vrms are going to approach there 130c limit. Under sustained gaming load with a voltage bump of say 100mv or more I suspect theyd definitely be around 100c though.


What are these VRMs rated for, current-wise? That 130°C limit will be reached sooner the higher the clock speeds and the heavier the load, as well as with increased voltage.

Even if they are technically within limits at, say, +100mV and a 20% OC, cooler VRMs will still increase longevity and dump less heat into the board and thus the GPU.
Quote:


> Originally Posted by *rv8000*
> 
> Once I get my card I'll definitely be doing some snooping around near the vrm section and what kind of thermal transfer aid is there, as well as checking if pump speeds can be adjusted (isn't that what the micro usb slot on the shroud is for anyways? just realized its the bios switch and not a m-usb port). Now if only I had my own thermal gun laying around


This reminds me of something I was thinking about earlier: What direction is the flow through the loop, and is the pump reversible?

Having cool water from the rad hit the VRM first may help equalize VRM/GPU temps by a few degrees.
Quote:


> Originally Posted by *infranoia*
> 
> Of course, getting up above 30Hz in 4K on Hawaii-- well, you're playing Sins of a Solar Empire or... some other very old games.


_Elite: Dangerous_, as of the 1.3 patch runs at 4k, with Ultra settings, at 40 (certain station interiors) to 110fps (open space), with averages around 60-70, on my 290X.


----------



## ambientblue

Quote:


> Originally Posted by *Blameless*
> 
> Cooling the backside of the card itself, especially in the vicinity of the VRMs, will help VRM temperatures.
> Which is why I would replace it with something else, allowing the backplate to become a heatsink, or simply remove the backplate and epoxy a bunch of dedicated sinks to the back of the card, then mount a fan over it.
> The total heat coming off the PCB isn't that much. It's hitting relatively high temperatures because it's mostly uncooled.
> 
> The VRMs are attached to the loop by the copper piping that runs over them. To do better they would have needed to add a fan to the card or used a full cover block, which would have significantly increased cost of the cooling solution.
> The thermal resistance to the top of a mosfet contacting the plate on the front of the card is probably about the same as the thermal resistance through the whole PCB. The PCB is layers of fiberglass (or something similar) and copper, and most of the underside of the mosfet is soldered directly to it.
> 
> Cooling the back of the card is almost as good as cooling the top of of the fets.


Actually, you don't need to cool the back of the card to get exceptional VRM cooling. That's one reason I got this block for my 290X over EK or Aquacomputer. Still got a backplate though.

http://www.xtremerigs.net/2014/08/11/bitspower-vg-ar290x-review/

What is needed is a full-cover block; this is a good comparison:

http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-detailed-testing-results/


----------



## iinversion

Quote:


> Originally Posted by *curlyp*
> 
> Thanks for the suggestion. I didn't realize it was basically the same as the 290X. Why would people even purchase the 390X< when they could purchase the 290X for cheaper like you said? Probably falls back to the name/title...people want the latest and greatest...haha I know I do.
> 
> I've been looking up some benchmarks and I see the 295X is out performing the 380Xs, 980TI, Fury X and Titan X. Any reason why I shouldn't just pick that up? My thoughts are that it wouldn't semi "future" proof me for the next couple years with 4K monitors.
> 
> This is my lack of knowledge (so please to beat me too bad!), but, cards that only support D11, will they receive a driver/firmware update for D12?
> 
> The two monitors I am deciding to upgrade to are:
> 
> Samsung U28E590D - New 4K FreeSync Monitor
> LG34UC97 - New Cured Widescreen monitor
> 
> My only reservation is not all games support widescreen monitors, which sometime you are stuck with the black side bars.


I don't know. Some people don't realize the 390/X is the same as the 290/X. Cards that only support DX11 will stay that way; there is no driver or firmware update that can change that.

As far as the 295X2 is concerned, it is just two 290Xs on a single card. It is a dual-GPU card, and that is why it performs higher. If you do want to stay with a 290X/390X, definitely get the 290X 8GB and save some money. Otherwise, I personally think the 980 Ti is the route to go among the high-end cards right now.


----------



## Sashimi

Quote:


> Originally Posted by *Blameless*
> 
> Having cool water from the rad hit the VRM first may help equalize VRM/GPU temps by a few degrees.


In water cooling, flow direction makes no difference; in the end the loop and all its components will reach an equilibrium where they stay at a fixed temperature.


----------



## Blameless

Quote:


> Originally Posted by *Sashimi*
> 
> In water cooling the direction makes no difference as in the end the loop and all its components will reach equilibrium where they stay at a fixed temp.


I know the difference between water in and out temps at the radiator is typically very small (~1°C), but I've seen some AIOs with really crappy flow rates that could exacerbate this. Then again, I keep reading that Cooler Master is the pump maker for the Fury X AIO, and I've been pretty impressed with their Nepton 280L, for what it is.

You're right though, flow direction probably wouldn't make any noticeable difference.
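The in-to-out delta scales inversely with flow rate, which is easy to put numbers on. The wattage and flow figures below are assumed for illustration only, not Fury X measurements:

```python
# Back-of-the-envelope coolant temperature rise; wattage and flow rates
# are assumed illustration values, not Fury X measurements.
def coolant_delta_t(power_w, flow_lpm):
    """In-to-out water temperature rise for a given heat load and flow.

    Water: specific heat ~4186 J/(kg*K), density ~1 kg per litre.
    """
    mass_flow_kg_s = flow_lpm / 60.0
    return power_w / (mass_flow_kg_s * 4186.0)

# ~275 W of card heat on a weak AIO flow of 0.5 L/min:
print(round(coolant_delta_t(275.0, 0.5), 1))  # 7.9
# The same heat at a healthier 1.5 L/min:
print(round(coolant_delta_t(275.0, 1.5), 1))  # 2.6
```

Even the weak-flow case only adds a few degrees in-to-out, which is why flow direction around the loop matters so little at equilibrium.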


----------



## iLeakStuff

Quote:


> Originally Posted by *Kinaesthetic*
> 
> To debunk the driver discrepancy rumors:
> 
> Straight from AMDMatt over at Overclockers:
> 
> Link: http://forums.overclockers.co.uk/showthread.php?p=28230087#post28230087
> 
> Can we just enjoy this darn card now instead of trying to nitpick every single itsy bitsy thing that tries to support either fanboy side's opinion?


Even I, as an AMD supporter, think people need to calm down, stop looking for excuses, and see the Fury X as it is.

It's a really great card; it's not as fast as the 980 Ti at 1080/1200/1440/1600p and is about as fast as the 980 Ti at 4K.
Overclocking-wise the 980 Ti seems much better so far.

Take it or leave it.


----------



## iLeakStuff

Quote:


> Originally Posted by *Ganf*
> 
> Yeah, there's been a whole lot of magic in how people have been cherry picking their graphs and arguments for the last
> 
> Edit: And it's AMD's fault.
> Because not a whole lot of people buy a $650 GPU, or two, or four, to use on a 1440p $150 monitor. Simplest explanation.


What a lousy thing to say.
There are people, me included, who think 1440p at 144Hz is superior to 4K at 60Hz. Asus makes one with FreeSync, Acer makes one with G-Sync, both offering glorious IPS displays that cost $600+ and require an even faster card than the Fury X/980 Ti to get full enjoyment out of.

So not only is your $150 display argument very wrong, but so is the "1440p benchies don't matter" one.


----------



## blue1512

Straight from the horse's mouth
http://forums.overclockers.co.uk/showpost.php?p=28230091&postcount=889
Quote:


> Looking good Reece. Just need *voltage control* now to unlock the full power.


Said an AMD representative at OcUK.
If it can reach 1300MHz, it will be a hit.


----------



## Blameless

I fully expect voltage control to happen, very soon, and for it to appreciably improve Fury's OCing potential.

That said, it's not here yet, and I rarely recommend components for what they could be, even if that "could be" is almost a given.


----------



## The Stilt

Voltage can be adjusted right now if you know what you are doing.








It's no different from the Hawaii cards.


----------



## Xuper

I need this but don't have money ...











amd_roy : Room for 3. Little doesn't mean not POWERFULL.

Edit :
Original Source : https://twitter.com/TheMattB81/status/614151356606246914/photo/1
Benchmark : http://www.3dmark.com/3dm/7500097


----------



## Thoth420

Quote:


> Originally Posted by *Xuper*
> 
> I need this but don't have money ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> amd_roy : Room for 3. Little doesn't mean not POWERFULL.
> 
> Edit :
> Original Source : https://twitter.com/TheMattB81/status/614151356606246914/photo/1
> Benchmark : http://www.3dmark.com/3dm/7500097


And here I am worried about snaking the tubing for just one Fury X in the HAF XB I have coming in the mail... I think I will be fine.









Also, whoever's rig that is... wow!!!


----------



## DividebyZERO

Quote:


> Originally Posted by *Thoth420*
> 
> And I am worried about snaking tubing in my HAF XB coming in the mail with just one Fury X.....I think I will be fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also whoever's that is is just ....Wow!!!


Yeah, he has the only 4 in existence. I think 2 other people have 1 each. Judging by them being out of stock everywhere, I guess they sold about 6 or so?


----------



## Thoth420

Quote:


> Originally Posted by *DividebyZERO*
> 
> Yeah, he has the only 4 in existence. I think 2 other people have 1. Judging by out of stock everywhere i guess they sold about 6 or so?


I managed to get an XFX one from TigerDirect pretty easily. I should have it by Monday or Tuesday, but it is going in a new build and the rest of the hardware is lagging a day behind, I believe, so it'll be a bit before I can speak on what I think of it.


----------



## Sashimi

Quote:


> Originally Posted by *Xuper*
> 
> I need this but don't have money ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> amd_roy : Room for 3. Little doesn't mean not POWERFULL.
> 
> Edit :
> Original Source : https://twitter.com/TheMattB81/status/614151356606246914/photo/1
> Benchmark : http://www.3dmark.com/3dm/7500097


Not POWERFUL at all!! In fact they're NOT POWERED.







Joking joking...


----------



## eXe.Lilith

Yeah, obviously we all knew there were gonna be supply issues; we knew that weeks ago. It's fine by me, tbh. I'd rather wait another week for my cards to get here in the hope that by then AMD's driver team is close to releasing a new Catalyst.


----------



## DividebyZERO

Quote:


> Originally Posted by *eXe.Lilith*
> 
> Yeah obviously we all knew there were gonna be supply issues, we knew that weeks ago. It's fine by me tbh, I'd rather wait another week for my cards to get here in hope that by then AMD's driver team is close to releasing a new Catalyst.


Much as I hate to say it, I am skipping this one. I was hoping for more myself, but I am not seeing why HBM is better yet. The competition has GDDR5 and they are performing better. Oh well.


----------



## jamaican voodoo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fury will be $50 more than the GTX 980. Surely it will be faster, and it will be an easy choice, one would think.
> AMD is just really bad with DX11 CPU overhead. The moment the game becomes CPU-limited, AMD cards can't output any more FPS. You can see it in games like TR. That's why, as you increase resolution, AMD = Nvidia and sometimes better, but at 1080p Nvidia walks all over them. If you take the Fury X and GTX 980 Ti you will probably find the GTX 980 Ti ~20% faster at 1080p; 15% of that is probably because of DX11.


And that's all there is to it: Nvidia cards are better with DX11 draw calls. Once DX12 is live it's game over for that 980 Ti, I'm certain of it.


----------



## Casey Ryback

Quote:


> Originally Posted by *DividebyZERO*
> 
> Much as i hate to say it, i am skipping this one. I was hoping for more myself but i am not seeing why HBM is better yet. The competition has GDDR5 and they are performing better. Oh well.


This is a common misconception about HBM.

It doesn't improve GPU core performance.

The PCBs have now been reduced in size, and it has brought total power consumption of the GCN architecture down drastically.

The Fury X now uses similar power to the 290X, yet it has 40% more shaders.

The Furys and the dual-GPU Fury X2 are able to fit in an amazingly small package and will be ideal for beastly small-form-factor rigs.

These are also the very first HBM cards, so don't expect to be blown away just yet.

When the shrink down to 16nm/14nm happens, hopefully AMD will make some architectural changes that will impress you more, and you'll be able to get HBM2 cards with more VRAM if you wish.

Nvidia will also have their Pascal cards out; it will be interesting to see how both companies go with the shrink and HBM2.


----------



## DividebyZERO

Quote:


> Originally Posted by *jamaican voodoo*
> 
> and thats all their is too is nvidia cards are better with DX11 draw call, once DX12 is live its game over for that 980 ti i'm certain of it.


Sorry, I'm not buying anything about DX12 until it's out and games are benched and tested. Too many projections get tossed around all the time on here. I am not going to buy something based on a projection and be let down. I will wait for cold hard numbers before I believe it.


----------



## jamaican voodoo

Quote:


> Originally Posted by *DividebyZERO*
> 
> Sorry i am not buying anything about DX12 until its out and games are bench and tested. To many projections being tossed around all the time on here. I am not going to buy something based on a projection and be let down. I will wait for cold hard numbers before i believe it.


Well, that's the difference between me and you; it just makes sense. Have you ever seen the 3DMark DX12 vs. Mantle vs. DX11 bench? If not, you might want to take a look. Draw calls will make a big difference for AMD cards. They never let me down when I had 5870s, 7970s, and 290s. A little common sense and observation goes a long way.


----------



## DividebyZERO

Quote:


> Originally Posted by *jamaican voodoo*
> 
> well thats the difference between me and you, it just make sense have you ever seen the 3dmark 12 DX12 vs. Mantle vs DX11 Bench. if not you might want to take look. draw call will make a big difference for amd cards. they never let down when i had 5870's, 7970's and 290's. a little common sense and observation goes a along way.


All I am saying is I'll believe it when I see it in action, in a DX12 game that has a benchmark in it. Preferably not a GameWorks masterpiece or a Gaming Devolved one.


----------



## infranoia

My 290x is promised to another in mid-August. By then the sweet scent of the X2 will be in the air.

Hell, I can run on the 4770's IGP until it shows up this winter. I predict an absolute beast, if the 295X2 is anything to go by. The last gasp of 28nm will be pretty interesting, I think, what with DX12 around the corner, some refreshes still to see, the Nano and X2, and whatever NV has up their sleeves.

I don't get the knee-jerk doom and gloom. AMD always has miserable launch days. They shake out after a couple months-- which will correspond with a bunch of new Fiji SKUs. It's not yet time to bawl yer eyes out.


----------



## Olivon

Quote:


> With better drivers coming, Fiji will be more competitive!
> 
> 
> 
> 
> 
> 
> 
> 
> Fury will shine with DX12 and W10!
> 
> 
> 
> 
> 
> 
> 
> 
> With voltage control, Fury will be an overclocker's dream!
> 
> 
> 
> 
> 
> 
> 
> 
> With renewed pump, there will be no more annoying noise !
> 
> 
> 
> 
> 
> 
> 
> 
> When AMD will lowered 100$ the Fury, the card will be well priced !


I love how some AMD fans constantly believe in the future...
The truth is harsher: the complete Maxwell lineup totally stomps AMD's offering.
nVidia easily won the 28nm race and I'm kind of worried for the next 16nm one.
While Fiji is a nice piece of engineering and a great achievement for them, they're desperately late on multiple fronts, and regaining market share won't be easy without killer products.


----------



## blue1512

Quote:


> Originally Posted by *The Stilt*
> 
> Voltage can be adjusted right now if you know what you are doing.
> 
> It's no different to Hawaii cards.


Please enlighten me, my savior. I love the card but I need your BIOS before pulling the trigger.


----------



## iLeakStuff

It just seems silly to me to gamble on buying a 10-15% slower Fury X at 1440p because DX12 or driver updates might benefit AMD more than Nvidia.


----------



## Casey Ryback

Quote:


> Originally Posted by *Olivon*
> 
> Maxwell complete lineup totally stomps AMD offering.


Depends on what you consider stomping.

AMD still has good options at all price points.


----------



## infranoia

Quote:


> Originally Posted by *Olivon*
> 
> I love how some AMD fans constantly believe in the future...
> The truth is harsher: the complete Maxwell lineup totally stomps AMD's offering.
> nVidia easily won the 28nm race and I'm kind of worried for the next 16nm one.
> While Fiji is a nice piece of engineering and a great achievement for them, they're desperately late on multiple fronts, and regaining market share won't be easy without killer products.


If I had to rip my card out today, I'd absolutely get a 980Ti. But I don't have to rip it out this second. It's just not necessary to make an AMD launch-day purchase from either camp with the games that are out there today. I can afford to wait just a bit until the rest of the competition shows up. Then I'll make the call.

What's so hard to understand about that? Does a launch day demand that everyone refresh their damned cards?


----------



## blue1512

It seems that AMD cards have a small boost in DX11 games with Win10.
Quote:


> Remember, AMD hasn't enabled DX11 MT drivers in Windows 8.1.
> 
> I'd rather see benchmarks done on Windows 10, i.e. Project Cars has a frame-rate uplift on the R9 280 under Windows 10. Windows 10 forces DX11 MT.


The big thing is still DX12. Witcher 3, Project Cars, and Arkham Knight will soon have DX12 patches. Ironically, they are all Gimpworks titles.


----------



## ladcrooks

Quote:


> Originally Posted by *Serious_Don*
> 
> I can't play the waiting game anymore. I went for cheap graphics in my two home rigs (280X and 7950) over a year ago to hold me over until I could find something that would be the ultimate upgrade. I usually only like to upgrade once every 3-5 years; before those cards I was using CF 5770s and 5850s. My 4790K is about to celebrate its first birthday and it's still driving a budget card and 1080p monitors from 2010.
> 
> I have to say though, it's not just the GPU market that's driving me insane; I have been going nuts over which monitor to buy all year. I was about to pull the trigger on the MG279Q, but something made me cancel the order. There is too much of a mess in the market right now. I may just have to close my eyes, pick the cards + monitors, deal with it, and hope that in 2020 things are the way they were in 2010. Back then you just picked the best GPU you could afford from a simple lineup, flagship cards were $399, monitors were 1080p or not 1080p, and nobody knew the difference between *TN and IPS* yet.


It's only when I used a TV (IPS) as a monitor that I noticed how much better it was. But for games, would I worry about TN? No! Games are not photographic scenes, nor are the characters. A 32" TV is cheap as chips now! But do your homework; some are better than others for text display.

*I see AnandTech did not get a Fury X to test, then.* Unusual, I think?


----------



## DividebyZERO

Best fury x deal ever right here!

http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.2392329

Meanwhile, the GPU is out of stock?


----------



## toncij

I've noticed we talk a lot about "DX12 will bring this, DX12 will bring that", so I've tried to write a simplified explanation of what we mean when we talk about those things. As a developer I'm not sure it will be clear enough for non-developers, but I did try to simplify and avoid unimportant details and the actual complexity in a piece I wrote about modern APIs and draw calls: https://medium.com/@toncijukic/draw-calls-in-a-nutshell-597330a85381


----------



## SpeedyVT

Quote:


> Originally Posted by *toncij*
> 
> I've noticed we talk a lot about "DX12 will bring this, DX12 will bring that", so I've tried to write a simplified explanation of what we mean when we talk about those things. As a developer I'm not sure it will be clear enough for non-developers, but I did try to simplify and avoid unimportant details and the actual complexity in a piece I wrote about modern APIs and draw calls: https://medium.com/@toncijukic/draw-calls-in-a-nutshell-597330a85381


The real benefit comes from asynchronous shaders and the fact that AMD isn't having the driver do half of DX12's work. The raw performance of DX12 from driver-level access is elevated beyond what we could do with Mantle; Mantle had the API, just not that level of access from Microsoft. My body is ready for these draw-call improvements!

You're probably confused by what I mean by driver access. Imagine an operating system like an onion: each layer exposes a level of access to software and hardware, and anything on an inner layer takes time to reach the outside. For security and stability, OSes have always placed drivers, including graphics, deep within the shell of the OS; I think the only thing that sits exposed is the CPU. Back when AGP was used, graphics sat even closer to the CPU, but since PCIe provided ample bandwidth, OSes treated a PCI device as something further in, with assets only accessible at the driver level. Windows 8.1 improved this slightly. Windows 10 puts graphics right on the outside, exposed like AGP was.
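The draw-call argument running through this thread can be illustrated with a toy model. All of the numbers below are invented for illustration; they are not measurements of either API. The point is only that a fixed CPU cost per draw call dominates frame time when thousands of small draws are submitted per frame:

```python
# Toy model of per-frame draw-call CPU overhead. The per-call costs below are
# hypothetical; the point is that a fixed cost per call dominates when
# thousands of small draws are submitted each frame.

def frame_cpu_time_ms(num_draws, cost_per_call_us):
    """CPU time (ms) spent just issuing draw calls for one frame."""
    return num_draws * cost_per_call_us / 1000.0

# Hypothetical costs: a DX11-style immediate context vs. a DX12-style
# pre-recorded command list with much cheaper submission.
DX11_COST_US = 40.0
DX12_COST_US = 4.0

for draws in (1_000, 10_000):
    t11 = frame_cpu_time_ms(draws, DX11_COST_US)
    t12 = frame_cpu_time_ms(draws, DX12_COST_US)
    print(f"{draws:>6} draws: {t11:6.1f} ms vs {t12:5.1f} ms of CPU submit time")
```

Under these made-up costs, at 10,000 draws the DX11-style path alone would blow a 16.7 ms (60 fps) frame budget, which is why lower per-call overhead matters most for draw-call-heavy scenes.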


----------



## Am3oo

Discussion still going strong?
Time for the normal people to have their go?

I don't have a 4k monitor. Heck! I don't have a 1440p monitor. I had upgraded to 1080p last year and I plan to stay here for a while.
The upgrade bug bites me often, but I don't really play anything (I haven't played anything for the last 6 months, although I often buy games on impulse when there are discounts).
Even so, I have waited enthusiastically for the Fury X (was disappointed by the rest of the rebrands, except for the Tonga R9 380, which makes me feel sort of happy that I kinda have a current card).

Since the Fury X is not competitive below 4K (maybe 1440p), that's a dead end for me, upgrade bug or not, so I will maybe move on to an R9 290X or 390 (same price here) come Black Friday. What do you guys think for 1080p? Maybe a GTX 970? Initially I wanted CrossFire 285s, but with how much VRAM new games need nowadays, that seems a bad idea.


----------



## toncij

At the risk of sounding arrogant: I am confused? You've got to be kidding, or really too hot-headed to perceive anything but your own fan war for AMD.

Anyway, as I've mentioned, multiple enhancements do help, but in the end I doubt you'll see much better performance in the tested titles even with a DX12 driver. The fact is, there is little AMD can do to improve current games.

Async shaders are a great implementation on AMD's side, but the problem is you need to implement them in the game. There is a difference in queue implementation and capabilities, and I'm sure we'll see it in DX12 games in AMD's favor. Unfortunately, I doubt we'll see many such games during 2015, or before the next generation of cards arrives: Pascal and whatever Fiji's successor is named.

But I applaud your strong will and your effort to take every possible opportunity to boost AMD's reputation and possible advantage. However, my article has nothing to do with AMD or NVIDIA; it is an explanatory piece on a single aspect we tend to read about.


----------



## alawadhi3000

Quote:


> Originally Posted by *iLeakStuff*
> 
> It just seems silly to me to gamble on buying a 10-15% slower Fury X at 1440p because DX12 or driver updates might benefit AMD more than Nvidia.


According to that table, an OCed GTX 980 Ti is 22% faster than an OCed R9 Fury X.

I don't think DX12 and future drivers will help that much.


----------



## toncij

Quote:


> Originally Posted by *alawadhi3000*
> 
> According to that table, an OCed GTX 980 Ti is 22% faster than an OCed R9 Fury X.
> 
> I don't think DX12 and future drivers will help that much.


Well, overclocking is a different beast altogether. We also don't have proper overclocking tools with voltage modification for the Fury X (as far as I know), so the potential may still be there. Stock, clock for clock, the Fury X isn't really worse or better than the 980 Ti in general. Not by enough to cause any concern.


----------



## Thoth420

I need case advice, guys. This whole "rad has to be above the card" mess ruined my HAF XB plans, as the rear exhaust is level with the GPU since the mobo sits flat. It needs to be ATX.


----------



## alawadhi3000

Quote:


> Originally Posted by *toncij*
> 
> Well, overclocking is a different beast altogether. *We also don't have proper overclocking tools with voltage modification for the Fury X (as far as I know), so the potential may still be there.* Stock, clock for clock, the Fury X isn't really worse or better than the 980 Ti in general. Not by enough to cause any concern.


Yes, that's correct, but I doubt Fiji will scale better than Hawaii; I expect an average OCer (~10% max).


----------



## toncij

Quote:


> Originally Posted by *alawadhi3000*
> 
> Yes, that's correct, but I doubt Fiji will scale better than Hawaii; I expect an average OCer (~10% max).


I'm also afraid AMD can't offer much overclocking potential. Even on air, Titans (I presume 980 Tis are just slightly worse) clock insanely high, 30 to 50 percent over base, or even more on water. AMD has already been measured at 70°C at 1113 MHz, which is not really great for water cooling and such a small clock bump.

But at the same time, it is a small form factor card (though you really need serious space for the radiator that comes with it), and I can't wait to see what the dual GPU will bring. I'm no fanboy of either brand, so I'd like to see AMD put up a good fight.


----------



## blue1512

Quote:


> Originally Posted by *toncij*
> 
> I'm also afraid AMD can't offer much overclocking potential. Even on air, Titans (I presume 980 Tis are just slightly worse) clock insanely high, 30 to 50 percent over base, or even more on water. AMD has already been measured at 70°C at 1113 MHz, which is not really great for water cooling and such a small clock bump.
> 
> But at the same time, it is a small form factor card (though you really need serious space for the radiator that comes with it), and I can't wait to see what the dual GPU will bring. I'm no fanboy of either brand, so I'd like to see AMD put up a good fight.


Please link me to that 70°C review? As far as I know the card has never gone beyond 60°C.


----------



## iLeakStuff

Ouch. I feel this is what other forums think of the situation as well.
I feel sorry for AMD. This card is not turning the ship around for them. I think the 980 Ti ruined the show.

This is from TechPowerUp's site.


----------



## sugarhell

Quote:


> Originally Posted by *The Stilt*
> 
> Voltage can be adjusted right now if you know what you are doing.
> 
> It's no different to Hawaii cards.


It's the same IR voltage controller? Then it's easy to unlock the voltage.


----------



## toncij

Quote:


> Originally Posted by *blue1512*
> 
> Please link me to that 70°C review? As far as I know the card has never gone beyond 60°C.


Not sure. An engineering sample I saw months ago ran at 71 degrees at 1113 MHz. I'm sure that has improved a lot since, so I may not be accurate about current GPUs in the wild. 60°C sounds very nice, so I hope we can get 1200-1250 MHz from Fiji and run it at 80 degrees.


----------



## MadRabbit

Some people, I swear.

At 1440p and above it's on par with the Ti, it comes with an AIO, it's smaller, and people complain that it's the same price as the Ti. Alrrrriigty then.

Remember, AMD needs to earn back the money they spent on HBM R&D with Hynix; otherwise your precious Nvidia couldn't even use it for Pascal. But people keep forgetting that and act like Hynix was the sole developer of HBM.


----------



## Pro3ootector

The Fury X is good; AMD showed what they are capable of. Performance-wise the Ti and Fury go neck and neck, drivers will only improve this, so the battle is only beginning. I also believe OC will be tweaked in the future. It's nice to see NVIDIA and AMD both release such great cards.


----------



## SpeedyVT

Quote:


> Originally Posted by *toncij*
> 
> Not sure. An engineering sample I saw months ago ran at 71 degrees at 1113 MHz. I'm sure that has improved a lot since, so I may not be accurate about current GPUs in the wild. 60°C sounds very nice, so I hope we can get 1200-1250 MHz from Fiji and run it at 80 degrees.


You could probably hit 70°C at those clock rates with a liquid cooler. I'm thinking with the voltage unlocked we could hit 1300-1350 MHz tops, at roughly 78°C.


----------



## Smanci

Ramp the fan up to 3k RPM and it's still at 60°C.


----------



## toncij

Quote:


> Originally Posted by *SpeedyVT*
> 
> You could probably hit 70°C at those clock rates with a liquid cooler. I'm thinking with the voltage unlocked we could hit 1300-1350 MHz tops, at roughly 78°C.


That wouldn't be much worse than what a 980 Ti can do.


----------



## blue1512

Quote:


> Originally Posted by *toncij*
> 
> Not sure. An engineering sample I saw months ago ran at 71 degrees at 1113 MHz. I'm sure that has improved a lot since, so I may not be accurate about current GPUs in the wild. 60°C sounds very nice, so I hope we can get 1200-1250 MHz from Fiji and run it at 80 degrees.


Up until now, reviewers have gotten around 1150 MHz at 60°C. The card might reach 1300 MHz before hitting 80°C, assuming the GPU can clock that high and the card is allowed to go beyond 60°C.
Quote:


> Originally Posted by *toncij*
> 
> That wouldn't be much worse than what a 980 Ti can do.


No, it is not much worse. The 980 Ti boosts at 1200 MHz, and the best OC boost even on a custom design is around 1520 MHz, which is about a 27% OC. If the Fury X can reach 1300 MHz, that would be about a 24% OC; not bad IMHO.
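For reference, the headroom percentages work out as follows; the 1200 and 1520 MHz figures are from this post, and the Fury X's 1050 MHz stock clock is assumed:

```python
# Overclocking headroom as a percentage gain over the stock clock.

def oc_percent(stock_mhz, oc_mhz):
    """Percentage clock increase from stock_mhz to oc_mhz."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(f"980 Ti: {oc_percent(1200, 1520):.1f}% (1200 -> 1520 MHz boost)")
print(f"Fury X: {oc_percent(1050, 1300):.1f}% (1050 -> 1300 MHz core)")
```

That comes out to roughly 26.7% for the 980 Ti and 23.8% for a hypothetical 1300 MHz Fury X.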


----------



## SpeedyVT

Quote:


> Originally Posted by *toncij*
> 
> That wouldn't be much worse than what a 980 Ti can do.


There is a point, though, where you can't exceed a certain clock without artifacting anyway.


----------



## toncij

Quote:


> Originally Posted by *blue1512*
> 
> Up until now, reviewers have gotten around 1150 MHz at 60°C. The card might reach 1300 MHz before hitting 80°C, assuming the GPU can clock that high and the card is allowed to go beyond 60°C.
> No, it is not much worse. The 980 Ti boosts at 1200 MHz, and the best OC boost even on a custom design is around 1520 MHz, which is about a 27% OC. If the Fury X can reach 1300 MHz, that would be about a 24% OC; not bad IMHO.


I presume we don't count the Titan X and its insane base clocks of over 1500 MHz on water. It is a $1000+ card, so it doesn't really fit in this bracket.

I would love to see how the Fury X does on LN2. If the interposer and HBM are not the limiting factor, that could be fun.


----------



## Silent Scone

I spoke with Adrian Thompson from Sapphire this morning; he said TriXX over-voltage support will be available in the coming weeks. Out of curiosity, I also asked him about the memory situation and whether it would be permanently locked down; I don't consider it a problem.

He said it was too early to tell due to the complexity of the chip.

Personally, I still think a permanent lock is likely, both because the memory needs to remain static for error correction and because it's too early to tell whether degradation will occur.


----------



## Orivaa

Quote:


> Originally Posted by *Bartouille*
> 
> This will likely OC around the same as Hawaii with voltage control, so probably 1.2 GHz at most with a reasonable +100 mV. This card is already running cool, so there isn't much more to squeeze from lower temps (not talking about LN2 stuff, obviously, lol). Anyway, I think if voltage control were such a game changer, AMD would have had it working on release day, but they didn't.


Older post, I know, but AMD never had voltage control; it was always in third-party software such as MSI Afterburner. They are the ones who need to adapt to the new controller on Fiji, so it's not AMD's responsibility.


----------



## Sashimi

I was disappointed at first but now starting to look at it from another angle.

There is nothing AMD can do to improve current games except possibly better drivers, and those improvements are probably minor. However, the Fury X runs the current generation of games just fine despite not being the top dog, so there's really nothing to complain about. Future games are likely to be more demanding, and if DX12 does boost the Fury X's performance beyond the 980 Ti, then isn't the Fury X the better overall package, as it brings performance where it's truly needed?

Even ignoring uncertainties such as possible Win 10 and DX12 improvements, reviews have shown that the Fury X performs better and closes the gap with the 980 Ti as workload increases. So wouldn't it be logical to think that as games become more and more graphically intensive, the Fury X will close in further, equal, or eventually overtake the 980 Ti in future titles?

Having said that, I'm uncertain about the sustainability of 4GB VRAM. Perhaps the Fury X may run into VRAM shortages before games become intensive enough for it to surpass the 980 Ti. *shrugs*

Edit: Grammatical mistakes


----------



## Pro3ootector

Quote:


> Originally Posted by *Silent Scone*
> 
> I spoke with Adrian Thompson from Sapphire this morning; he said TriXX over-voltage support will be available in the coming weeks. Out of curiosity, I also asked him about the memory situation and whether it would be permanently locked down; I don't consider it a problem.
> 
> He said it was too early to tell due to the complexity of the chip.
> 
> Personally, I still think a permanent lock is likely, both because the memory needs to remain static for error correction and because it's too early to tell whether degradation will occur.


Memory OC would be nice. Still, I guess its bandwidth is beyond everything already.


----------



## Kylar182

Sooo... Think we could get a 980 Ti (Kingpin) w/a 12GB Vram/PCB of a Titan X and HBM from AMD? Anyone? Anyone? Bueller? Bueller...


----------



## toncij

Quote:


> Originally Posted by *Sashimi*
> 
> I was disappointed at first but now starting to look at it from another angle.
> 
> There is nothing AMD can do to improve current games except possibly better drivers, and those improvements are probably minor. However, the Fury X runs the current generation of games just fine despite not being the top dog, so there's really nothing to complain about. Future games are likely to be more demanding, and if DX12 does boost the Fury X's performance beyond the 980 Ti, then isn't the Fury X the better overall package, as it brings performance where it's truly needed?
> 
> Even ignoring uncertainties such as possible Win 10 and DX12 improvements, reviews have shown that the Fury X performs better and closes the gap with the 980 Ti as workload increases. So wouldn't it be logical to think that as games become more and more graphically intensive, the Fury X will close in further, equal, or eventually overtake the 980 Ti in future titles?
> 
> Having said that, I'm uncertain about the sustainability of 4GB VRAM. Perhaps the Fury X may run into VRAM shortages before games become intensive enough for it to surpass the 980 Ti. *shrugs*
> 
> Edit: Grammatical mistakes


Yes.

In theory, more workload in the form of more and more complex shaders should show the true advantage of AMD's chip architecture (asynchronous shaders) and HBM memory. High resolutions, an insane number of different rendering and compute tasks, etc. should put the Fury X ahead of the 980 Ti, which is really better at current games. The best indicators among modern games should be Shadow of Mordor and Witcher 3, but we won't see the full picture before true DX12 games (not just the API) come to life (Ashes of the Singularity, for example).

However, I also fear for the 4GB of VRAM in high-res situations. 5K and beyond could be crippled exactly there.


----------



## Sashimi

Quote:


> Originally Posted by *toncij*
> 
> Yes.
> 
> In theory, more workload in the form of more and more complex shaders should show the true advantage of AMD's chip architecture (asynchronous shaders) and HBM memory. High resolutions, an insane number of different rendering and compute tasks, etc. should put the Fury X ahead of the 980 Ti, which is really better at current games. The best indicators among modern games should be Shadow of Mordor and Witcher 3, but we won't see the full picture before true DX12 games (not just the API) come to life (Ashes of the Singularity, for example).
> 
> However, I also fear for the 4GB of VRAM in high-res situations. 5K and beyond could be crippled exactly there.


If only they released double-VRAM versions of cards like back in the Fermi days, lol. Then the Fury X would be a very competitive candidate against nVidia's offering. If nothing else, we know for a fact that it has the bandwidth to sustain 8GB, haha.


----------



## Themisseble

Quote:


> Originally Posted by *Sashimi*
> 
> If only they released double-VRAM versions of cards like back in the Fermi days, lol. Then the Fury X would be a very competitive candidate against nVidia's offering. If nothing else, we know for a fact that it has the bandwidth to sustain 8GB, haha.


The Fury X is already competitive.


----------



## tajoh111

Quote:


> Originally Posted by *MadRabbit*
> 
> Some people, I swear.
> 
> At 1440p and above it's on par with the Ti, it comes with an AIO, it's smaller, and people complain that it's the same price as the Ti. Alrrrriigty then.
> 
> Remember, AMD needs to earn back the money they spent on HBM R&D with Hynix; otherwise your precious Nvidia couldn't even use it for Pascal. But people keep forgetting that and act like Hynix was the sole developer of HBM.


At best, in reviews, Fury is losing at 1440p by around 5 percent, but more reviews show a double-digit deficit. At 4K, the best it is really doing is matching the GTX 980 Ti or beating it by 2%, which is more or less a tie.

An AIO only makes sense if that extra cooling performance gives extra headroom. If it doesn't, then it seems like a waste of money for AMD. The smaller card doesn't mean much when you consider the space the radiator and fan take up. This is what makes the Nano a more attractive card for mini builds.

I would take 25% overclocking headroom on air over 10% overclocking headroom and an AIO any day, particularly when the GTX 980 Ti is faster to begin with and can reach the 22% performance difference shown in that Eurogamer chart. And because of Fiji's design, the AIO doesn't even cool one of the components that needs cooling the most: the VRMs. AMD should have made the PCB a tad bigger and put the VRMs at the front, where the water cooler could cool the front plate and the VRMs with it.

The AIO plus Fury's PCB design seems like an accident waiting to happen once these cards get voltage control. Most hybrid solutions have heatsinks on the VRMs with a fan blowing over them; a pure water solution usually has a full block that covers everything, including the VRMs, which is generally the ideal approach. With the Fury X, you have a pump cooling the GPU and memory, but the VRMs are at the back with nothing but a backplate and no fan blowing air on them. Considering voltage modding increases VRM temperatures the most, it just seems like a bad design.

AMD gambled and spent money on HBM; Nvidia likely spent more money on R&D making a new architecture for 28nm. AMD probably should have saved HBM for 16nm. It's their own fault for not realizing that HBM wasn't particularly needed when their old memory controller could get close to 400 GB/s of bandwidth and they were not going to be GPU-bottlenecked on 28nm. If AMD put that much work into HBM, they will benefit next gen through royalties when Nvidia uses it, so it isn't exactly like Nvidia is getting a free lunch here.

People are going to buy the GTX 980 Ti because it's a better card at a similar price for most people. Buying AMD just because it's HBM and AMD helped develop it is generally an excuse for pity sales. Everyone wants AMD to get sales, but it doesn't take a genius to realize the upper-echelon $500+ market was never going to make AMD serious money. The volume is too low, particularly when you're the value brand in the market.

Fanboys telling people to buy the Fury X isn't going to save AMD; it's Zen or the next gen of 16nm cards that will do that, along with AMD digging themselves out of this rut. Hopefully AMD can take their HBM knowledge to 16nm and make something special.


----------



## Sashimi

Quote:


> Originally Posted by *Themisseble*
> 
> The Fury X is already competitive.


"Competitive - as good as or better than others of a comparable nature"

Present competition: only equaling the 980 Ti at 4K while losing at most other resolutions is not being competitive.

Future competition: running into VRAM problems before games become demanding enough for it to realise its full potential is questionably competitive.

Taking the average between the two:
(non-competitive + speculation that it may become competitive, with a big question mark) / 2 = non-competitive.

Not that it's not a good card; it's just not competitive based on current information and a logical forecast from that information. Of course that may change as new information is presented.


----------



## Redwoodz

Quote:


> Originally Posted by *tajoh111*
> 
> At best, in reviews, Fury is losing at 1440p by around 5 percent, but more reviews show a double-digit deficit. At 4K, the best it is really doing is matching the GTX 980 Ti or beating it by 2%, which is more or less a tie.
> 
> An AIO only makes sense if that extra cooling performance gives extra headroom. If it doesn't, then it seems like a waste of money for AMD. The smaller card doesn't mean much when you consider the space the radiator and fan take up. This is what makes the Nano a more attractive card for mini builds.
> 
> I would take 25% overclocking headroom on air over 10% overclocking headroom and an AIO any day, particularly when the GTX 980 Ti is faster to begin with and can reach the 22% performance difference shown in that Eurogamer chart. And because of Fiji's design, the AIO doesn't even cool one of the components that needs cooling the most: the VRMs. AMD should have made the PCB a tad bigger and put the VRMs at the front, where the water cooler could cool the front plate and the VRMs with it.
> 
> The AIO plus Fury's PCB design seems like an accident waiting to happen once these cards get voltage control. Most hybrid solutions have heatsinks on the VRMs with a fan blowing over them; a pure water solution usually has a full block that covers everything, including the VRMs, which is generally the ideal approach. With the Fury X, you have a pump cooling the GPU and memory, but the VRMs are at the back with nothing but a backplate and no fan blowing air on them. Considering voltage modding increases VRM temperatures the most, it just seems like a bad design.
> 
> AMD gambled and spent money on HBM; Nvidia likely spent more money on R&D making a new architecture for 28nm. AMD probably should have saved HBM for 16nm. It's their own fault for not realizing that HBM wasn't particularly needed when their old memory controller could get close to 400 GB/s of bandwidth and they were not going to be GPU-bottlenecked on 28nm. If AMD put that much work into HBM, they will benefit next gen through royalties when Nvidia uses it, so it isn't exactly like Nvidia is getting a free lunch here.
> 
> People are going to buy the GTX 980 Ti because it's a better card at a similar price for most people. Buying AMD just because it's HBM and AMD helped develop it is generally an excuse for pity sales. Everyone wants AMD to get sales, but it doesn't take a genius to realize the upper-echelon $500+ market was never going to make AMD serious money. The volume is too low, particularly when you're the value brand in the market.
> 
> Fanboys telling people to buy the Fury X isn't going to save AMD; it's Zen or the next gen of 16nm cards that will do that, along with AMD digging themselves out of this rut. Hopefully AMD can take their HBM knowledge to 16nm and make something special.


Gaming GPUs are not the whole story. AMD has made large strides versus Nvidia in the compute arena, where the real money is.

http://www.anandtech.com/bench/product/1513?vs=1496

As shown there, the Fury X tops the 980 Ti in Sony Vegas and LuxMark while making strong gains in Folding@home, even topping Nvidia in explicit double precision. Nvidia's gains in gaming have largely come from sacrificing GPGPU output, while AMD has remained strong in both. Also, while power consumption is very close, the Fury X is quieter under load than the 980 Ti. It looks to be a strong contender in the workstation arena.


----------



## Kane2207

It's competitive at stock, but the massive headroom on Maxwell makes that point kind of redundant.

If the Fury X was $50-$100 cheaper it would be a tempting offer; at the same price as the 980 Ti it really isn't very tempting at all, unfortunately.
Quote:


> Originally Posted by *Redwoodz*
> 
> Gaming GPUs are not the whole story. AMD has made large strides versus Nvidia in the compute arena, where the real money is.
> 
> http://www.anandtech.com/bench/product/1513?vs=1496
> 
> As shown there, the Fury X tops the 980 Ti in Sony Vegas and LuxMark while making strong gains in Folding@home, even topping Nvidia in explicit double precision. Nvidia's gains in gaming have largely come from sacrificing GPGPU output, while AMD has remained strong in both. Also, while power consumption is very close, the Fury X is quieter under load than the 980 Ti. It looks to be a strong contender in the workstation arena.


They may have the edge in compute, but actual real-world implementations at large scale are almost always Nvidia and CUDA, which makes AMD's compute advantage kind of redundant. Are there any large-scale deployments using FirePro in the server space?


----------



## The Stilt

Fury X has throttling temperature limit of 75°C so it might limit maximum OC when voltage is adjusted.
OTP limit for the GPU is 79°C.

Throttling is also tripped when the water temperature sensor exceeds 60 or 67°C depending on cooler vendor.

VRM temperature limit is 127°C and the shutdown occurs at 134°C.
Both of these are extremely high; for such a load the optimum would be <80°C once FET derating is taken into account.
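The limits above can be condensed into a small sketch. The thresholds are taken from this post (using 60°C as the water limit, one of the two vendor values mentioned); the decision logic itself is an illustrative simplification, not AMD's actual firmware behavior:

```python
# Simplified sketch of the Fury X thermal limits described above.
# Thresholds are from the post; the ordering/logic is illustrative only.

GPU_THROTTLE_C = 75    # GPU throttling temperature limit
GPU_OTP_C = 79         # GPU over-temperature protection
VRM_THROTTLE_C = 127   # VRM temperature limit
VRM_SHUTDOWN_C = 134   # VRM hard shutdown

def thermal_action(gpu_c, vrm_c, water_c, water_limit_c=60):
    """Return what a simplified thermal manager would do for these readings."""
    if vrm_c >= VRM_SHUTDOWN_C or gpu_c >= GPU_OTP_C:
        return "shutdown"
    if gpu_c >= GPU_THROTTLE_C or vrm_c >= VRM_THROTTLE_C or water_c >= water_limit_c:
        return "throttle"
    return "ok"

print(thermal_action(gpu_c=65, vrm_c=90, water_c=45))   # normal operation
print(thermal_action(gpu_c=76, vrm_c=90, water_c=45))   # GPU past its 75C limit
print(thermal_action(gpu_c=70, vrm_c=134, water_c=45))  # VRM hard limit
```

The point of the sketch is that the water sensor and the VRMs can each trip throttling on their own, even when the GPU die itself is well below 75°C.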


----------



## harney

Quote:


> Originally Posted by *Kane2207*
> 
> It's competitive at stock, but the massive headroom on Maxwell makes that point kind of redundant.
> 
> If the Fury X was $50-$100 cheaper it would be a tempting offer; at the same price as the 980 Ti it really isn't very tempting at all, unfortunately.
> They may have the edge in compute, but actual real-world applications at large scale are almost always Nvidia and CUDA, which makes AMD's compute advantage kind of redundant. Are there any large-scale deployments using FirePro in the server space?


I am sure they're aware of this, and it would not surprise me if that happens. I am also hearing that Nvidia is considering a price drop, so there may be good times ahead for all of us.


----------



## BoredErica

I don't want 'competitive-ish with Nvidia', I wanted a Titan-killer, however unrealistic my expectations were.


----------



## Kane2207

Quote:


> Originally Posted by *Redwoodz*
> 
> Gaming gpu's is not the whole story. AMD has proven to make large strides versus Nvidia in the compute arena,where the real money is at.
> 
> http://www.anandtech.com/bench/product/1513?vs=1496
> 
> As shown here,FuryX tops 980Ti in Sony Vegas and Luxmark,while making strong gains in [email protected],even topping Nvidia in explicit double precision. Nvidia's gains in gaming have largely come from sacrificing GPGU output,while AMD has remained strong in both. Also while power consumption is very close,the FuryX is quieter under load than 980Ti.It looks to be a strong contender in the workstation arena.


Quote:


> Originally Posted by *Darkwizzie*
> 
> I don't want 'competitive-ish with Nvidia', I wanted a Titan-killer, however unrealistic my expectations were.


I think this is the problem and why everyone is a little underwhelmed; this is the sort of thing that was bandied about on forums everywhere, 'leaked' by AMD in the build-up to the release:


----------



## iLeakStuff

Fury X is a 980 Ti killer at $500. It's a good purchase at $550.
At $650 it's a no-go.


----------



## harney

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X is a 980 Ti killer at $500. It's a good purchase at $550.
> At $650 it's a no-go.


Agree


----------



## Orivaa

The Fury X is a fine purchase at this price if you're after cool and quiet operation. It'll also likely improve with more mature drivers. That's not to say anyone should play the waiting game instead of getting a card now, but it's a nice bonus for those whose needs the Fury X suits.


----------



## rdr09

Quote:


> Originally Posted by *harney*
> 
> Agree


If it is even available. I can't find any anywhere.


----------



## harney

Quote:


> Originally Posted by *rdr09*
> 
> if it is even available. can't even find any.


Probably the reason it's priced so high. Here in gouge-land UK, OCUK gouged up to £649 when the pre-order for the same card was at the right price of £509. Not good at all; they've lost my respect. Yes, they have to make a profit, but they'd be doing that even at £509 without taking the p!ss at £649.

However, they did counter with 980 Tis @ £509, so it's not too bad, but I still think these cards are priced way too high.


----------



## Redwoodz

Quote:


> Originally Posted by *rdr09*
> 
> if it is even available. can't even find any.


Can't find anything but reference 980 Tis either, and those have a 1000MHz core, which I'm sure the Fury X surpasses at stock. Something everyone needs to remember when viewing some of these reviews: they compare heavily modified, overclocked aftermarket designs against the reference Fury X.


----------



## Kane2207

Quote:


> Originally Posted by *harney*
> 
> Probably the reason it's priced so high. Here in gouge-land UK, OCUK gouged up to £649 when the pre-order for the same card was at the right price of £509. Not good at all; they've lost my respect. Yes, they have to make a profit, but they'd be doing that even at £509 without taking the p!ss at £649.


That's pretty bad, didn't they drop the price of the 980ti to £509 around the same time?

£140 premium for the Fury X, wow.


----------



## keikei

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X is a 980 Ti killer at $500. It's a good purchase at $550.
> At $650 it's a no-go.


Let's see what happens with the Fury X price within a few months.


----------



## rdr09

Quote:


> Originally Posted by *keikei*
> 
> Let's see what happens with the Fury X price within a few months.


They all go down. My 290 was $500 and can now be bought for $250. How much was the 780 Ti at launch? $600? How much is it now?

Titans are the only ones that hold their value pretty well.


----------



## RagingCain

Quote:


> Originally Posted by *Redwoodz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tajoh111*
> 
> At best, in reviews, the Fury X is losing at 1440p by somewhere around 5 percent, but more reviews show a double-digit deficit. At 4K, the best it is really doing is matching the GTX 980 Ti or beating it by 2%, which is more or less a tie.
> 
> An AIO only makes sense if the extra cooling performance gives extra headroom. If it doesn't, then it seems like a waste of money for AMD. The smaller card doesn't mean much when you consider the space the radiator and fan take up. This is what makes the Nano a more attractive card for mini builds.
> 
> I would take 25% overclocking headroom on air over 10% overclocking headroom and an AIO any day. Particularly when the GTX 980 Ti is faster to begin with and can reach the 22% performance difference shown in that Eurogamer chart. And because of Fiji's design, the AIO doesn't even cool one of the components that needs cooling the most: the VRMs. AMD should have made the PCB a tad bigger and put the VRMs at the front, where the water cooler's cold plate could cool them.
> 
> This AIO + Fury PCB design seems like an accident waiting to happen once these cards get voltage control. E.g. most hybrid solutions have heatsinks on the VRMs with a fan blowing over them; a pure water solution usually has a full-cover block that cools everything, including the VRMs, which is generally the ideal solution. With the Fury X you have a pump cooling the GPU and memory, but the VRMs are at the back with nothing but a backplate and no fan blowing air on them. Considering voltage modding increases VRM temperatures the most, it just seems like a bad design.
> 
> AMD gambled and spent money on HBM; Nvidia likely spent more on R&D making a new architecture for 28nm. AMD probably should have saved HBM for 16nm. It's their own fault for not realizing that HBM wasn't particularly needed when their old memory controller could get close to 400GB/s of bandwidth and the 28nm GPU was never going to be constrained by memory bandwidth. If AMD put that much work into HBM, they will benefit next gen, when Nvidia adopts it, via royalties from sales of those chips. So it isn't exactly like Nvidia is getting a free lunch here.
> 
> People are going to buy the GTX 980 Ti because it's a better card at a similar price for most people. Buying AMD just because of HBM and because AMD helped develop it is generally an excuse for pity sales. Everyone wants AMD to get sales, but it doesn't take a genius to realize the upper-echelon $500+ market was not going to make AMD any serious money. The volume is too low, particularly when you're the value brand in the market.
> 
> Fanboys telling people to buy the Fury X isn't going to save AMD; it's Zen or the next gen of 16nm cards that will do that, and AMD digging themselves out of this rut. Hopefully AMD can take their HBM knowledge to 16nm and make something special.
> 
> 
> 
> Gaming GPUs are not the whole story. AMD has made large strides versus Nvidia in the compute arena, where the real money is.
> 
> http://www.anandtech.com/bench/product/1513?vs=1496
> 
> As shown here, the Fury X tops the 980 Ti in Sony Vegas and Luxmark while making strong gains in Folding@Home, even topping Nvidia in explicit double precision. Nvidia's gains in gaming have largely come from sacrificing GPGPU output, while AMD has remained strong in both. Also, while power consumption is very close, the Fury X is quieter under load than the 980 Ti. It looks to be a strong contender in the workstation arena.
Click to expand...

So you think the Fury X is going to be popular with workstations? By comparing two gaming GPUs together?

And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?

And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?


----------



## Silent Scone

lol Cain and his logical debunking. Why don't you take it elsewhere


----------



## SpeedyVT

This is definitely a driver issue.


----------



## Kaltenbrunner

Quote:


> Originally Posted by *Orivaa*
> 
> Older post, I know, but AMD never had voltage control. It was always in 3rd party software, such as MSI Afterburner. They are the ones who need to adapt to the new controller in the Fiji, so it's not AMD's responsibility.


drop it in a bucket of salty water and a toaster, that should add some voltage


----------



## RagingCain

Quote:


> Originally Posted by *Silent Scone*
> 
> lol Cain and his logical debunking. Why don't you take it else where


I am just getting a little clarification is all









Practicing in case I ever run for office.


----------



## Ganf

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X is a 980 Ti killer at $500. It's a good purchase at $550.
> At $650 it's a no-go.


Still haven't heard a decent explanation as to why on this, when the 980ti Hybrid is $100 more.

Don't say noise either, the reviews on noise are all over the place with some saying it's quieter, some louder, and some reviewers reporting that their fan profile is borked and blaming the card for it.
Quote:


> Originally Posted by *RagingCain*
> 
> So you think the Fury X is going to be popular with workstations? By comparing two gaming GPUs together?
> 
> And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?
> 
> And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?


FirePros have always been derivatives of the consumer side; it's like we beta test the hardware for enterprise. The Nano shows that the AIO is not necessary, and they've already arranged the ports so the card can be set up in single-slot configurations.

Just sayin', they've still got a chance in that arena.


----------



## Redwoodz

Quote:


> Originally Posted by *RagingCain*
> 
> So you think the Fury X is going to be popular with workstations?
> 
> And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?
> 
> And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?


Obviously, air-cooled and passive cards go in farms.







Single/dual-card workstations. It's too early to tell exactly how these may be adapted to the enterprise market, but a Fury Nano and a Fury X2, with maybe eventual FirePro versions, is not a real stretch. Also, prior market domination is not a valid argument against a new product.


----------



## Orivaa

Quote:


> Originally Posted by *Ganf*
> 
> Still haven't heard a decent explanation as to why on this, when the 980ti Hybrid is more.
> 
> Don't say noise either, the reviews on noise are all over the place with some saying it's quieter, some louder, and some reviewers reporting that their fan profile is borked and blaming the card for it.


Pre-launch cards had a cooler issue that was addressed and fixed before launch. This was established long ago, so any copy you buy should be pretty darn silent, unless you somehow get your hands on an early review sample.


----------



## 4everAnoob

The problem for AMD is simply that their architecture (GCN) is pretty much the same as when it was introduced in 2011, and there have been no die shrinks since then. Nvidia, however, made no excuses and built a completely new and impressive architecture on 28nm, while AMD again makes the poor bet that it can keep adding shaders, MHz, and bandwidth to solve its problems. I am still using an HD 6870, a card from a time when AMD was truly competitive, and competitive on efficiency as well.


----------



## RagingCain

Quote:


> Originally Posted by *Redwoodz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> So you think the Fury X is going to be popular with workstations?
> 
> And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?
> 
> And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?
> 
> 
> 
> Obviously, air-cooled and passive cards go in farms.
> 
> 
> 
> 
> 
> 
> 
> Single/dual-card workstations. It's too early to tell exactly how these may be adapted to the enterprise market, but a Fury Nano and a Fury X2, with maybe eventual FirePro versions, is not a real stretch. Also, prior market domination is not a valid argument against a new product.
Click to expand...

I completely agree that air cooled ones would be necessary.

Doesn't that negate your noise and power-usage advantage over the 980 Ti? Higher temperatures mean greater voltages to offset leakage.

Couple that with the fact that the GTX 980 Ti has an extra 2GB of VRAM. HBM may be advantageous; unfortunately, that high-speed VRAM would be bottlenecked by the GPU's rate of computation, not by bandwidth.

Not that I would use a 980 Ti for compute rendering, that's what professional workstation cards are for, but to entertain your idea.

Prior market domination means that any apps based on CUDA are probably going to stay that way; you can agree to that. OpenCL-based applications could be adapted, though. Someone, somewhere, has to look at the Fury X, see that their program can't run on it because it has been using CUDA for years, and then want to convert their application to OpenCL.

What feature on the Fury X would make that computer scientist do that?

Also do you have any source on computational arena being where the money is at, regarding commercial GPUs?


----------



## Orivaa

Quote:


> Originally Posted by *4everAnoob*
> 
> The problem for AMD is simply that their architecture (GCN) is pretty much the same as when it was introduced in 2011, and there have been no die shrinks since then. However Nvidia didn't make any excuses and made a completely new and amazing architecture while AMD again makes a stupid decision and thinks it can keep adding shaders and MHz and bandwidth to solve their problems. I am still using a HD6870, a card from a time AMD was truly competitive. Also efficient


No, it isn't. GCN has changed a lot since it was introduced.


----------



## iLeakStuff

Quote:


> Originally Posted by *RagingCain*
> 
> So you think the Fury X is going to be popular with workstations? By comparing two gaming GPUs together?
> 
> And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?
> 
> And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?


I can see AMD catching up in workstations this time around. GK110 Kepler and the $999 Titan Black were extremely good at computation. GM200 and the Titan X, however, don't have the same computation performance, which means you have to buy a Quadro M6000 that costs much more.
Maybe the workstation market isn't as big as we like to think and that's why they ditched GPGPU with the Titan X, who knows. CUDA is strong within the scientific market, that's for sure, and those users might not have options other than Nvidia, but the Fury X is really good value at $650 for those people.

Who knows how this will turn out.

Quote:


> Originally Posted by *Ganf*
> 
> Still haven't heard a decent explanation as to why on this, when the 980ti Hybrid is $100 more.
> 
> Don't say noise either, the reviews on noise are all over the place with some saying it's quieter, some louder, and some reviewers reporting that their fan profile is borked and blaming the card for it.



Stock 980 Ti at 1440p is 10-15% faster than a stock Fury X, and 1440p is the resolution I want to use.
Overclocking tests show that an OC 980 Ti is much faster than an OC Fury X.
The Hybrid 980 Ti costs $100 more than the Fury X, but you get an overclocked, water-cooled 980 Ti with a warranty, which will be a good deal faster than the Fury X.
I get 6GB of VRAM instead of 4GB with the 980 Ti. Most games work great with 4GB today, but who knows about a future where some lazy developer makes a console port that needs the extra 2GB? And they say AMD needs to make driver entries for each released game to make it work properly with HBM. What if they need 1 or 2 weeks to get proper driver support for a game I want to play at launch? Wait for 2 weeks?

I used to buy AMD when they offered better value than Nvidia, which they usually do. Except this time it has slightly worse performance/$ than the 980 Ti. If the Fury X were priced at $550 it would tip the scale the other way; instabuy from me. If the Fury X were faster than the 980 Ti, as the benchmark slide AMD released showed, I would also have bought it.

I tried to find reasons to get a Fury X, since the card looks so much better than the 980 Ti, but I just can't find many.
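The performance/$ argument above is easy to make concrete. A minimal sketch, using the numbers discussed in this thread (980 Ti roughly 10% faster at 1440p, both cards at $650, with $550 as the hypothetical Fury X price cut); the figures are illustrative, not benchmarks:

```python
# Rough performance-per-dollar comparison using the numbers discussed above.

def perf_per_dollar(relative_perf, price):
    """Relative performance (Fury X = 1.00) divided by price in dollars."""
    return relative_perf / price

fury_x_650 = perf_per_dollar(1.00, 650)   # Fury X at launch price
ti_650 = perf_per_dollar(1.10, 650)       # 980 Ti, taking the low 10% figure
fury_x_550 = perf_per_dollar(1.00, 550)   # Fury X at the discussed $550

print(ti_650 > fury_x_650)     # True: at equal price, the faster card wins on value
print(fury_x_550 > ti_650)     # True: at $550 the value scale tips back to the Fury X
```

This is the whole shape of the argument: at price parity the faster card wins on value, and a roughly $100 cut flips the comparison.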


----------



## rdr09

Quote:


> Originally Posted by *4everAnoob*
> 
> The problem for AMD is simply that their architecture (GCN) is pretty much the same as when it was introduced in 2011, and there have been no die shrinks since then. However Nvidia didn't make any excuses and made a completely new and amazing architecture while AMD again makes a stupid decision and thinks it can keep adding shaders and MHz and bandwidth to solve their problems. I am still using a HD6870, a card from a time AMD was truly competitive. Also efficient


I am shooting for playing the next Battlefield in 4K with just one card. Fury might be it.


----------



## 4everAnoob

Quote:


> Originally Posted by *Orivaa*
> 
> Not. GCN has changed a lot since it was introduced.


Stuff was added, yes. Noteworthy performance improvement from architectural changes? Nope.
GCN 1.0 -> GCN 1.1 brought almost no noteworthy improvement at all. GCN 1.2 has some new stuff, like memory bandwidth compression, which helps a little.
But it is all very small stuff. Increased shaders, core speed, and bandwidth are really all that happened.
Nvidia rebranded cards from the 8800 series to the GT 2xx series and everyone complained. What AMD is doing now is just as bad, in my opinion.


----------



## Ganf

Quote:


> Originally Posted by *iLeakStuff*
> 
> 
> Stock 980 Ti at 1440p is 10-15% faster than a stock Fury X, and 1440p is the resolution I want to use.
> Overclocking tests show that an OC 980 Ti is much faster than an OC Fury X.
> The Hybrid 980 Ti costs $100 more than the Fury X, but you get an overclocked, water-cooled 980 Ti with a warranty, which will be a good deal faster than the Fury X.
> I get 6GB of VRAM instead of 4GB with the 980 Ti. Most games work great with 4GB today, but who knows about a future where some lazy developer makes a console port that needs the extra 2GB? And they say AMD needs to make driver entries for each released game to make it work properly with HBM. What if they need 1 or 2 weeks to get proper driver support for a game I want to play at launch? Wait for 2 weeks?
> 
> I used to buy AMD when they offered better value than Nvidia, which they usually do. Except this time it has slightly worse performance/$ than the 980 Ti. If the Fury X were priced at $550 it would tip the scale the other way; instabuy from me. If the Fury X were faster than the 980 Ti, as the benchmark slide AMD released showed, I would also have bought it.
> 
> I tried to find reasons to get a Fury X, since the card looks so much better than the 980 Ti, but I just can't find many.


Ehh, you picked the reviews that told you what you wanted to hear. I won't go any further than that.


----------



## RagingCain

Quote:


> Originally Posted by *iLeakStuff*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> So you think the Fury X is going to be popular with workstations? By comparing two gaming GPUs together?
> 
> And how much room would you need for each AIO water cooler radiator in a render farm or compute farm?
> 
> And that the compute arena is where the real money is at? Where the market is currently dominated by nVidia and their Quadros or Teslas and they can install 4 - 16 per machine?
> 
> 
> 
> I can see AMD catching up in workstations this time around. GK110 Kepler and the $999 Titan Black were extremely good at computation. GM200 and the Titan X, however, don't have the same computation performance, which means you have to buy a Quadro M6000 that costs much more.
> Maybe the workstation market isn't as big as we like to think and that's why they ditched GPGPU with the Titan X, who knows. CUDA is strong within the scientific market, that's for sure, and those users might not have options other than Nvidia, but the Fury X is really good value at $650 for those people.
> 
> Who knows how this will turn out.
Click to expand...

I have honestly just never seen a professional workstation at a business using consumer GPUs for workstation rendering. Most of these machines are Dell/Lenovo etc.; when ordering workstations, these cards don't even show up as options. I am not saying it doesn't happen.

I would like to see some numbers. I don't see any current evidence that Fury X will succeed in enterprise workstation environments.

To claim a consumer-side GPU will be a "success" (whatever that means for a GPU) because it might do well in professional computing really stretches the confines of consumer-side "success". Not to mention, if an AMD FirePro Fury is ever planned and released, and did well, wouldn't that really be a success of the FirePro Fury and not the Fury X?

The Fury X might make a good home workstation GPU, but I doubt the percentages of purchasers who buy this GPU with this in mind, are that high. Just like the number of Titan owners who are professional 3D artists who casually game.

I would need to see the numbers before I made any informed conclusion, thus I am simply asking for those numbers from our users that are clearly informed, since they have already reached the conclusion.


----------



## blue1512

Where does that "15%-20% at 1440p" come from?


----------



## obababoy

Quote:


> Originally Posted by *Ganf*
> 
> Ehh, you picked the reviews that told you what you wanted to hear. I won't go any further than that.


I disagree. I think he has a VERY valid point. There is nothing wrong with the Fury X, but it would have been an easy win if it performed better than the 980 Ti, sold at $550, or came bundled with some AAA games.

I am not concerned about the performance right now and have faith it will improve, but as a company AMD made a mistake by not using one of those sales tactics.

That said, I believe AMD cards age much better than NV, so I will hold off until I see either a performance bump from drivers or a price drop from AMD.


----------



## rdr09

Quote:


> Originally Posted by *4everAnoob*
> 
> Stuff was added, yes. Noteworthy performance improvement from architectural changes? Nope.
> GCN 1.0 -> GCN 1.1 brought almost no noteworthy improvement at all. GCN 1.2 has some new stuff, like memory bandwidth compression, which helps a little.
> But it is all very small stuff. Increased shaders, core speed, and bandwidth are really all that happened.
> Nvidia rebranded cards from the 8800 series to the GT 2xx series and everyone complained. What AMD is doing now is just as bad, in my opinion.


I used to own a 6870. Loved it and sold it to a member here . . .

http://www.3dmark.com/compare/3dm11/8776470/3dm11/2454753

The Fury is 40% faster than the 290.


----------



## obababoy

Quote:


> Originally Posted by *blue1512*
> 
> Where does that "15%-20% at 1440p" come from?


http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Grand-Theft-Auto-V
http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Battlefield-4
http://hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/4#.VY1Qz0ay58E

Scroll to the bottom. Granted, with GameWorks and all these other NV features, benchmarks are going to be skewed, because AMD guys like myself turn off most of the NV features in games. Either way, though, it is not the higher-performing card... at the moment.


----------



## iLeakStuff

Quote:


> Originally Posted by *Ganf*
> 
> Ehh, you picked the reviews that told you what you wanted to hear. I won't go any further than that.


What? I wanted to buy a Fury X. But there are just too many reviews showing the 980 Ti faster at 1440p.

Hardware.fr puts it 12% faster.


Eurogamer puts it closer to 15-20%. You can also see the overclocking advantage there. And I like to play with overclocking.


Hardware Canucks puts it 7% ahead.


The EVGA Hybrid will be even faster than this, since it runs at 1228MHz instead of 1075MHz. So I think the extra $100 for the same cooling performance as the Fury X plus better performance will be worth it. Or better yet, just get one for $650 and build my own water-cooling loop.

If AMD wants my money, they can either drop the Fury X to $550 (or release a full Fury at $550 which I can water cool), or release a Fury X2 for $999.


----------



## Alatar

Quote:


> Originally Posted by *Redwoodz*
> 
> Can't find any anything but reference 980Ti's either,and those have 1000MHz core,which I'm sure FuryX surpasses at stock.Something everyone needs to remember when viewing some of these reviews,comparing aftermarket highly modified and overclocked designs against the reference FuryX.


All the reviews I've seen have been against those reference 980Tis.

The aftermarket ones are easily faster than the Fury X and even the Titan X.


----------



## Ganf

Quote:


> Originally Posted by *blue1512*
> 
> Where does that "15%-20% at 1440p" come from?


Cherry-picking is cherry-picking; it doesn't matter which side is doing it.

Fact is, out of all the reliable reviews that didn't screw up royally in one way or another, the card is within 3-5% of the 980 Ti, as was leaked weeks before release. There have been no VRAM issues with games known to allocate more than 4GB, voltage is still locked so none of the reviewers did a real overclocking bench, and HBM is working fine with every game they've thrown at it, whether officially "optimized" or not.

You know what, screw it. I'll go there. What's the point of buying Nvidia for the sake of "drivers" when games with their official support are being pulled from sale because they are unplayable? If you're worried about AMD updating drivers when they've been doing so on a weekly basis, why aren't you considering the fact that none of the AAA games that need the best performance available are even playable for the first week after release on ANY card?

Drivers aren't what's breaking games right now or for the foreseeable future.


----------



## Final8ty

https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml
Quote:


> Quote:
> 
> 
> 
> The AMD Radeon R9 Fury X took the lead in the top class of single-GPU accelerators ($400-800), confidently beating NVIDIA's recently released single-GPU flagship, the GeForce GTX 980 Ti. Moreover, in a number of tests the R9 Fury X is at the level of a much more expensive product from the premium segment, the GTX Titan X. In 2-3 tests Fury X performance even reached the level of the Radeon R9 295X2 (excluding low resolutions, of course, where speed can be limited by components other than the graphics card).
Click to expand...


----------



## DFroN

Quote:


> Originally Posted by *Final8ty*
> 
> https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml


Quote:


> The AMD Radeon R9 Fury X took the lead in the top class of single-GPU accelerators ($400-800), confidently beating NVIDIA's recently released single-GPU flagship, the GeForce GTX 980 Ti.


How can their results differ from so many other sites? Is that a respected website?


----------



## bossie2000

A mixed bag of reviews, then! There are no clear lines; it's blurry!


----------



## flopper

Quote:


> Originally Posted by *blue1512*
> 
> Where does that "15%-20% at 1440p" come from?


Yeah, people skew reality a lot.

Quote:


> Originally Posted by *obababoy*
> 
> With that said, I believe AMD cards age much better than NV so I will hold off until I see either a performance bump from drivers or price drop from AMD.


Kepler anyone?








Quote:


> Originally Posted by *Ganf*
> 
> Cherry-picking is cherry-picking; it doesn't matter which side is doing it.
> 
> Fact is, out of all the reliable reviews that didn't screw up royally in one way or another, the card is within 3-5% of the 980 Ti, as was leaked weeks before release. There have been no VRAM issues with games known to allocate more than 4GB, voltage is still locked so none of the reviewers did a real overclocking bench, and HBM is working fine with every game they've thrown at it, whether officially "optimized" or not.
> 
> You know what, screw it. I'll go there. What's the point of buying Nvidia for the sake of "drivers" when games with their official support are being pulled from sale because they are unplayable? If you're worried about AMD updating drivers when they've been doing so on a weekly basis, why aren't you considering the fact that none of the AAA games that need the best performance available are even playable for the first week after release on ANY card?
> 
> Drivers aren't what's breaking games right now or for the foreseeable future.


Nvidia drivers at the moment are a trainwreck.
Who buys old tech like GDDR5 at the enthusiast level? I don't.
I buy new tech, ready for Windows 10.
Quote:


> Originally Posted by *DFroN*
> 
> How can their results differ from so many other sites? Is that a respected website?


Results differ due to different testing methods and hardware.
The drivers are early and need to mature:
first you make them work and be stable, and only once that is true do you optimize for performance.

I'll buy a Fury because I'm informed about what it can do, and Nvidia has nothing comparable.


----------



## Themisseble

Quote:


> Originally Posted by *DFroN*
> 
> How can their results differ from so many other sites? Is that a respected website?


Different drivers.

Check this review:
http://www.sweclockers.com/test/20730-amd-radeon-r9-fury-x/6#content

Sometimes the Fury X is only as fast as an R9 390X. Look how close the R9 390X comes to the GTX 980 Ti in 4K benchmarks.
And also look at the GTX 780 Ti.
The R9 390/390X are among the best cards right now for 1440p+.


----------



## Blackops_2

http://www.tweaktown.com/news/46149/ek-water-blocks-teased-amd-radeon-r9-fury-block/index.html

EK teases a Fury X block. It's beautiful.








Would love to build a Parvum S2 with Fury X CrossFire.

Still awaiting voltage-controlled OCs.


----------



## Redwoodz

Quote:


> Originally Posted by *RagingCain*
> 
> I have honestly just never seen a professional workstation at a business using commercial GPUs for workstation rendering. Most of these machines are Dell / Lenovo etc., when ordering workstations, these cards don't even show up as options. I am not saying it doesn't happen.
> 
> I would like to see some numbers. I don't see any current evidence that Fury X will succeed in enterprise workstation environments.
> 
> To claim a consumer side GPU will be a "success" (whatever that means for a GPU) because it might do well in professional computing is really a stretch on the confines of consumer-side "success". Not to mention if AMD FirePro Fury is ever planned and released... and did well, wouldn't that really be a "success" of the AMD FirePro Fury and not the Fury X?
> 
> The Fury X might make a good home workstation GPU, but I doubt the percentages of purchasers who buy this GPU with this in mind, are that high. Just like the number of Titan owners who are professional 3D artists who casually game.
> 
> I would need to see the numbers before I made any informed conclusion, thus I am simply asking for those numbers from our users that are clearly informed, since they have already reached the conclusion.


Quote:


> Originally Posted by *RagingCain*
> 
> I completely agree that air cooled ones would be necessary.
> 
> Doesn't that negate your noise and power usage advantage over the 980 Ti? Higher temperatures mean greater voltages to offset leakage.
> 
> Couple that with the fact that the GTX 980 Ti has an extra 2 GB of VRAM. HBM may be advantageous; unfortunately, though, that high-speed VRAM would be bottlenecked by the rate of computations done by the GPU, not by bandwidth.
> 
> Not that I would use a 980 Ti for compute rendering, that's what professional workstation cards are for, but to entertain your idea.
> 
> Prior market domination means that any apps based on CUDA are probably going to stay that way; you can agree to that. OpenCL-based applications could be adapted, though. Someone, somewhere, has to look at the Fury X, see that their program can't run on it because it's been using CUDA for years, and then want to convert their application to OpenCL.
> 
> What feature on the Fury X would make that computer scientist do that?
> 
> Also, do you have any source on the compute arena being where the money is, regarding consumer GPUs?


Quote:


> Originally Posted by *iLeakStuff*
> 
> I can see AMD catching up in workstations this time around. GK110 Kepler and the $999 Titan Black were extremely good at computation. GM200 and the Titan X, however, don't have the same computation performance, which means you have to buy a Quadro M6000 that costs much more.
> Maybe the workstation market isn't as big as we like to think, and that's why they ditched GPGPU with the Titan X, who knows. CUDA is strong within the scientific market, that's for sure, and those users might not have options other than Nvidia, but the Fury X is really good value at $650 for those people.
> 
> Who knows how this will turn out.
> 
> A stock 980 Ti at 1440p is 10-15% faster than a stock Fury X, and that's the display resolution I want to use.
> Overclocking tests show that an OC 980 Ti is much faster than an OC Fury X.
> A Hybrid 980 Ti costs $100 more than the Fury X, but you get an overclocked 980 Ti with water cooling and a warranty, which will be a good deal faster than the Fury X.
> I get 6GB of VRAM instead of 4GB with the 980 Ti. Although most games work great with 4GB today, who knows about a future where some lazy developer makes a console port that needs the extra 2GB? And they say AMD needs to make driver entries for each released game to make it work properly on HBM. What if they have to spend 1 or 2 weeks getting proper driver support for a game I want to play at launch? Wait around for 2 weeks?
> 
> I used to buy AMD when they offered better value than Nvidia, which they usually do. Except this time it has slightly worse performance/$ value than the 980 Ti. If the Fury X were priced at $550 it would tip the scale in the other direction; instabuy from me. If the Fury X were faster than the 980 Ti, as the benchmark slide AMD released showed, I would also have bought it.
> 
> I tried to find reasons to get a Fury X since the card looks so much better than the 980 Ti, but I just can't find many.


If you read it again, I made no conclusions. I stated it's too early to tell about Fury's workstation potential; the performance is promising from the evidence we do have. Obviously software and applications play a part.
As for enterprise/workstation GPU market potential, it's hard to get a clear picture, because sales aren't tracked as such when customers use a desktop GPU for workstation-related loads, which is often the case nowadays, for instance people using GTX Titans instead of Quadros.

http://www.theplatform.net/2015/05/08/tesla-gpu-accelerator-grows-fast-for-nvidia/

Even so, there is a large market potential over and above gaming GPUs, as Nvidia's bet on deep learning and Pascal proves.


----------



## Alatar

Quote:


> Originally Posted by *DFroN*
> 
> How can their results differ from so many other sites? Is that a respected website?


There's always variation between reviews.

Posting [H]'s review for example is just as valid as posting the one above.

The reality isn't represented by either extreme.

But we both know the reason that people single out one extreme or the other. It's for cherry picking purposes.
Quote:


> Originally Posted by *Themisseble*
> 
> different drivers


No, this has been proven incorrect many times already, both by AMD reps and by reviewers. It's Reddit nonsense that people keep repeating because it allows them to disregard reviews they don't like.

I'll quote Ryan Smith:
Quote:


> Aww come on, not here too. Beyond3D is supposed to be better than this... :|
> 
> Anyhow, anyone claiming there's a 15.15-180612a driver is either making typos or full of hooey. If you look at AMD's driver strings, they're in YYMMDD format. For example, Catalyst 15.3.1 from March is 14.502.1014-150313a-181517E.
> 
> I don't claim to speak for AMD, but there is no other driver besides the original website driver for the Fury/300 series (15.15-150611a-185375E), the press driver (15.15-150611a-185358E), and the current Batman driver (15.15.1004-150619a-185674E).
> 
> Also: http://forums.overclockers.co.uk/showthread.php?p=28230087#post28230087


https://forum.beyond3d.com/posts/1856458/
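Ryan Smith's point about the YYMMDD date embedded in AMD's driver strings can be sanity-checked mechanically. Here's a minimal sketch, assuming only the layout he describes above (the regex and the `driver_build_date` helper are mine for illustration, not anything official from AMD):

```python
import re
from datetime import datetime

def driver_build_date(version_string):
    """Extract the YYMMDD build date from an AMD Catalyst driver string.

    Illustrative only: assumes the format described in Ryan Smith's post,
    e.g. '14.502.1014-150313a-181517E', where the build date is the
    six-digit run between hyphens followed by a letter suffix.
    """
    match = re.search(r'-(\d{6})[a-z]-', version_string)
    if match is None:
        return None
    # %y%m%d turns '150313' into the date 2015-03-13.
    return datetime.strptime(match.group(1), '%y%m%d').date()

# The press driver and the Batman driver differ by eight days:
press = driver_build_date('15.15-150611a-185358E')        # 2015-06-11
batman = driver_build_date('15.15.1004-150619a-185674E')  # 2015-06-19
```

Applying this to the strings quoted above makes the point concrete: all three known Fury/300-series drivers date from 11-19 June, so an alleged "180612" build would be an out-of-place date, not a newer driver.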


----------



## MapRef41N93W

Quote:


> Originally Posted by *iLeakStuff*
> 
> I can see AMD catching up in workstations this time around. GK110 Kepler and the $999 Titan Black were extremely good at computation. *GM200 and the Titan X, however, don't have the same computation performance, which means you have to buy a Quadro M6000 that costs much more.*
> Maybe the workstation market isn't as big as we like to think, and that's why they ditched GPGPU with the Titan X, who knows. CUDA is strong within the scientific market, that's for sure, and those users might not have options other than Nvidia, but the Fury X is really good value at $650 for those people.
> 
> Who knows how this will turn out.
> 
> A stock 980 Ti at 1440p is 10-15% faster than a stock Fury X, and that's the display resolution I want to use.
> Overclocking tests show that an OC 980 Ti is much faster than an OC Fury X.
> A Hybrid 980 Ti costs $100 more than the Fury X, but you get an overclocked 980 Ti with water cooling and a warranty, which will be a good deal faster than the Fury X.
> I get 6GB of VRAM instead of 4GB with the 980 Ti. Although most games work great with 4GB today, who knows about a future where some lazy developer makes a console port that needs the extra 2GB? And they say AMD needs to make driver entries for each released game to make it work properly on HBM. What if they have to spend 1 or 2 weeks getting proper driver support for a game I want to play at launch? Wait around for 2 weeks?
> 
> I used to buy AMD when they offered better value than Nvidia, which they usually do. Except this time it has slightly worse performance/$ value than the 980 Ti. If the Fury X were priced at $550 it would tip the scale in the other direction; instabuy from me. If the Fury X were faster than the 980 Ti, as the benchmark slide AMD released showed, I would also have bought it.
> 
> I tried to find reasons to get a Fury X since the card looks so much better than the 980 Ti, but I just can't find many.


Eh??? The Quadro M6000 is simply a Titan X with 10-bit support, ECC RAM, and professional drivers. And actually you are straight-up wrong: GM200 has excellent single-precision compute performance (which is what 90% of professionals buying the card would be using anyway), just like the M6000. Double precision was nerfed in the Maxwell architecture to make the architecture more efficient.


----------



## Final8ty

Quote:


> Originally Posted by *DFroN*
> 
> How can their results differ from so many other sites? Is that a respected website?


Quote:


> The iXBT review in question used a more recent driver dated 18 June (other sites used the older driver from the 300 series launch).


----------



## Kane2207

Wow, the apologists are strong in this thread
 








It's a decent card, but it's not the second coming of AMD everyone hyped it to be. Nothing more, nothing less....


----------



## Alatar

Quote:


> Originally Posted by *Final8ty*


No they didn't. Please don't buy into the reddit nonsense.

Ryan Smith on the issue:

https://forum.beyond3d.com/posts/1856458/


----------



## 331149

Well on the plus side you get a cooler card, a smaller card, a sleek looking card and superior drivers.


----------



## tconroy135

Quote:


> Originally Posted by *TheBDK*
> 
> Well on the plus side you get a cooler card, a smaller card, a sleek looking card and superior drivers.


I think you misspelled "inferior..."


----------



## Final8ty

Quote:


> Originally Posted by *Alatar*
> 
> There's always variation between reviews.
> 
> Posting [H]'s review for example is just as valid as posting the one above.
> 
> The reality isn't represented by either extreme.
> 
> But we both know the reason that people single out one extreme or the other. It's for cherry picking purposes.
> No this has been proven to be incorrect many times already both by AMD reps and reviewers. It's reddit nonsense that people keep repeating because it allows them to disregard reviews they don't like.
> 
> I'll quote Ryan Smith:
> https://forum.beyond3d.com/posts/1856458/


Quote:


> Originally Posted by *Deathroned;1041693396*
> Ahem Digital storm would like to have a word with you.
> Single card benches all are neck and neck: www.digitalstorm.com/unlocked/amd-fury-x-performance-benchmarks-idnum360/
> in crossfire fury x starts pulling ahead a bit.
> www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/
> Kyle was bottlenecking the mighty fury x with the 3770k system.


----------



## Ganf

Quote:


> Originally Posted by *Themisseble*
> 
> different drivers
> 
> check this review..
> http://www.sweclockers.com/test/20730-amd-radeon-r9-fury-x/6#content
> 
> Sometimes the Fury X is only as fast as an R9 390X.
> 
> The R9 390/390X are among the best cards right now for 1440p+.


The driver theory was debunked; it's just lousy testing methods. You've got people like PCPer putting tests that are obviously flukes in as prime samples when they should've been discarded as outliers, people screwing up the Nvidia card tests and tanking them randomly, etc...



Seriously, what is that supposed to be? An equal comparison? Give me a break.

They've got the 290X running better than the Fury and they think there was nothing wrong with their test?


----------



## Kane2207

Quote:


> Originally Posted by *TheBDK*
> 
> Well on the plus side you get a cooler card, a smaller card, a sleek looking card and superior drivers.


It's cooler because there's an AIO on the core; the VRMs are still baking by the looks of every thermal image I've seen, because there's no airflow under the shroud.

It's not smaller either. The PCB is smaller, but once I factor in a rather thick 120mm rad + fan, it actually occupies more volume than the 980 Ti.

It looks nice, I'll give you that, but that's massively subjective.

Drivers? Really? Let's not descend into that argument here; both sides have poor showings when it comes to driver releases. Nvidia can't currently get their head around people using hardware-accelerated web browsers. AMD doesn't understand that people still play DX9 games and has left the frame pacing in a mess.


----------



## Final8ty

Quote:


> Originally Posted by *Alatar*
> 
> No they didn't. Please don't buy into the reddit nonsense.
> 
> Ryan Smith on the issue:
> 
> https://forum.beyond3d.com/posts/1856458/


I have not read Reddit, and I didn't say I was buying into it; I'm just quoting what others are speculating, so stop shooting the messenger.


----------



## obababoy

Quote:


> Originally Posted by *TheBDK*
> 
> Well on the plus side you get a cooler card, a smaller card, a sleek looking card and superior drivers.


People keep mentioning drivers, haha. You are wrong. Neither card is impressive driver-wise, because recent games have been developed too fast, or NV and AMD weren't given enough time to write drivers.

Using drivers as a selling point for either is just silly, especially these days.


----------



## Redwoodz

Quote:


> Originally Posted by *Alatar*
> 
> All the reviews I've seen have been against those reference 980Tis.
> 
> The aftermarket ones are easily faster than the Fury X and even the Titan X.


All the reviews I have seen state nothing other than "980Ti", which could mean any of them, and they don't report what frequency they were tested at either. A quick search shows base clocks varying from 1000MHz to 1178MHz on 980 Tis... so what were they tested with? No one knows but them.


----------



## Chargeit

Quote:


> Originally Posted by *Kane2207*
> 
> It's cooler because there's an AIO on the core; the VRMs are still baking by the looks of every thermal image I've seen, because there's no airflow under the shroud.
> 
> It's not smaller either. The PCB is smaller, but once I factor in a rather thick 120mm rad + fan, it actually occupies more volume than the 980 Ti.
> 
> It looks nice, I'll give you that, but that's massively subjective.
> 
> Drivers? Really? Let's not descend into that argument here; both sides have poor showings when it comes to driver releases. *Nvidia can't currently get their head around people using hardware-accelerated web browsers.* AMD doesn't understand that people still play DX9 games and has left the frame pacing in a mess.


Has to be more to it than that. I have hardware acceleration enabled in Chrome with zero issues.

Can't help but remember (I've mentioned it a few times) that when I had a lot of driver problems, my PSU was also going bad.

I wonder how many people complaining about "poor drivers" have other parts of their systems to blame, be it an improper OC or questionable components. I think the easy route is to just blame drivers before investigating other possible issues.

*I haven't personally used an AMD GPU in a while. I do have that 270X in my ol' lady's system. It has also given me zero issues, though she only plays GW2. The parts in that system are now what I'd consider tried and true.


----------



## Kane2207

Quote:


> Originally Posted by *Redwoodz*
> 
> All the reviews I have seen have not stated anything other than "980Ti" which could mean any of them,and neither do they report what frequency they were tested at either.A quick search shows a variance from 1000MHz to 1178Mhz for base clocks on 980Ti....what were they tested with? No one knows but them.


Makes very little difference. Like the OG Titan and Titan X, 980 Tis boost much higher than their stated top clocks. Out of the box, it's not unusual to see 1200+ MHz on the core with no voltage adjustments.


----------



## obababoy

Quote:


> Originally Posted by *Ganf*
> 
> The driver theory was debunked; it's just lousy testing methods. You've got people like PCPer putting tests that are obviously flukes in as prime samples when they should've been discarded as outliers, people screwing up the Nvidia card tests and tanking them randomly, etc...
> 
> Seriously, what is that supposed to be? An equal comparison? Give me a break.
> 
> They've got the 290X running better than the Fury and they think there was nothing wrong with their test?


Part of this is the implementation of GameWorks with regard to benchmarking. Sure, it helps some games look better, e.g. the monsters' hair in The Witcher 3, but many of these game benchmarks now use "proprietary" features that AMD hasn't optimized for. Most AMD guys like myself turn off most of these features if they are FPS hogs.

Benchmark results are skewed these days by GameWorks and the settings reviewers choose. People need to take them with a grain of salt, which is fairly easy to do with about 10 sites reporting a roughly 15% range of variation in results.

End result: the Fury X is a great card, as is the 980 Ti. The Fury X has additional growing pains ahead of it, which I am excited for, but only time will tell, and it's a slight risk if that's what you're counting on.


----------



## Ganf

Quote:


> Originally Posted by *Kane2207*
> 
> It's cooler because there's an AIO on the core; the VRMs are still baking by the looks of every thermal image I've seen, because there's no airflow under the shroud.
> 
> It's not smaller either. The PCB is smaller, but once I factor in a rather thick 120mm rad + fan, it actually occupies more volume than the 980 Ti.
> 
> It looks nice, I'll give you that, but that's massively subjective.
> 
> Drivers? Really? Let's not descend into that argument here; both sides have poor showings when it comes to driver releases. Nvidia can't currently get their head around people using hardware-accelerated web browsers. AMD doesn't understand that people still play DX9 games and has left the frame pacing in a mess.


The AIO also means the heat gets dumped straight out of the case instead of being blown inside it to heat up your other components, so there is that. And EK just put the single-slot bracket on display, so if you're watercooling, this is officially the smallest card you can get that kind of performance out of.

I just want to see real overclocking values. Not being able to turn up the volts is killing that side of the reviews.

Meh. Still waiting for 980 Ti Lightnings and whatnot. No custom PCBs killed it for me.


----------



## RagingCain

Quote:


> Originally Posted by *Redwoodz*
> 
> If you read it again, I made no conclusions. I stated it's too early to tell about Fury's workstation potential; the performance is promising from the evidence we do have. Obviously software and applications play a part.
> As for enterprise/workstation GPU market potential, it's hard to get a clear picture, because sales aren't tracked as such when customers use a desktop GPU for workstation-related loads, which is often the case nowadays, for instance people using GTX Titans instead of Quadros.
> 
> http://www.theplatform.net/2015/05/08/tesla-gpu-accelerator-grows-fast-for-nvidia/
> 
> Even so, there is a large market potential over and above gaming GPUs, as Nvidia's bet on deep learning and Pascal proves.


What is your definition of "where the real money is"?
Quote:


> Originally Posted by *Redwoodz*
> *Gaming GPUs are not the whole story. AMD has proven to make large strides versus Nvidia in the compute arena, where the real money is at.*
> 
> http://www.anandtech.com/bench/product/1513?vs=1496
> 
> As shown here, the Fury X tops the 980 Ti in Sony Vegas and LuxMark, while making strong gains in Folding@Home, even topping Nvidia in explicit double precision. Nvidia's gains in gaming have largely come from sacrificing GPGPU output, while AMD has remained strong in both. Also, while power consumption is very close, the Fury X is quieter under load than the 980 Ti. It looks to be a strong contender in the workstation arena.


It looks to me like consumer-side GPUs are where the real money is, no? Those are Nvidia's numbers; are AMD's numbers better on the workstation side than in gaming?

In addition, I am not stating there is no money in enterprise; that would be absurdly wrong. I may have misunderstood you, but your claim was that this card, the Fury X, a consumer card, will make more money in the compute arena, which is essentially science and enterprise.

I wanted to know what percentage of users buy the Fury X primarily as a compute card rather than for gaming, and what that translates into in dollars for AMD. The reason I am asking is that I believe that number to be extremely low, and of no consequence. As in, the "real money" is in gaming, at least for this card.

This card will be judged solely on how well it plays video games and how much gaming market share/money it gives AMD.


----------



## Kane2207

Quote:


> Originally Posted by *Chargeit*
> 
> Has to be more to it than that. I have hardware acceleration enabled in Chrome with zero issues.
> 
> Can't help but remember (I've mentioned it a few times) that when I had a lot of driver problems, my PSU was also going bad.
> 
> I wonder how many people complaining about "poor drivers" have other parts of their systems to blame, be it an improper OC or questionable components. I think the easy route is to just blame drivers before investigating other possible issues.
> 
> *I haven't personally used an AMD GPU in a while. I do have that 270X in my ol' lady's system. It has also given me zero issues, though she only plays GW2. The parts in that system are now what I'd consider tried and true.


I would agree with that if the problem were relatively small, but there's a massive thread on Nvidia's support forum and on Reddit regarding the HW-acceleration issues on drivers past 350.xx. Not everyone has the issue, of course, but the sheer number of users with problems on three subsequently released drivers suggests a driver issue rather than all of these users hitting hardware-related problems at exactly the same time.


----------



## Alatar

Quote:


> Originally Posted by *Redwoodz*
> 
> All the reviews I have seen state nothing other than "980Ti", which could mean any of them, and they don't report what frequency they were tested at either. A quick search shows base clocks varying from 1000MHz to 1178MHz on 980 Tis... so what were they tested with? No one knows but them.


The 980Tis in these Fury X reviews perform exactly like reference 980Tis did in reference 980Ti reviews. Just shy of the Titan X.

If they were non ref cards they'd be clearly faster than the Titan X and you would know.


----------



## Roadkill95

Quote:


> Originally Posted by *Kane2207*
> 
> Wow, the apologists are strong in this thread
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a decent card, but it's not the second coming of AMD everyone hyped it to be. Nothing more, nothing less....


My thoughts exactly. Completely and utterly underwhelming.

I think it was fair for people to expect a 980 Ti/Titan X killer when AMD was this late to the game. But as usual, they did not deliver.

Also, why are people using the AIO as an excuse to put it above the 980 Ti? The AIO is rendered absolutely worthless when 980 Tis on air destroy the Fury when it comes to overclocking.


----------



## CrazyElf

Quote:


> Originally Posted by *Blameless*
> 
> What are these VRMs rated for, current wise? That 130C limit will decrease the greater the clock speeds and the heavier the load, as well as from increased voltage.
> 
> Even if they are technically within limits, at say +100mV and a 20% OC, cooler VRMs will still increase longevity and dump less heat into the board and thus GPU.
> This reminds me of something I was thinking about earlier: What direction is the flow through the loop, and is the pump reversible?
> 
> Having cool water from the rad hit the VRM first may help equalize VRM/GPU temps by a few degrees.
> _Elite: Dangerous_, as of the 1.3 patch runs at 4k, with Ultra settings, at 40 (certain station interiors) to 110fps (open space), with averages around 60-70, on my 290X.


These are DirectFET MOSFETs, and there are six of them: IR6811 and IR6894 DirectFETs.



I think these are the datasheets:
http://www.irf.com/product-info/datasheets/data/irf6811spbf.pdf
http://www.irf.com/product-info/datasheets/data/irf6894mpbf.pdf

If you plan on buying the Fury X, what you really need to do, I think, is take the backplate off and attach a MOSFET heatsink to the rear of the card.



Something like the image above ought to do, although you will have to measure out the right dimensions and ensure clearance (so it doesn't hit anything on the back of the PCB). This could be an issue with large CPU coolers, or perhaps in tight cases with a CrossFire configuration.

Either that, or get a full-coverage water block and water-cool the whole thing.

Ideally, have some air blowing over the MOSFET heatsink.

Quote:


> Originally Posted by *Alatar*
> 
> The 980Tis in these Fury X reviews perform exactly like reference 980Tis did in reference 980Ti reviews. Just shy of the Titan X.
> 
> If they were non ref cards they'd be clearly faster than the Titan X and you would know.


The problem I have with that is this: I know the reviews haven't shown it mattering yet, but the 980 Ti has 6GB of VRAM while the Fury X only has 4GB. I'd be worried about running out of VRAM before hitting something so intensive that the core cannot render it properly. If the VRM problems that have been described are a real issue too, then yeah, this card is a flop.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think the most disappointing thing about this card so far is the overclocking, or lack thereof. It must be said that we really don't know how it will OC in the future, as we are just one day out from release, and it is certainly conceivable that over the next month or so we will get voltage control, but I was definitely hoping for something in the 1300-1400MHz range. Had we gotten such overclockability, I don't doubt the Fury X would have been on more equal footing with GM200. At stock the card hangs just fine with the 980 Ti, but we all know what a monster overclocker the Nvidia card is, so there is no doubt the Fury will get destroyed in the hands of the typical OCN member, and thus the disappointment. I guess given GCN's history it would be kind of silly to expect 1400MHz out of Fury, but AMD was throwing around words like "overclocker's dream", so it's not entirely our fault for getting our hopes up. Anyway, this is NOT a fail of a card by any means, but it has to play second fiddle to Nvidia... AGAIN.


I would guess that >1300 MHz is unlikely unless you get really lucky in the silicon lottery.

With voltage unlocked, you'll most likely get 1200-1300 MHz: under 1250 MHz on air and 1250-1300 MHz on water. A few "winners" of the silicon lottery might get 1350.

The reason is that this chip has power consumption comparable to the 290X's. As it's on the same 28nm process, we can guess what kinds of OCs people will get. True, the power density is somewhat lower (it's a bigger die than the 290X), so that might help things, but without a custom PCB it's probably not going to overclock as well.

I would also be wary of overclocking this chip, judging by what's being described about the VRMs. OCing would worsen the already hot VRM temperatures.


----------



## Ganf

Quote:


> Originally Posted by *obababoy*
> 
> Part of this is the implementation of GameWorks with regard to benchmarking. Sure, it helps some games look better, e.g. the monsters' hair in The Witcher 3, but many of these game benchmarks now use "proprietary" features that AMD hasn't optimized for. Most AMD guys like myself turn off most of these features if they are FPS hogs.
> 
> Benchmark results are skewed these days by GameWorks and the settings reviewers choose. People need to take them with a grain of salt, which is fairly easy to do with about 10 sites reporting a roughly 15% range of variation in results.
> 
> End result: the Fury X is a great card, as is the 980 Ti. The Fury X has additional growing pains ahead of it, which I am excited for, but only time will tell, and it's a slight risk if that's what you're counting on.


Nope, reviewers disable GameWorks and AMD features like TressFX when testing. They're apparently smart enough to do that, but can't figure out when they've got a failed test.


----------



## Casey Ryback

Quote:


> Originally Posted by *Roadkill95*
> 
> Also, why are people using AIO excuse to put it above the 980ti?


Who's putting the Fury above the 980 Ti?

Just because people bought it doesn't automatically mean they think it's superior.


----------



## Chargeit

Quote:


> Originally Posted by *Kane2207*
> 
> I would agree with that if the problem was relatively small but there's a massive thread on Nvidia's support forum and Reddit regarding the HW accelerated issues on drivers past 350.xx. Not everyone has the issue of course but the shear number of users with problems on three subsequently released drivers suggests an issue rather than all of these users experiencing potential hardware related problems at exactly the same time.


Yeah, but is it purely drivers, or possibly a combination of hardware and software? I've found all kinds of crazy problems caused by using different OC software or monitoring tools. One thing I dropped that seemed to give me issues was EVGA Precision X; I don't remember exactly what it was doing, but life has been a lot easier since going back to MSI Afterburner. I also ditched Corsair Link with my H100i. That thing always gave me problems and is part of the reason I no longer buy Corsair. Part.


----------



## ladcrooks

Whatever way you lot look at it, AMD being the first with a different architecture built from scratch is a winner in my eyes on the first attempt. Bored with all the poo-pooers.









Damn them for not having 8 gigs; all you greenies would be like rats jumping off a sinking boat, and the same if the shoe were on the other foot.


----------



## DampMonkey

Quote:


> Originally Posted by *Alatar*
> 
> All the reviews I've seen have been against those reference 980Tis.
> 
> The aftermarket ones are easily faster than the Fury X and even the Titan X.


PC Gamer decided to use an EVGA 980 Ti SC for some reason, and didn't even mention in the article that it wasn't a reference card, unlike every other card in the test:
http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/

EDIT: just kidding, it looks like the editor made an update explaining the discrepancy. When I checked the other day it wasn't there, hence my comment.


----------



## Alatar

Quote:


> Originally Posted by *DampMonkey*
> 
> PC Gamer decided to use an EVGA 980 Ti SC for some reason, and didn't even mention in the article that it wasn't a reference card, unlike every other card in the test:
> http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/
> 
> EDIT: just kidding, it looks like the editor made an update explaining the discrepancy. When I checked the other day it wasn't there, hence my comment.


Well, that's the first one I've seen.

But again, even without the labeling that's clearly there, it'd be easy to tell they used a non-reference card, because it's clearly faster than the Titan X.


----------



## Redwoodz

Quote:


> Originally Posted by *RagingCain*
> 
> What is you definition of "where the real money is"?
> It looks to me that consumer side GPUs is where the real money is, no? That's nVidia's numbers, are AMDs numbers better on Workstation side than Gaming?
> 
> In addition I am not stating there is no money in Enterprise, that's absurdly wrong. I may have misunderstood you, but your claim was, this card the Fury X, a consumer card, will make more money on the compute arena, which is essentially science & enterprise.
> 
> I wanted to know what is the percentage of users buying the Fury X as primarily a compute card and not gaming, and what does that translate into dollars for AMD. The reason I think I am asking is because I believe that number to be extremely low, and of no consequence. As in, the "real money" is in gaming. At least for this card.
> 
> This card will be solely judged on well it plays video games and how much gaming market share / money it gives AMD.


Consumer-side GPUs, of which a SIZEABLE portion are used in workstation loads, which means gaming benchmark results mean little. I'm referencing the graph I showed before, where AMD beats Nvidia in 7 out of 12 compute tests.

Quote:


> Originally Posted by *Alatar*
> 
> The 980Tis in these Fury X reviews perform exactly like reference 980Tis did in reference 980Ti reviews. Just shy of the Titan X.
> 
> If they were non ref cards they'd be clearly faster than the Titan X and you would know.


Quote:


> Originally Posted by *DampMonkey*
> 
> PCGAMER decided to use an EVGA 980Ti SC for some reason. Didn't even mention in the article that it wasn't a reference card, unlike every other card in the test:
> http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/
> 
> EDIT: just kidding, it looks like the editor made an update explaining the discrepancy. When I checked the other day it wasn't there, hence my comment.


Exactly...1100MHz base core clock.


----------



## rdr09

Quote:


> Originally Posted by *DampMonkey*
> 
> PCGAMER decided to use an EVGA 980Ti SC for some reason. Didn't even mention in the article that it wasn't a reference card, unlike every other card in the test:
> http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/
> 
> EDIT: just kidding, it looks like the editor made an update explaining the discrepancy. When I checked the other day it wasn't there, hence my comment.


here was the reference . . .


----------



## Sashimi

I like the idea of performing better when hit with more workload. This card has a very positive attitude that I think all of us living things can learn from.

Give me 8GB versions and I'm sold......


----------



## Alatar

Quote:


> Originally Posted by *Redwoodz*
> 
> Exactly...1100MHz base core clock.


And as you can see it's labeled as a 980Ti SC and is clearly faster than the Titan X and beats the Fury X by bigger margins than it does anywhere else.


----------



## RagingCain

Quote:


> Originally Posted by *Redwoodz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> What is your definition of "where the real money is"?
> It looks to me that consumer-side GPUs are where the real money is, no? Those are nVidia's numbers; are AMD's numbers better on the workstation side than gaming?
> 
> In addition, I am not stating there is no money in enterprise; that's absurdly wrong. I may have misunderstood you, but your claim was that this card, the Fury X, a consumer card, will make more money in the compute arena, which is essentially science & enterprise.
> 
> I wanted to know what percentage of users are buying the Fury X primarily as a compute card and not a gaming card, and what that translates into in dollars for AMD. The reason I am asking is that I believe that number to be extremely low, and of no consequence. As in, the "real money" is in gaming. At least for this card.
> 
> This card will be judged solely on how well it plays video games and how much gaming market share / money it gives AMD.
> 
> 
> 
> Consumer-side GPUs, of which a SIZEABLE portion are used in workstation loads, which means gaming benchmark results mean little. I'm referencing the graph I showed before, where AMD beats Nvidia in 7 out of 12 compute tests.
Click to expand...

This line right here:
*Consumer side GPU's,of which a SIZEABLE portion are used in workstation loads,*

These are the kinds of numbers I am asking for. Where are your numbers on this?

*which mean gaming benchmark results mean little.*

And this line, you are telling me and everyone else, that game benchmarks don't matter?
That a gaming GPU, designed, marketed, and sold as a gaming GPU, doesn't have to perform well in game benchmarks?


----------



## Kane2207

Quote:


> Originally Posted by *RagingCain*
> 
> This line right here:
> *Consumer side GPU's,of which a SIZEABLE portion are used in workstation loads,*
> 
> These are the kinds of numbers I am asking for. Where are your numbers on this?
> 
> *which mean gaming benchmark results mean little.*
> 
> And this line, you are telling me and everyone else, that game benchmarks don't matter?
> 
> That a gaming GPU, designed, marketed, and sold as a gaming GPU, doesn't have to perform well in game benchmarks?


I could have sworn blind some OCN users were berating OG Titan owners previously, stating that '_no-one uses consumer grade GPUs for compute tasks_'.

My, my, haven't times changed?


----------



## Orivaa

Quote:


> Originally Posted by *Sashimi*
> 
> I like the idea of performing better when hit with more workload. This card has a very positive attitude that I think all of us living things can learn from.
> 
> Give me 8GB versions and I'm sold......


They can't make an 8GB version, unless you look at the dual-GPU Fury, but that is only 4GB mirrored.


----------



## obababoy

Quote:


> Originally Posted by *Alatar*
> 
> And as you can see it's labeled as a 980Ti SC and is clearly faster than the Titan X and beats the Fury X by bigger margins than it does anywhere else.


Why are you guys still arguing this? It is known that PCGamer used the SC version. Not a big deal, as it was easy to recognize the difference and it is labeled as such. Let's move on.


----------



## obababoy

SLI and CF mirrored vram is apparently a thing of the past...in the near future


----------



## Sashimi

Quote:


> Originally Posted by *Orivaa*
> 
> They can't make an 8GB version, unless you look at the dual-GPU Fury, but that is only 4GB mirrored.


Then I'm not sold. Still, I know my boss would love it if I kept performing better when he gives me more work. That makes this card respectable.


----------



## Redwoodz

Quote:


> Originally Posted by *RagingCain*
> 
> This line right here:
> *Consumer side GPU's,of which a SIZEABLE portion are used in workstation loads,*
> 
> These are the kinds of numbers I am asking for. Where are your numbers on this?
> 
> *which mean gaming benchmark results mean little.*
> 
> And this line, you are telling me and everyone else, that game benchmarks don't matter?
> That a gaming GPU, designed, marketed, and sold as a gaming GPU, doesn't have to perform well in game benchmarks?


I already stated earlier that it's hard to put exact numbers on it because sales are not differentiated by use. Someone buying a workstation GPU will refer to workstation (compute) benches, not gaming. Not that hard to understand.

Let's make this a little easier for you to understand.

AMD FuryX= 8,602 GFLOPS

Nvidia 980Ti= 5,632 GFLOPS
Nvidia TitanX= 6,144 GFLOPS
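For what it's worth, those figures line up with the usual back-of-envelope formula for peak FP32 throughput: shader count × 2 (a fused multiply-add counts as two FLOPS per clock) × core clock. A quick sketch, assuming the commonly listed reference clocks of 1050 MHz for the Fury X and 1000 MHz for the 980 Ti and Titan X:

```python
# Peak theoretical FP32 throughput: shaders * 2 (FMA = mul + add) * clock in GHz = GFLOPS.
# Shader counts and clocks are the publicly listed reference specs (assumed here).
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

cards = {
    "Fury X":  (4096, 1.050),
    "980 Ti":  (2816, 1.000),
    "Titan X": (3072, 1.000),
}

for name, (shaders, clock) in cards.items():
    print(f"{name}: {peak_gflops(shaders, clock):.0f} GFLOPS")
# Fury X works out to ~8602 GFLOPS, the 980 Ti to 5632, the Titan X to 6144,
# matching the list above. Note this is a peak figure, not a gaming benchmark.
```

The formula only counts shader ALUs, which is exactly why it says nothing about ROP or geometry bottlenecks.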


----------



## Sashimi

Quote:


> Originally Posted by *obababoy*
> 
> SLI and CF mirrored vram is apparently a thing of the past...in the near future


Do expand, are you for real?


----------



## sugarhell

The AMD shader compiler in the drivers is apparently really slow for the Fury X. They need to fix it. Kinda sad that with these results the release wasn't optimal. They could have delayed the release until the Win 10 launch if that was the plan....

Still waiting for some ocn bench to decide if this is a flop


----------



## CasualCat

Quote:


> Originally Posted by *Ganf*
> 
> The driver excuse was debunked; it's just lousy testing methods. You've got people like PCPer putting tests that are obviously flukes in as prime samples when they should've been discarded as outliers, people screwing up the Nvidia card tests and tanking them randomly, etc...
> 
> 
> 
> Seriously, what is that supposed to be? An equal comparison? Give me a break.
> 
> They've got the 290x running better than the Fury and they think there was nothing wrong with their test?


Quote:


> Originally Posted by *Ganf*
> 
> Nope, reviewers disable Gameworks and AMD stuff like TressFX when testing. They're apparently smart enough to do that, but can't figure out when they've got a failed test.


From the bottom of their GTA5 test:

*Also, those big frame rate drops you see in the Fury X line are pretty hard and dramatic stutters in the game; I ran our testing on these settings on the Fury X 6 times to make sure these results weren't out of the ordinary, and in fact it was a consistent result.*

Seems like plenty of due diligence to me. I could see if they ran it once, got that result and ran with it, but they didn't.


----------



## Slaughterem

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Eh??? The Quadro M6000 is simply a Titan X with 10 bit support, ECC RAM, and professional drivers. And actually you are straight up wrong, GM200 has excellent single precision compute performance (which is what 90% of professionals buying the card would be using anyways) just like the M6000. Double precision was nerfed by the Maxwell architecture to make the architecture more efficient.


AMD is the only card manufacturer with 10-, 12-, and 16-bit consumer cards.


----------



## Ganf

Quote:


> Originally Posted by *sugarhell*
> 
> The AMD shader compiler in the drivers is apparently really slow for the Fury X. They need to fix it. Kinda sad that with these results the release wasn't optimal. They could have delayed the release until the Win 10 launch if that was the plan....
> 
> Still waiting for some ocn bench to decide if this is a flop


Don't think anyone on OCN is interested in benching until the voltage is unlocked.

Would've been nice if MSI had had their junk together and gotten that patched in before release.
Quote:


> Originally Posted by *CasualCat*
> 
> From the bottom of their GTA5 test:
> 
> *Also, those big frame rate drops you see in the Fury X line are pretty hard and dramatic stutters in the game; I ran our testing on these settings on the Fury X 6 times to make sure these results weren't out of the ordinary, and in fact it was a consistent result.*
> 
> Seems like plenty of due diligence to me. I could see if they ran it once, got that result and ran with it, but they didn't.


And did nothing to find the cause. Where does the fault lie, the card or their setup? We don't know, because half of the other reviews are borked too so we have no baseline to compare this to. Meanwhile half of the other reviews report 0 problems in GTA V.

Where is the discrepancy?


----------



## Blameless

Quote:


> Originally Posted by *CrazyElf*
> 
> These are DirectFET MosFET and there are 6 of them. IR6811 and IR6894 DirectFET
> 
> 
> 
> I think these are the datasheets:
> http://www.irf.com/product-info/datasheets/data/irf6811spbf.pdf
> http://www.irf.com/product-info/datasheets/data/irf6894mpbf.pdf


Thanks.

Maximum operating temperature is 150C, but they lose a lot of current capacity as temperature rises.
Quote:


> Originally Posted by *CrazyElf*
> 
> Something like the image above ought to do, although you will have to measure out the right dimensions and you will have to have clearance (so it doesn't hit anything in the back of the PCB). This could be an issue for large CPU coolers or perhaps for tight cases in a Crossfire configuration.


I have some of these Enzotech sinks...had one epoxied to the back of my GTX 480 for a while. They work pretty well, but add a lot of weight, being solid forged copper and all.

It should fit fine, at least width wise; length might present some issues, but I think they come in 66mm and that should be small enough. If not, there is always the saw.


----------



## sugarhell

Quote:


> Originally Posted by *Ganf*
> 
> Don't think anyone on OCN is interested in benching until the voltage is unlocked.
> 
> Would've been nice if MSI had had their junk together and gotten that patched in before release.
> And did nothing to find the cause. Where does the fault lie, the card or their setup? We don't know, because half of the other reviews are borked too so we have no baseline to compare this to. Meanwhile half of the other reviews report 0 problems in GTA V.
> 
> Where is the discrepancy?


If I had a Fury I could easily unlock the voltages with Afterburner commands. It seems that it uses an IR voltage controller.


----------



## CasualCat

Well, checking a couple of reviews, the settings mainly seem to be different. Some are using presets, some are using AA, some not, etc. In this test, PCPer seems to use more aggressive settings than at least a couple of the other sites, but I haven't checked them all.

For example @4k:
hardware canucks (no AA)
techpowerup (no AA)
Tom's (did at least use 2xAA, would have to compare the other settings)

edit: PCPer appears to list all the settings they use, so it'd be pretty easy for a member here or other reviewer to match them and see if they see similar behavior.


----------



## criminal

Still believe it to be a driver issue. I am going to give AMD a little benefit of the doubt here and say the card will be much better soon. They have a little less than three weeks before they release the Fury on July 14th. They should have a much better driver by then. Fingers crossed and hopes high.


----------



## Slaughterem

Quote:


> Originally Posted by *RagingCain*
> 
> This line right here:
> *Consumer side GPU's,of which a SIZEABLE portion are used in workstation loads,*
> 
> These are the kinds of numbers I am asking for. Where are your numbers on this?
> 
> *which mean gaming benchmark results mean little.*
> 
> And this line, you are telling me and everyone else, that game benchmarks don't matter?
> That a gaming GPU, designed, marketed, and sold as a gaming GPU, doesn't have to perform well in game benchmarks?


You will see that the only option for consumer cards at 10-bit color depth will be AMD.


----------



## obababoy

Quote:


> Originally Posted by *Sashimi*
> 
> Do expand, are you for real?


I did mention the word apparently but here you go:

"DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed." (http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html)
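The "each GPU works on a portion of the screen" part of SFR is easy to picture: the frame is cut into one region per GPU. A toy sketch of that split — the even horizontal-strip layout and the 4K resolution are purely illustrative assumptions, not how any actual driver partitions work:

```python
# Toy SFR split: divide a frame into horizontal strips, one per GPU.
# Real implementations balance strip sizes by workload, not evenly.
def sfr_strips(width, height, num_gpus):
    """Return (x, y, w, h) scissor rectangles, one per GPU."""
    base = height // num_gpus
    strips = []
    y = 0
    for gpu in range(num_gpus):
        # The last GPU absorbs any leftover rows from integer division.
        h = height - y if gpu == num_gpus - 1 else base
        strips.append((0, y, width, h))
        y += h
    return strips

print(sfr_strips(3840, 2160, 2))  # two 3840x1080 strips for a 4K frame
```

Note that even in this scheme each GPU still needs access to most of the scene's assets, which is why SFR alone doesn't double usable VRAM.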


----------



## sugarhell

Quote:


> Originally Posted by *obababoy*
> 
> I did mention the word apparently but here you go:
> 
> "DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed." (http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html)


SFR is older than AFR...

Dx12 will probably use something better than SFR.


----------



## obababoy

Either way that is cool...The problem is I am not a CF or SLI kinda guy. I want 1 card to run my games at close to 60FPS with max settings. I don't want more and I don't want less.


----------



## Thoth420

Quote:


> Originally Posted by *Ganf*
> 
> Still haven't heard a decent explanation as to why on this, when the 980ti Hybrid is $100 more.
> 
> Don't say noise either, the reviews on noise are all over the place with some saying it's quieter, some louder, and some reviewers reporting that their fan profile is borked and blaming the card for it.
> Firepro's have always been derivatives of the consumer side, it's like we beta test the hardware for enterprise. The nano shows that the AIO is not necessary, and they've already arranged the ports to make the card capable of being set up in single slot configurations.
> 
> Just sayin', they've still got a chance in that arena.


In regard to sound, there was an issue with the pump which AMD claims is fixed. Also, I have seen numerous sites bench and test it improperly according to the user's manual. Just wondering if that had an impact on the variation in the sound reviews.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> Don't think anyone on OCN is interested in benching until the voltage is unlocked.
> 
> Would've been nice if MSI had had their junk together and gotten that patched in before release.
> And did nothing to find the cause. Where does the fault lie, the card or their setup? We don't know, because half of the other reviews are borked too so we have no baseline to compare this to. Meanwhile half of the other reviews report 0 problems in GTA V.
> 
> Where is the discrepancy?


Did the review say if they were using the built-in benchmark for their testing? Might have something to do with the scene they chose to test on if not.


----------



## Redwoodz

Quote:


> Originally Posted by *Slaughterem*
> 
> You will see that the only option for consumer cards at 10 bit color depth will be the AMD.


I seem to remember reading about even 16bit color support with Fiji.


----------



## Sashimi

Quote:


> Originally Posted by *obababoy*
> 
> I did mention the word apparently but here you go:
> 
> "DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed." (http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html)


That's sweet!!!! I am certainly a CF/ SLI kinda guy so this is wonderful news. If that's happening then the Fury X will definitely edge ahead for me.

Anyway time for bed. Good night guys. Keep the discussions rolling and news flashing. This is good stuff.


----------



## Slaughterem

Quote:


> Originally Posted by *Redwoodz*
> 
> I seem to remember reading about even 16bit color support with Fiji.


That is correct and even the 300 series will have this support.


----------



## ladcrooks

Quote:


> Originally Posted by *obababoy*
> 
> Either way that is cool...The problem is I am not a CF or SLI kinda guy. I want 1 card to run my games at close to 60FPS with max settings. I don't want more and I don't want less.


I am the same, but if DX12 can change that criteria for FPS then I would probably take advantage of it. But I have been a 60Hz guy for years and don't feel as though I am missing anything.


----------



## flopper

Quote:


> Originally Posted by *ladcrooks*
> 
> i am the same, but if dx12 can change that criteria for fps then i would probably take that advantage. But I have been a 60hz guy for yrs and don't feel as though i am missing anything


The main thing with DX12 and Win 10 is the removal of CPU overhead.
Gamers win, as they can use 8-core CPUs better.


----------



## Final8ty

Quote:


> Originally Posted by *Slaughterem*
> 
> You will see that the only option for consumer cards at 10 bit color depth will be the AMD.


I see 8,10 and 12 bit options on my 290s.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Final8ty*
> 
> I see 8,10 and 12 bit options on my 290s.


That's also a panel limit.


----------



## RagingCain

Quote:


> Originally Posted by *Redwoodz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> This line right here:
> *Consumer side GPU's,of which a SIZEABLE portion are used in workstation loads,*
> 
> These are the kinds of numbers I am asking for. Where are your numbers on this?
> 
> *which mean gaming benchmark results mean little.*
> 
> And this line, you are telling me and everyone else, that game benchmarks don't matter?
> That a gaming GPU, designed, marketed, and sold as a gaming GPU, doesn't have to perform well in game benchmarks?
> 
> 
> 
> I already stated earlier that it's hard to put exact numbers on it because sales are not differentiated by use. Someone buying a workstation GPU will refer to workstation (compute) benches, not gaming. Not that hard to understand.
Click to expand...

So you have no "numbers" to corroborate your opinion?
Quote:


> Let's make this a little easier for you to understand.
> 
> AMD FuryX= 8,602 GFLOPS
> 
> Nvidia 980Ti= 5,632 GFLOPS
> Nvidia TitanX= 6,144 GFLOPS


So nVidia gaming GPUs outperform AMD gaming GPUs, despite AMD having much higher TFLOPs.

So you think the superior gaming GPU should be ranked by TFLOPs? Or are you stating that AMD doesn't have to do as well in gaming because it has higher TFLOPs?


----------



## ladcrooks

Quote:


> Originally Posted by *flopper*
> 
> Main thing with dx12 and win 10 is removal of the cpu overhead.
> gamers wins as they can use 8 core cpus better


I just hope everybody is a winner; it's about time we got something for nothing. Both camps have had our money over the years. Whatever card you use, I can see that in time it will be the cost of the card that counts, not the colour.


----------



## Slaughterem

As always, when you have threads with as many pages as this one, people miss important information. Peter Nixeus is a member here on OCN and a hardware rep. He provided some very important information, but as always it was buried in the thread. Read his posts starting here http://www.overclock.net/t/1560625/amd-a-new-era-of-pc-gaming-livestream-thread/1260_30#post_24053237


----------



## sugarhell

Quote:


> Originally Posted by *RagingCain*
> 
> So you have no "numbers" to corroborate your opinion?
> 
> So nVidia gaming GPUs outperform AMD gaming GPUs, despite AMD having much higher TFLOPs.
> 
> So you think the superior gaming GPU should be ranked by TFLOPs? Or are you stating that AMD doesn't have to do as well in gaming because it has higher TFLOPs?


Ofc it has more TFLOPS; the shader array of Fury is insane. I will quote something from TechReport:
Quote:


> Koduri answered by stating my question another way: why didn't AMD build a bigger engine? That's an astute way to view things, because a bigger GPU engine would have taken fuller advantage of HBM's considerable bandwidth.
> 
> The reason why Fiji isn't any larger, he said, is that AMD was up against a size limitation: the interposer that sits beneath the GPU and the DRAM stacks is fabricated just like a chip, and as a result, the interposer can only be as large as the reticle used in the photolithography process. (Larger interposers might be possible with multiple exposures, but they'd likely not be cost-effective.) In an HBM solution, the GPU has to be small enough to allow space on the interposer for the HBM stacks. Koduri explained that Fiji is very close to its maximum possible size, within something like four square millimeters.


----------



## ondoy

pls. remove if it's already posted...


----------



## MapRef41N93W

Quote:


> Originally Posted by *Final8ty*
> 
> I see 8,10 and 12 bit options on my 290s.


You may be able to enable these in the CCC but that doesn't mean your card actually supports 10 bit. Plug it into a real 10 bit panel (not 8bit + dithering) and you will see banding. 10 bit support is locked to Quadro + Firepro.


----------



## obababoy

Quote:


> Originally Posted by *Slaughterem*
> 
> As always, when you have threads with as many pages as this one, people miss important information. Peter Nixeus is a member here on OCN and a hardware rep. He provided some very important information, but as always it was buried in the thread. Read his posts starting here http://www.overclock.net/t/1560625/amd-a-new-era-of-pc-gaming-livestream-thread/1260_30#post_24053237


The problem is 15.15 is already out which is the Fury X "performance" update. Almost everyone has tested it and most of the benchmarks shown in the reviews reflect no major improvements.


----------



## harney

Quote:


> Originally Posted by *RagingCain*
> 
> So nVidia gaming GPUs outperform AMD gaming GPUs, despite AMD having much higher TFLOPs.
> 
> So you think the superior gaming GPU should be ranked by TFLOPs? Or are you stating that AMD doesn't have to do as well in gaming because it has higher TFLOPs?


This... I would like to know how come the Fury has much more TFLOPS yet is slower than a card with fewer TFLOPS.


----------



## Blackops_2

Quote:


> Originally Posted by *ondoy*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> pls. remove if it's already posted...


Thought that was a pretty good review. Though i didn't notice the pump noise like i noticed the coil whine. Maybe i'm just used to my 240L? Idk.

Any estimates on when they will actually unlock voltage?


----------



## harney

Quote:


> Originally Posted by *Slaughterem*
> 
> As always, when you have threads with as many pages as this one, people miss important information. Peter Nixeus is a member here on OCN and a hardware rep. He provided some very important information, but as always it was buried in the thread. Read his posts starting here http://www.overclock.net/t/1560625/amd-a-new-era-of-pc-gaming-livestream-thread/1260_30#post_24053237


thx


----------



## MapRef41N93W

Quote:


> Originally Posted by *harney*
> 
> This... I would like to know how come the Fury has much more TFLOPS yet is slower than a card with fewer TFLOPS.


You mean like how the 290X has almost as many TFLOPS as a Titan X? Maybe because flops haven't been representative of PC performance in... ages?


----------



## Slaughterem

Quote:


> Originally Posted by *MapRef41N93W*
> 
> You may be able to enable these in the CCC but that doesn't mean your card actually supports 10 bit. Plug it into a real 10 bit panel (not 8bit + dithering) and you will see banding. 10 bit support is locked to Quadro + Firepro.


Again, someone who does not read information that is readily available. Read the posts by Peter N in the link I provided.


----------



## Slaughterem

Quote:


> Originally Posted by *obababoy*
> 
> The problem is 15.15 is already out which is the Fury X "performance" update. Almost everyone has tested it and most of the benchmarks shown in the reviews reflect no major improvements.


The link is not about the drivers; it talks about support for HDR monitors, which require 10-, 12-, and 16-bit color depth.


----------



## criminal

Quote:


> Originally Posted by *ondoy*
> 
> 
> 
> 
> 
> pls. remove if it's already posted...


That noise the card makes would drive me insane. Unacceptable in my opinion.


----------



## 2002dunx

Quote:


> Originally Posted by *harney*
> 
> This... I would like to know how come the Fury has much more TFLOPS yet is slower than a card with fewer TFLOPS.


Simply.....

4096 > 3072

dunx

P.S. I'm simply glad to not pour excess cash into Nvidia's pockets..... sadly I have no need for a compute-free GPU. Add R9 280X number four then......


----------



## Blameless

Quote:


> Originally Posted by *obababoy*
> 
> I did mention the word apparently but here you go:
> 
> "DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed." (http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html)


Quite a bit wrong with that article.

SFR is not new (it's actually the oldest multi-GPU rendering method), and while the author's explanation of SFR is generally correct, it's not limited to DX12 and does not provide the benefits claimed.

SFR, in and of itself, can only reduce the frame buffer size. Each GPU still needs a nearly complete mirror of a game's assets.
Quote:


> Originally Posted by *sugarhell*
> 
> Dx12 will probably use something better than SFR.


DX12 is supposedly able to divide rendering work and asset use with a lot more flexibility than simple SFR.

I'd have to research the details again to be more specific, but DX12 won't just divide up frames into different images to be rendered and composited; it will divide up whole categories of rendering tasks for the same image... or it will at least allow this to be done. The final implementation, and thus exactly which resources need to be mirrored or not, will depend on developers on a per-title basis.
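As a purely illustrative toy (the pass names and round-robin assignment below are made up for the sketch, not anything from the DX12 API), the difference between handing out whole frames and handing out categories of work might be pictured like this:

```python
# Toy contrast between AFR-style frame splitting and task-level splitting.
# Pass names and the round-robin policy are illustrative assumptions only.
RENDER_PASSES = ["shadow maps", "geometry", "lighting", "post-processing"]

def split_by_frame(num_gpus, frame_index):
    # AFR: one whole frame per GPU, alternating frame by frame.
    return {f"GPU{frame_index % num_gpus}": RENDER_PASSES}

def split_by_task(num_gpus):
    # DX12-style explicit multi-adapter: whole categories of work
    # can be handed to different GPUs within the SAME frame.
    assignment = {f"GPU{i}": [] for i in range(num_gpus)}
    for i, render_pass in enumerate(RENDER_PASSES):
        assignment[f"GPU{i % num_gpus}"].append(render_pass)
    return assignment

print(split_by_task(2))
```

In the task-split case, only the GPU doing a given pass needs that pass's resources resident, which is where the "less mirroring" hope comes from.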
Quote:


> Originally Posted by *harney*
> 
> This... I would like to know how come the Fury has much more TFLOPS yet is slower than a card with fewer TFLOPS.


FP32 performance doesn't always translate into game performance. All the shader power in the world won't do anything if you are bottlenecked elsewhere.

The Fury may have ~40% more shader and texturing power, but it has ~60% lower pixel fill rate, and these are just peak theoretical figures anyway. There is a lot more to GPU performance and efficiency than simply comparing clock speeds and functional units, especially when the architectures involved are different.


----------



## WorldExclusive

Quote:


> Originally Posted by *flash2021*
> 
> someone come get me when there are Win10 / DX12 benches. In about a month (for win10 rite?), we'll see the Fury X's true colors...whether good or bad (and I'm assuming AMD will have good DX12 drivers ready for day1)


Today is fact and tomorrow is a dream.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Slaughterem*
> 
> Again someone who does not read information that is readily available read the posts by Peter N in the link I provided.


I see. If true, I stand corrected. This sounds to me like AMD is giving up on FirePro, then, if they're willing to remove one of its main features.


----------



## toncij

FuryX shows
Quote:


> Originally Posted by *rdr09*
> 
> here was the reference . . .


How convenient that the reference 980Ti all of a sudden got such a huge boost when it needs to be faster than the Fury X...


----------



## Slaughterem

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I see. If true, I stand corrected. This sounds to me like AMD is giving up on Firepro then if willing to remove one of its main features.


That is quite an assumption on your part. IMO the fact is that HBM is the future of graphics cards, and they need to prove at the consumer level that this technology can be used at a professional level. One has to look at the big picture of graphics, not just gaming.


----------



## CasualCat

Quote:


> Originally Posted by *sugarhell*
> 
> Ofc it has more TFLOPS; the shader array of Fury is insane. I will quote something from TechReport:
> Quote:
> 
> 
> 
> Koduri answered by stating my question another way: why didn't AMD build a bigger engine? That's an astute way to view things, because a bigger GPU engine would have taken fuller advantage of HBM's considerable bandwidth.
> 
> The reason why Fiji isn't any larger, he said, is that AMD was up against a size limitation: the interposer that sits beneath the GPU and the DRAM stacks is fabricated just like a chip, and as a result, the interposer can only be as large as the reticle used in the photolithography process. (Larger interposers might be possible with multiple exposures, but they'd likely not be cost-effective.) In an HBM solution, the GPU has to be small enough to allow space on the interposer for the HBM stacks. Koduri explained that Fiji is very close to its maximum possible size, within something like four square millimeters.
Click to expand...

Makes me wonder if the foundries' failure to deliver a node shrink came into play here, and/or if we'll see larger interposers in the future (though maybe that won't be an issue when they go to 3D vs 2.5D HBM). What would a 20nm Fiji have been like?

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I see. If true, I stand corrected. This sounds to me like AMD is giving up on Firepro then if willing to remove one of its main features.


Can you clarify? I'm not following how you're reaching that conclusion from the post that was linked.


----------



## Kane2207

Quote:


> Originally Posted by *toncij*
> 
> FuryX shows
> How convenient that the reference 980 Ti all of a sudden got such a huge boost when it needs to be faster than the Fury X...


Driver updates on a new arch.

There's nothing mysterious about it.


----------



## Forceman

Quote:


> Originally Posted by *toncij*
> 
> FuryX shows
> How convenient that the reference 980 Ti all of a sudden got such a huge boost when it needs to be faster than the Fury X...


They changed from using a reference card to using a SC card instead.


----------



## CasualCat

Quote:


> Originally Posted by *Forceman*
> 
> They changed from using a reference card to using a SC card instead.


Quote:


> Originally Posted by *Kane2207*
> 
> Driver updates on a new arch.
> 
> There's nothing mysterious about it.


Quote:


> Originally Posted by *toncij*
> 
> FuryX shows
> How convenient that the reference 980 Ti all of a sudden got such a huge boost when it needs to be faster than the Fury X...


It also appears that one test used 4xAA and the other didn't. If you look, all the cards are faster in that test, including the 290X. So they changed from a reference to a non-reference card and changed the benchmark settings. In other words, you can't even compare those graphs.


----------



## Slaughterem

Quote:


> Originally Posted by *Kane2207*
> 
> Driver updates on a new arch.
> 
> There's nothing mysterious about it.


Oh, so what you're saying is that since the card has been out a month and has had some time to get better drivers, it's getting better results? If that's true, then should we also wait for Fury drivers to give better results? Or wait until Windows 10 and DX12 come out to see if both cards get a performance bump?


----------



## Blameless

If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
Quote:


> Originally Posted by *CasualCat*
> 
> Can you clarify? I'm not following how you're reaching that conclusion from the post that was linked.


He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree.

Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2.
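As a side note, the unit counts in the hypothetical respin above follow directly from GCN's per-CU layout (64 shaders and 4 TMUs per Compute Unit); a quick sanity check in Python:

```python
# GCN per-CU layout: each Compute Unit has 4 SIMD-16 units (64 shaders)
# and 4 texture units (TMUs).
SHADERS_PER_CU = 64
TMUS_PER_CU = 4

def gcn_counts(cus):
    """Return (shader count, TMU count) for a GCN part with `cus` CUs."""
    return cus * SHADERS_PER_CU, cus * TMUS_PER_CU

print(gcn_counts(64))  # full Fiji: (4096, 256)
print(gcn_counts(60))  # hypothetical 60-CU respin: (3840, 240)
```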


----------



## sugarhell

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
> He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree.
> 
> Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2.


If the ROPs are the bottleneck.


----------



## Slaughterem

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
> He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree.
> 
> *Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2*.


If you were planning on using HBM 2 on your professional graphics cards, do you think it would be a good idea to see how HBM 1 works at higher color depths so that you can implement a better future solution?


----------



## CasualCat

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
> 
> He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree.
> 
> Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2.


Thanks.

Well, it sounded like the larger interposer was at least partially a cost issue. I remember a GF presentation linked here a while back in which large interposers had a yield issue, but in either case, given the premium paid for professional cards anyway, I wonder whether they could feasibly do larger interposers with more memory even with those limitations.

In other words, maybe they don't make sense in the <$1k consumer market space but in the $2k+ pro space they become feasible.


----------



## Casey Ryback

Quote:


> Originally Posted by *Slaughterem*
> 
> Oh, so what you're saying is that since the card has been out a month and has had some time to get better drivers, it's getting better results? If that's true, then should we also wait for Fury drivers to give better results? Or wait until Windows 10 and DX12 come out to see if both cards get a performance bump?


Only nvidia cards improve with driver updates








Quote:


> Originally Posted by *Kane2207*
> 
> Wait for DX12?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> now supposedly waiting for drivers, previously waiting 2 years for FreeSync, waiting for a frame timing fix..... it just goes on and on and on.


Nothing wrong with assuming better drivers will be coming, has happened on all of AMD's last cards, and even nvidia cards according to you.

Freesync works.

Frame timing on AMD cards is as good or better than rival nvidia cards.

Please stop living in the past.


----------



## Blameless

Quote:


> Originally Posted by *obababoy*
> 
> The problem is 15.15 is already out which is the Fury X "performance" update. Almost everyone has tested it and most of the benchmarks shown in the reviews reflect no major improvements.


15.15 is the first driver with full Fury support and Fury X drivers will be fairly immature for a while.
Quote:


> Originally Posted by *sugarhell*
> 
> If the ROPs are the bottleneck.


That's why I want a test sample; to find out for certain. Not that I expect them to build one; of course.

Regardless, I cannot help but think that keeping FP64 hardware in consumer parts is a serious mistake.
Quote:


> Originally Posted by *CasualCat*
> 
> Well it sounded like the larger interposer was at least partially a cost issue. I remember there being a GF presentation linked here a while back in which large interposers had a yield issue, but in either case given the premium paid for professional cards anyhow, I wonder even given those limitation if they could feasibly do larger interposers with more memory.
> 
> In other words, maybe they don't make sense in the <$1k consumer market space but in the $2k+ pro space they become feasible.


This is possible.


----------



## sugarhell

Quote:


> Originally Posted by *Blameless*
> 
> 15.15 is the first driver with full Fury support and Fury X drivers will be fairly immature for a while.
> That's why I want a test sample; to find out for certain. Not that I expect them to build one; of course.
> 
> Regardless, I cannot help but think that keeping FP64 hardware in consumer parts is a serious mistake.
> This is possible.


If I remember correctly, you can't really strip away the FP64 part of the GCN architecture. It's an engine with compute as the main aim.

I think it's not a ROP problem, at least not the bottleneck at 1080p. I blame AMD's driver overhead on the DX11 driver thread more.

I'd like a test sample like this too, but I don't think you can add 32 more ROPs without a big change to the die, which means more R&D for AMD. Meh.


----------



## decimator

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.


I brought this up a while ago.
Quote:


> Originally Posted by *decimator*
> 
> So what are the chances that we see a Fury X revision? There are still 12 months until the next node and Fury X's flaws are already apparent.
> 
> What would make sense to me is a slight decrease in the number of SP's (4096 just seems a bit absurd) to make more room on the die for more ROP's (maybe 96 or ideally 128). I'm not sure exactly how feasible this is, but I think it would behoove AMD to at the very least look into this...


Hopefully it comes to fruition somehow...


----------



## toncij

Quote:


> Originally Posted by *Forceman*
> 
> They changed from using a reference card to using a SC card instead.


Yes and that is annoying.


----------



## Slaughterem

Quote:


> Originally Posted by *Blameless*
> 
> *15.15 is the first driver with full Fury support and Fury X drivers will be fairly immature for a while*.
> That's why I want a test sample; to find out for certain. Not that I expect them to build one; of course.
> 
> Regardless, I cannot help but think that keeping FP64 hardware in consumer parts is a serious mistake.
> This is possible.


I would not expect an updated driver until Win 10 is officially released on July 29.


----------



## tajoh111

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
> He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree.
> 
> Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2.


From what I heard, they tossed out double precision already. Like Tonga, it's cut down to 1/16.

AMD doesn't have a few hundred million to spend. Considering they spent $223 million last quarter on R&D for the entire company, mostly on Zen, it doesn't make sense to spend $300 million on a product that will generate so little money for them.

AMD has been making less than $150 million a quarter from its graphics division for the last year or so. Because of its high price, this high-end card was never going to make them much money; the volume wasn't going to be there.

Add in the high cost of making these chips and, generally, AMD could never recoup the investment, because it would take a minimum of nine months to get all these changes done and have Fiji manufactured. And by then 16nm would be looming.


----------



## Slaughterem

Quote:


> Originally Posted by *Blameless*
> 
> If AMD would spend a few hundred million and respin me a hypothetical test Fury X r1.5 where they throw out all FP64 support, knock the CU count down to 60 (meaning 240 TMUs and 3840 shaders), then use the transistors freed up to increase ROP count to 96, and expedite it to me for testing, I'd appreciate it.
> *He's saying that 10-bit per color channel and higher modes have traditionally been limited to professional cards and that allowing them on the consumer parts cannibalizes FirePro to some degree*.
> 
> Anyway, a bigger problem for a Fiji FirePro is the memory capacity. If they are unwilling or unable to build larger interposers, that essentially prevents any Fiji based FirePro until HBM2.


One of us has a reading comprehension problem: did he say it was traditionally limited to professional cards, or did he say 10-bit support is locked to Quadro and FirePro?


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> Frame timing on AMD cards is as good or better than rival nvidia cards.
> 
> Please stop living in the past.


Frametimes suck on AMD..


----------



## mltms

Interesting review of how AA impacts performance at 4K:

http://www.tweakpc.de/hardware/tests/grafikkarten/radeon_r9_fury_x_vs_geforce_gtx_980_ti_benchmarks/s02.php


----------



## ZealotKi11er

Quote:


> Originally Posted by *mltms*
> 
> Interesting review of how AA impacts performance at 4K:
> 
> http://www.tweakpc.de/hardware/tests/grafikkarten/radeon_r9_fury_x_vs_geforce_gtx_980_ti_benchmarks/s02.php


But 4GB is not enough for 4K and MSAA.
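For a rough sense of why 4K plus MSAA worries people on a 4GB card, here's a back-of-envelope estimate of raw render-target size (a simplification: it ignores framebuffer compression, resolve buffers, and all texture/asset memory):

```python
def rt_bytes(width, height, msaa_samples, bytes_color=4, bytes_depth=4):
    """Raw color+depth render-target size, in bytes, for one MSAA framebuffer."""
    return width * height * msaa_samples * (bytes_color + bytes_depth)

mib = rt_bytes(3840, 2160, 8) / 2**20
print(f"4K + 8xMSAA color+depth: ~{mib:.0f} MiB")  # ~506 MiB
```

Half a gigabyte for a single 8xMSAA framebuffer, before any textures, is a sizeable chunk of a 4GB card.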


----------



## Kane2207

Quote:


> Originally Posted by *Slaughterem*
> 
> Oh so what your saying is that since the card has been out a month and has had some time to get better drivers it is getting better results? If this is true then should we also wait for Fury drivers to give better results? Or wait until Win 10 and DX12 comes out to see if both cards get a performance bump?


Whilst I have no doubt that AMD will tighten things up a bit with drivers over the next few months, the magical performance increase you're trying to spin my words as will not appear.

Maxwell - new arch, plenty to optimise.
GCN revision - old arch, relatively little left to optimise for.

When AMD went from the 6000 series to 7000 they benefited from a ton of optimisation. They are not going to see that this time.

Please don't attempt to put words in my mouth to further your own agenda.


----------



## Kane2207

Quote:


> Originally Posted by *Casey Ryback*
> 
> Only nvidia cards improve with driver updates
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing wrong with assuming better drivers will be coming, has happened on all of AMD's last cards, and even nvidia cards according to you.
> 
> Freesync works.
> 
> Frame timing on AMD cards is as good or better than rival nvidia cards.
> 
> Please stop living in the past.


See my post above re: driver optimisation on a new architecture.

Freesync kind of works after waiting nearly two years for it, hardly a home run in its current state.

Frame pacing is fine unless you want to play any of the hundreds of DX9 games in existence, and that's never going to be fixed.

I'll stop living in the past when you get up to speed with the present.


----------



## Blameless

Quote:


> Originally Posted by *sugarhell*
> 
> If i remember you cant really strip away the fp64 part of the gcn architecture. Its an engine with compute as the main aim.


I've been looking over the GCN white paper (https://www.amd.com/Documents/GCN_Architecture_whitepaper.pdf) and there may be something to this. FP64 capability does seem to exist by combining adjacent lower precision registers and pipeline lanes, so there is no specific FP64 pipeline or larger than optimal registers that can be stripped away.

Still, FP64 is not synonymous with compute and I still wonder if there is any savings to be had by removing the capability to fuse lower precision components. Some of the execution units do seem like they may be wider than absolutely necessary.

Regardless, you now have me thinking that removing FP64 wouldn't be nearly as simple or beneficial as it was for NVIDIA's Maxwell.


----------



## GorillaSceptre

Lots of people are complaining about insane coil whine and pump noise..


----------



## AmericanLoco

That's what I really don't get about the Fury. In situations where the 4GB of VRAM should be crippling it, it starts to pull ahead of the 980 Ti, sometimes even matching the Titan X. Some people suggest the 64 ROPs are holding it back, but wouldn't that cause even more bottlenecking when you're pushing tons of pixels (like at 4K)?

Maybe it's some other kind of internal bottleneck, or the drivers really are just that atrocious?
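For what it's worth, raw ROP throughput is nowhere near the limit for simply writing a 4K frame; a quick estimate (assuming 64 ROPs at the Fury X's ~1050 MHz, one write per pixel, no blending or MSAA):

```python
rops, clock_ghz = 64, 1.05                # assumed Fury X ROP count and clock
fillrate_gpix = rops * clock_ghz          # peak pixel fill, Gpixels/s
demand_4k60 = 3840 * 2160 * 60 / 1e9      # pixels/s for one write per pixel at 4K/60

print(f"peak fill: {fillrate_gpix:.1f} Gpix/s")             # 67.2
print(f"4K@60, one write/pixel: {demand_4k60:.2f} Gpix/s")  # 0.50
print(f"headroom: ~{fillrate_gpix / demand_4k60:.0f}x")     # ~135x
```

Overdraw, blending, and MSAA multiply the ROP work per displayed pixel, which is why a ROP limit would tend to show up under high frame rates and heavy blending rather than from resolution alone.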


----------



## Forceman

Quote:


> Originally Posted by *mltms*
> 
> Interesting review of how AA impacts performance at 4K:
> 
> http://www.tweakpc.de/hardware/tests/grafikkarten/radeon_r9_fury_x_vs_geforce_gtx_980_ti_benchmarks/s02.php


Shouldn't something be 100% on those charts? How is the highest 97%?


----------



## Tivan

Quote:


> Originally Posted by *hamzta09*
> 
> Frametimes suck on AMD..


I'm getting 2-6ms frametimes in Path of Exile (except when caching assets), so how do they suck? :c

Blanket statements ahoi~

On that note, I wonder how this game would behave with HBM.
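For anyone converting in their head, frametime and frame rate are just reciprocals:

```python
def fps(frametime_ms):
    """Frames per second for a given per-frame time in milliseconds."""
    return 1000.0 / frametime_ms

# 2-6 ms frametimes correspond to roughly 167-500 fps:
print(fps(6.0))
print(fps(2.0))
```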


----------



## mltms

Quote:


> Originally Posted by *ZealotKi11er*
> 
> But 4GB is not enough for 4K and MSAA.


As you can see, at 4K with 8xMSAA the 6GB didn't help the 980 Ti.


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Lots of people are complaining about insane coil whine and pump noise..


Source? I didn't know that enough have been delivered to qualify as "lots of people" yet.


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> Shouldn't something be 100% on those charts? How is the highest 97%?


The only thing that's 100% in this thread is my manly beard. Extrapolate from there.


----------



## hamzta09

Quote:


> Originally Posted by *Tivan*
> 
> I'm getting 2-6ms frametimes in Path of Exile (except when caching assets), so how do they suck? :c
> 
> Blanket statements ahoi~
> 
> On that note, I wonder how this game would behave with HBM.


----------



## sugarhell

Quote:


> Originally Posted by *Blameless*
> 
> I've been looking over the GCN white paper (https://www.amd.com/Documents/GCN_Architecture_whitepaper.pdf) and there may be something to this. FP64 capability does seem to exist by combining adjacent lower precision registers and pipeline lanes, so there is no specific FP64 pipeline or larger than optimal registers that can be stripped away.
> 
> Still, FP64 is not synonymous with compute and I still wonder if there is any savings to be had by removing the capability to fuse lower precision components. Some of the execution units do seem like they may be wider than absolutely necessary.
> 
> Regardless, you now have me thinking that removing FP64 wouldn't be nearly as simple or beneficial as it was for NVIDIA's Maxwell.


Yeah, they can do either FP32 or FP64, and it's locked through the drivers for marketing reasons. FP64 is kind of tied to niche compute, though. Anyone have the DP FLOPS for Fury? Is it locked at 1/16 like Tonga?


----------



## Final8ty

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's also panel limit.


I know but the point was what does the card support.


----------



## Final8ty

Quote:


> Originally Posted by *AMDMatt;28235270*
> Missed this earlier. Hold on a minute guys, he was counting the overall score not the graphics score. If you're working out CrossFire/QuadFire scaling in Firestrike go by Graphics Score. If you work it out, the scaling is good for immature drivers using a new technology. It will only get better, stay tuned!


Looks like they have some worth while improvements in the pipeline already.
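Working out scaling from the Graphics Score as AMDMatt describes is simple division; the scores below are made-up placeholders just to show the method, not real Fury X results:

```python
def scaling_efficiency(multi_gpu_graphics, single_gpu_graphics, gpu_count):
    """Measured speedup divided by the ideal gpu_count-times speedup."""
    return (multi_gpu_graphics / single_gpu_graphics) / gpu_count

# e.g. a hypothetical QuadFire Graphics Score of 60000 vs 17000 for one card:
print(f"{scaling_efficiency(60000, 17000, 4):.0%}")  # 88%
```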


----------



## Joystick0481

Quote:


> Originally Posted by *toncij*
> 
> Yes and that is annoying.


Is this even that relevant? It's clearly marked as such, and anyone with some interest in these cards would know those numbers are not indicative of reference 980 Ti performance.
More importantly, for most people who buy high-end cards, overclockability is a significant factor. Hence, they will either buy a non-reference card at a marginally higher price ($20-$40) or buy a reference card and build their own water-cooling solution.
Consumers who don't want to deal with their own cooling solution are probably shopping for a non-reference card anyway. Given that the Fury X is strictly a reference card and the price difference is only marginal, I think it's only fair that they compare the two cards.


----------



## mltms

Quote:


> Originally Posted by *Forceman*
> 
> Shouldn't something be 100% on those charts? How is the highest 97%?


I think this is average FPS; the Fury X and the 980 Ti at the same performance at 4K would be 100%.


----------



## Blameless

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah they can do either fp32 or fp64 and its locked through the drivers for marketing reasons. Fp64 is kinda tied with niche compute tho. Anyone have the dp flops for fury? Its locked at 1/16 like tonga?


Fury is 1/4 FP64. So around 2.2 TFLOPS.

Looks like that's wrong.


----------



## sugalumps

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *hamzta09*






Yup, linked those earlier, and BF4's frametimes are just as bad for AMD according to that site. Another point for the equally priced 980 Ti.


----------



## Slaughterem

Quote:


> Originally Posted by *Kane2207*
> 
> Whilst I have no doubt that AMD will tighten things up a bit with drivers over the next few months, the magical performance increase you're trying to spin my words as will not appear.
> 
> Maxwell - new arch, plenty to optimise.
> GCN revision - old arch, relatively little left to optimise for.
> 
> When AMD went from the 6000 series to 7000 they benefited from a ton of optimisation. They are not going to see that this time.
> 
> Please don't attempt to put words in my mouth to further your own agenda.


I asked a few questions; you implied that I was spinning your words and have an agenda. My agenda, FYI, is to gain knowledge from the many people on this site who are more knowledgeable than me. Your response shows a lack of open-mindedness and that your opinions are biased. From this point on, I will not entertain the validity of your comments without taking your bias regarding AMD into account.


----------



## sugarhell

Quote:


> Originally Posted by *Blameless*
> 
> Fury is 1/4 FP64.


Are you sure? I don't remember any reviewer mentioning this.


----------



## Blameless

Quote:


> Originally Posted by *sugarhell*
> 
> Are you sure? I dont remember any reviewer to mention this


Hmm, looks like I was wrong.

According to the few reviews I just looked up, it's 1/16 FP64, or ~500 GFLOPS.

Must have been recalling a rumor.
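Both figures from this exchange fall out of the same FP32 arithmetic (4096 shaders, 2 FLOPs per clock via FMA, ~1050 MHz, as reported in reviews):

```python
shaders, clock_ghz = 4096, 1.05
fp32_gflops = shaders * 2 * clock_ghz   # FMA counts as 2 FLOPs per clock

print(f"FP32: {fp32_gflops:.1f} GFLOPS")     # 8601.6 (~8.6 TFLOPS)
print(f"1/4 rate: {fp32_gflops / 4:.1f}")    # 2150.4 -> the "~2.2 TFLOPS" figure
print(f"1/16 rate: {fp32_gflops / 16:.1f}")  # 537.6  -> the "~500 GFLOPS" figure
```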


----------



## Alatar

I guess we'll just have to see if Fiji ever gets released with higher-density HBM as a FirePro. Hard to say much about how well it does with DP since the retail cards are 1/16.

Fiji might also make an appearance on 16nm with more memory. Wouldn't surprise me at all to see it shrunk and sold as a second-tier card as well as a FirePro.


----------



## ep45-ds3l

Just saw this..

http://www.eteknix.com/amd-fury-x-quadfire-results/


----------



## ANN1H1L1ST

Quote:


> Originally Posted by *ladcrooks*
> 
> Whatever way all you lot look at it, Amd , 1st attempt and being the 1st with a different architecture from scratch is a winner in my eyes - bored with all the poooers
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn for not having 8gig - all you greeny's would be like rats jumping a sinking boat, and same if the shoe was on the other foot


This is what's funny about AMD consumers. They always say "If AMD did this" or "if AMD did that," Nvidia fans would be jumping ship. The reality is AMD can't deliver. They are all hype with stupid slogans: "Green with envy," "Never settle," etc.

They might as well lower the price on the Fury X now so I can hear AMD people toot the price/performance horn they always revert to, but fail to realize that most people who want the best and are spending $550+ on a GPU couldn't care less about price/performance.


----------



## rv8000

Quote:


> Originally Posted by *hamzta09*
> 
> 
> 
> Spoiler: Warning: Spoiler!


I'll leave these here, people need to stop cherry picking in their favor. Multiple sites show widely different results for EVERYTHING, I honestly don't trust a single review site to paint the picture properly...




Less than half the peak and average in both games?


----------



## tconroy135

Quote:


> Originally Posted by *ep45-ds3l*
> 
> Just saw this..
> 
> http://www.eteknix.com/amd-fury-x-quadfire-results/


Let's put it up against an application that needs more than 4GB of memory stored at a specific instance...


----------



## Kane2207

Quote:


> Originally Posted by *Slaughterem*
> 
> I asked a few questions, you implied that I was spinning your words and have an agenda. My agenda FYI is to gain knowledge from the many people that are more knowledgeable than me who post comments on this site. Your response shows that you lack being open minded and that your opinions are biased. From this point on I will not entertain the validity of your comments without taking into account your bias towards AMD.


OK, so I re-read your posts and if it was a genuine question, then I do apologise for my tone.

The question got my back up because there is a contingent that state we should wait for Windows 10/DX12 which after the lengthy wait for numerous other AMD products including the release of this card, is unacceptable for me personally.

Whilst I think it's an alright product, it isn't near many peoples expectations based on the marketing and hype leading up to the release.

Now, while I do think AMD will improve things over the next few months through drivers, I do not believe they will be the miraculous cure-all that is being bandied about.

There's certainly room for improvement but some things could be a concern that drivers just won't cure.


----------



## hawker-gb

Quote:


> Originally Posted by *ANN1H1L1ST*
> 
> This is what's funny about AMD consumers. They always say "If AMD did this" or "if AMD did that," Nvidia fans would be jumping ship. The reality is AMD can't deliver. They are all hype with stupid slogans: "Green with envy," "Never settle," etc.
> 
> They might as well lower the price on the Fury X now so I can hear AMD people toot the price/performance horn they always revert to, but fail to realize that most people who want the best and are spending $550+ on a GPU couldn't care less about price/performance.


This post is in serious conflict with reality.









Peace.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> The only thing that's 100% in this thread is my manly beard. Extrapolate from there.


Finally an informative post. That also made me laugh.
Quote:


> Originally Posted by *rv8000*
> 
> I'll leave these here, people need to stop cherry picking in their favor. Multiple sites show widely different results for EVERYTHING, I honestly don't trust a single review site to paint the picture properly...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Less than half the peak and average in both games?


Makes you wonder what's going on, if it's just variation in test setup/settings/method or if there is some other issue.
Quote:


> Originally Posted by *tconroy135*
> 
> Let's put it up against an application that needs more than 4GB of memory stored at a specific instance...


Has anyone found one yet?


----------



## CrazyElf

Quote:


> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CrazyElf*
> 
> These are DirectFET MosFET and there are 6 of them. IR6811 and IR6894 DirectFET
> 
> 
> 
> I think these are the datasheets:
> http://www.irf.com/product-info/datasheets/data/irf6811spbf.pdf
> http://www.irf.com/product-info/datasheets/data/irf6894mpbf.pdf
> 
> 
> 
> Thanks.
> 
> Maximum operating temperature is 150C, but they lose a lot of current capacity as temperature rises.
> 
> I have some of these Enzotech sinks...had one epoxied to the back of my GTX 480 for a while. They work pretty well, but add a lot of weight, being solid forged copper and all.
> 
> It should fit fine, at least width wise; length might present some issues, but I think they come in 66mm and that should be small enough. If not, there is always the saw.

Actually that got me thinking. Maximum operating temperature might be closer than expected. If they are taking measurements from behind the PCB, what is the actual Mosfet temperature? It could be as much as 20C higher.

As far as the mass of the forged copper heatsinks on the rear, another option is to use aluminium heatsinks, which will have a lower mass and work almost as well.

There are other issues. If the VRM is running this hot even without an overclock, how bad could it get with one? A lot of people seem to expect that much higher overclocks would be possible if the voltage were unlocked; instead, we could find the card bottlenecked by VRM temperature. The problem is compounded by the fact that no custom PCBs will be available.

If the card is only capable of 1/16 FP64, then the issue might be that GCN 1.2 simply hasn't made the efficiency jumps that Nvidia made from SMX to SMM, even with the HBM power savings taken into account. Of course, 1/32 FP64 would improve gaming efficiency more, and I think it's something AMD should consider. The low VRAM means this card is not that good for compute anyway.

To be honest, this whole card is a poor value right now. It needs to be $550 USD or less to merit serious consideration.

Quote:


> Originally Posted by *Kane2207*
> 
> Whilst I have no doubt that AMD will tighten things up a bit with drivers over the next few months, the magical performance increase you're trying to spin my words as will not appear.
> 
> Maxwell - new arch, plenty to optimise.
> GCN revision - old arch, relatively little left to optimise for.
> 
> When AMD went from the 6000 series to 7000 they benefited from a ton of optimisation. They are not going to see that this time.
> 
> Please don't attempt to put words in my mouth to further your own agenda.


I am skeptical that Maxwell has a lot of room for optimisation. It has been around since the 750Ti, and that has been around for close to a year and a half.

Neither architecture is new; GCN has been around for quite a while as well. I'd expect major room for optimisation if AMD transitions past GCN to something else. That was why the 7970 was able to gain so much: the change from VLIW to GCN.

I think that the 16nm architecture will see some pretty big gains though. It is the classic buy now or wait conundrum.

Sent from my SGH-T889 using Tapatalk


----------



## tconroy135

Quote:


> Originally Posted by *tconroy135*
> 
> Let's put it up against an application that needs more than 4GB of memory stored at a specific instance...


Quote:


> Originally Posted by *Forceman*
> 
> Has anyone found one yet?


I would imagine Watch Dogs might at 4k. Arkham Knight does although I'm not sure that counts.


----------



## PlugSeven

Quote:


> Originally Posted by *hamzta09*


Those are FRAPS frametimes, NOT FCAT.
Go see PCPer's FCAT frametimes.


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> Finally an informative post. That also made me laugh.
> Makes you wonder what's going on, if it's just variation in test setup/settings/method or if there is some other issue.
> Has anyone found one yet?


I wish the major review sites would use a standardized test setup: same CPU, same motherboard, same memory, same SSD, and so on. If we just had 4 or 5 sites with consistently similar results, we could finally have an accurate measurement for all metrics of a GPU. Until then we get a pile of conflicting numbers, leading to more and more flameboi wars, which in turn makes it IMPOSSIBLE to get useful information out of threads.


----------



## curlyp

Quote:


> Originally Posted by *criminal*
> 
> That noise the card makes would drive me insane. Unacceptable in my opinion.


Quote:


> Originally Posted by *criminal*
> 
> That noise the card makes would drive me insane. Unacceptable in my opinion.


Interesting...mine is quiet. The noise you hear is my other fans and the dust screen rattling underneath.

R9 Fury X First Run

Edit: Misspelled word.


----------



## Blameless

Quote:


> Originally Posted by *Forceman*
> 
> Has anyone found one yet?


Plenty of GPGPU apps, but I don't have any _games_ I can reliably force to use more than 4GiB.


----------



## DampMonkey

Quote:


> Originally Posted by *Alatar*
> 
> All the reviews I've seen have been against those reference 980Tis.
> 
> The aftermarket ones are easily faster than the Fury X and even the Titan X.


Also, Overclock3D used an overclocked 980 Ti in their 4K tests:
http://www.overclock3d.net/reviews/gpu_displays/amd_r9_fury_x_review/21


----------



## criminal

Quote:


> Originally Posted by *hamzta09*
> 
> Frametimes suck on AMD..


I haven't seen you post anything constructive in this thread. Seems like you made your decision. Now you seem to just be trying to rub salt in the wound. Frametimes are comparable now. Quit cherry picking.
Quote:


> Originally Posted by *Kane2207*
> 
> OK, so I re-read your posts and if it was a genuine question, then I do apologise for my tone.
> 
> The question got my back up because there is a contingent that state we should wait for Windows 10/DX12 which after the lengthy wait for numerous other AMD products including the release of this card, is unacceptable for me personally.
> 
> Whilst I think it's an alright product, it isn't near many peoples expectations based on the marketing and hype leading up to the release.
> 
> Now, I do think AMD will improve things over the next few months through drivers, I do not believe they will be the miraculous cure-all that is being bandied about.
> 
> There's certainly room for improvement but some things could be a concern that drivers just won't cure.


Fiji has room for driver tweaks. Quit being so biased. Looks like you hold onto your cards for a while; that Titan is comparable to a 7970 now, and AMD seems to make good driver improvements over an extended period. Let's give Fury a few weeks before discounting anything.

Quote:


> Originally Posted by *hawker-gb*
> 
> This post is in serious conflict with reality.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Peace.


Best just to ignore that person. Giant Nvidia fanboy.
Quote:


> Originally Posted by *curlyp*
> 
> Interesting...mine is quiet. The noise you hear is my other fans and the ducat screen rattling underneath.
> 
> R9 Fury Fist Run


Good deal. Maybe it is just select units.


----------



## Asmodian

Quote:


> Originally Posted by *Ganf*
> 
> You will see that the only option for consumer cards at 10 bit color depth will be the AMD.


Nvidia supports 10, 12, and 16 bit now too. madVR can output 10-bit to compatible displays with both AMD and Nvidia consumer GPUs.

10-bit OpenGL is still for professional cards only, but Direct3D has supported >8-bit for a long time on both Nvidia and AMD consumer GPUs. However, almost nothing made use of it until very recently (the Windows desktop itself is limited to 8-bit, and nothing AMD or Nvidia can do will change that).

Edit: Recently both added options for >10-bit output, but they are simply padding Windows' 8-bit output with extra zeros unless a full-screen exclusive application that supports >8-bit modes is running.
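The practical difference between those bit depths is simply the number of shades per channel, which is quick to illustrate (pure arithmetic, not tied to any particular API):

```python
# Distinct shades per colour channel at common output bit depths.
for bits in (8, 10, 12, 16):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel")

# This is why 8-bit output can show banding: a full-range gradient wider
# than 256 pixels has to repeat at least some channel values.
```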


----------



## GorillaSceptre

Quote:


> Originally Posted by *p4inkill3r*
> 
> Source? I didn't know that enough have been delivered to qualify as "lots of people" yet.


Can't be bothered linking sources, there's plenty in this thread.

A few reviewers have mentioned it, whether that's indicative of the shipped cards i don't know. I've also seen some consumers say their cards sound fine, so maybe it was just a faulty batch?


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Can't be bothered linking sources, there's plenty in this thread.
> 
> A few reviewers have mentioned it, whether that's indicative of the shipped cards i don't know. I've also seen some consumers say their cards sound fine, so maybe it was just a faulty batch?


So it was just FUD then.

Ok


----------



## Bartouille

I never thought I would say this but I'm more excited about the 390X than the Fury X even tho it's a rebrand lol.


----------



## iLeakStuff

More people are getting high-pitched noise from the Fury X.
Guess AMD didn't fix them all before shipping to customers.
That sound would drive me crazy :/

https://www.youtube.com/watch?v=XfyQzroYnrI

https://www.youtube.com/watch?v=yLW7cZPW2fA

https://www.youtube.com/watch?v=DKoNa1OXnQA

http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1

Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


----------



## Kane2207

Quote:


> Originally Posted by *criminal*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fiji has room for driver tweaks. Quit being so bias. Looks like you hold onto your cards for a while. That Titan is comparable to a 7970 now. Seems like AMD makes good driver improvements over an extended period.


So you just chose to ignore where I clearly stated I have no doubts that AMD will make _some_ improvements and decided that's showing bias, really?









Once again, I do believe they'll improve the situation, I really do. Do I think that DX12 will magically make this card phenomenally faster than the competition? Not really. It stands to reason that any of the core benefits (not feature elements) of DX12 will be beneficial to both camps.

I already applauded AMDs improvements they've made since going to GCN from TeraScale but driver optimisations are not infinite, eventually they'll have wrung everything out of GCN they possibly can.

And as you noted, I've had my hardware for quite a while; having two kids in 4.5 years'll do that to you, other priorities take over. I guess I bought into the hype leading up to Fiji's release, expecting them to knock it out of the park like they did with the 7970 (which, as you noted, is still a strong card today), but unfortunately that doesn't appear to be the case. I've waited this long to upgrade; I suppose I may as well wait for a node shrink, unless AMD prices tank like they did 12-18 months into the 290X's life. If AMD have improved the card significantly by then, I may still purchase one. Who knows.


----------



## aDyerSituation

Quote:


> Originally Posted by *Bartouille*
> 
> I never thought I would say this but I'm more exited about the 390x than the fury x even tho it's a rebrand lol.


It's performing really well.


----------



## GorillaSceptre

Quote:


> Originally Posted by *p4inkill3r*
> 
> So it was just FUD then.
> 
> Ok


I don't own the card, and neither do you







I repeated what *reviewers* have said, and those reviews are linked in *this* thread. I'm relying on what they say before making a purchase.

So where's the FUD?

You corporate slaves are something else.


----------



## flopper

Quote:


> Originally Posted by *Kane2207*
> 
> So you just chose to ignore where I clearly stated I have no doubts that AMD will make _some_ improvements and decided that's showing bias, really?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Once again, I do believe they'll improve the situation, I really do. Do I think that DX12 will magically make this card phenomenally faster than the competition? Not really. It stands to reason that any of the core benefits (not feature elements) of DX12 will be beneficial to both camps.
> 
> I already applauded AMDs improvements they've made since going to GCN from TeraScale but driver optimisations are not infinite, eventually they'll have wrung everything out of GCN they possibly can.
> 
> And as you noted, I've had my hardware for quite a while, having two kids in 4.5 years'll do that to you, other priorities take over. I guess I bought in to the hype leading up to Fiji's release, expecting them to knock it out of the park like they did with the 7970 (which as you noted, is still a strong card today) but unfortunately that doesn't appear to be the case. I've waited that long to upgrade now I suppose I may as well wait for a node shrink, unless AMD prices tank like they did 12-18 months into the 290X. If AMD have improved the card significantly by then, then I may still purchase one. Who knows.


A 10% driver improvement over a couple of iterations and the picture changes totally for the Fury.
If a 1250MHz OC can be managed, it's a done deal, clobber time.
I am buying a Fury.


----------



## Tivan

Quote:


> Originally Posted by *Bartouille*
> 
> I never thought I would say this but I'm more exited about the 390x than the fury x even tho it's a rebrand lol.


The 390(X) seems mighty exciting indeed. Which raises the question of when we can expect _something_ that puts the Fury X's ~45% extra shaders (compared to the 390X) to good use.
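The ~45% figure checks out against the published stream processor counts (4096 on the Fury X vs 2816 on the 390X):

```python
# Stream processor counts from the respective spec sheets.
fury_x_shaders = 4096
r9_390x_shaders = 2816

extra_pct = (fury_x_shaders / r9_390x_shaders - 1) * 100
print(f"Fury X has {extra_pct:.1f}% more shaders than the 390X")
```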


----------



## GorillaSceptre

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=XfyQzroYnrI
> 
> https://www.youtube.com/watch?v=yLW7cZPW2fA
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA
> 
> http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1
> 
> Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


Don't post that in here.. The shills will say you're spreading fud
















at that one vid. "All this padding and this ***** still screams"


----------



## Orivaa

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA


That is not the retail version, as the guy in the video admitted.
There seem to have been some accidental sales going around, shipping pre-fix cards.


----------



## Dhoulmagus

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Don't post that in here.. The shills will say you're spreading fud


Thank god for EK water blocks


----------



## hamzta09

Quote:


> Originally Posted by *rv8000*
> 
> I'll leave these here, people need to stop cherry picking in their favor. Multiple sites show widely different results for EVERYTHING, I honestly don't trust a single review site to paint the picture properly...
> 
> 
> 
> 
> Less than half the peak and average in both games?


And you didn't cherry pick? I just googled "Fury X frametimes" and took the first result.


----------



## iLeakStuff

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Don't post that in here.. The shills will say you're spreading fud


They can say what they want, I don't care. It's not FUD when we get threads and users pitching in on the noise and putting out YouTube videos about it.

Then again, there are a lot of people that still believe 9/11 was the FBI or whatever and that we never landed on the moon. The Internet is a crazy place.


----------



## rv8000

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=XfyQzroYnrI
> 
> https://www.youtube.com/watch?v=yLW7cZPW2fA
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA
> 
> http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1
> 
> Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


In the first 3 videos, the noise is barely audible from more than 6" away, yet these people have their cameras half an inch from the card >_>. And in the last video the pump sits slightly above the radiator, which has been said to cause more noise.

I'm not saying there isn't a problem, or that no revisions were made, but it looks like people are getting carried away after hearing complaints from other sources. One thing is for sure: if my card makes the same noise while I'm sitting at my desk (roughly 2.5 ft away), it is instantly going back.


----------



## Slaughterem

Quote:


> Originally Posted by *rv8000*
> 
> I wish the major review sites would use a standardized test setup, same cpu, same motherboard, same memory, same ssd and so on. If we just had 4 or 5 sites that had consistently similar results we could finally have an accurate measurement for all metrics of a gpu. Until then we get a pile, leading to more and more flameboi wars, which in turn makes it IMPOSSIBLE to get useful information out of threads
> 
> 
> 
> 
> 
> 
> 
> .


I agree that the review sites differ in their results, so much so that no one knows whom to believe. The only way we could get a fair review is for the sites to list their game's system.cfg file, and the contents of any Autoexec.cfg file as well. For all we know, they made changes that could have favored one card or the other, and some of those changes would not show up in the game's preferences screen. I would hope this sort of thing wouldn't be an issue, but with this many discrepancies coming from the review sites, one can only wonder.


----------



## szeged

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=XfyQzroYnrI
> 
> https://www.youtube.com/watch?v=yLW7cZPW2fA
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA
> 
> http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1
> 
> Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


I kinda had a feeling that AMD didn't manage to pull all the retail samples back and fix them before distributing them in time. Just a guess, since AMD has been caught in more than a few lies recently.


----------



## hamzta09

Quote:


> Originally Posted by *criminal*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't seen you post anything constructive in this thread. Seems like you made your decision. Now you seem to just be trying to rub salt in the wound. Frametimes are comparable now. Quit cherry picking.


Didn't cherry pick; I took the first reliable source on Google.

Doesn't matter, the frametimes suck. I had 2x 280X before, and I've also had a 7950 and a 5850.


----------



## iLeakStuff

Quote:


> Originally Posted by *rv8000*
> 
> In the 3 first videos, the noise is almost not audible from a distance of more than 6" away, these people have their cameras half an inch away >_>. And the last video the pump is slightly above the radiator which has been said to cause more noise.
> 
> I'm not saying there isn't a problem, or some revisions were not made, but looks like people getting carried away with something after hearing complaints about it from other sources. One thing is for sure, if my card makes the same noise while im sitting @ my desk (roughly 2.5ft) that card is instantly going back.


Have you heard high-pitched noise before? It can cut through pretty much everything and overshadow almost any other sound. I've had PSUs and GPUs, even broken CPUs, with that sound before.
It's horrible.
Quote:


> Originally Posted by *szeged*
> 
> i kinda had a feeling that amd didnt manage to pull all the retail samples back and fix them before distributing them in time. Just a guess since amd has been caught in more than a few lies recently.


I hope it doesn't become a widespread problem and it's just a small percentage of users.


----------



## Kane2207

Quote:


> Originally Posted by *flopper*
> 
> 10% driver improvement in a couple of iterations and the impact is different for the fury totally.
> if a 1250mhz OC can be made well its done deal clobber time.
> I am buying fury.


I wish you all the best and hope you're happy with your purchase.

1250MHz is going to put it about on par with an overclocked 980ti.

Personally, if I had to upgrade now I'd look at the ti since we have no idea if the 10% in drivers will happen with Fury, nor do we know exactly when that 10% will be delivered. It's nothing but guess work at present.


----------



## p4inkill3r

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=XfyQzroYnrI
> 
> https://www.youtube.com/watch?v=yLW7cZPW2fA
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA
> 
> http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1
> 
> Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


I listened to them all, and that's definitely an annoying noise that would cause me to RMA; it sounds like the pump to me. Way too many people complain about 'coil whine' without even knowing what a coil is.

That being said, this doesn't seem ubiquitous or particularly widespread; more people (including me!) need to receive their cards first.


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I don't own the card, and neither do you
> 
> 
> 
> 
> 
> 
> 
> I repeated what *reviewers* have said, those reviews are linked in *this* thread.. I'm relying on what they say before making a purchase.
> 
> So wheres the FUD?
> 
> You corporate slaves are something else.


I'm not going to go through 2k posts to prove an assertion you made.


----------



## GorillaSceptre

Quote:


> Originally Posted by *p4inkill3r*
> 
> I'm not going to go through 2k posts to prove an assertion you made.


So then don't go through all the posts, just look at the reviews..


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So then don't go through all the posts, just look at the reviews..


I have, all of them. Whose should I believe?


----------



## rv8000

Quote:


> Originally Posted by *iLeakStuff*
> 
> Have you heard high pitched noise before? It can go through pretty much everything and overshadow almost any sound. Ive had PSUs and GPUs, even broken CPUs with that sound before.
> Its horrible
> I hope it doesnt become a wide problem and its just a few percentage of users


In the past year and a half I've had 3x 290s, 3x 780s, 2x 970s, 2x 290Xs, and 2 PSUs; I re-wired the wall sockets in my room; I have 3 fan controllers in my PC, sound dampening foam, and multiples of about 4 different types of expensive fans, all because of coil whine, bearing whine, and electrical noise. I SURE AS HELL know what the noises sound like. Go put your ear next to virtually any CLC, be it Corsair, NZXT, Cooler Master and so on, and you are going to hear the pump whine. Sure, it might not be as loud as *some*, but the noise is there, and most of the time it is just as loud as the typical cases being shown here.


----------



## GorillaSceptre

Quote:


> Originally Posted by *p4inkill3r*
> 
> I have, all of them. Whose should I believe?


In your case? I guess only the positive ones.


----------



## rdr09

Quote:


> Originally Posted by *iLeakStuff*
> 
> Have you heard high pitched noise before? It can go through pretty much everything and overshadow almost any sound. Ive had PSUs and GPUs, even broken CPUs with that sound before.
> Its horrible
> I hope it doesnt become a wide problem and its just a few percentage of users


your shift is not done yet? must be paid a lot.


----------



## criminal

Quote:


> Originally Posted by *Kane2207*
> 
> So you just chose to ignore where I clearly stated I have no doubts that AMD will make _some_ improvements and decided that's showing bias, really?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Once again, I do believe they'll improve the situation, I really do. Do I think that DX12 will magically make this card phenomenally faster than the competition? Not really. It stands to reason that any of the core benefits (not feature elements) of DX12 will be beneficial to both camps.
> 
> I already applauded AMDs improvements they've made since going to GCN from TeraScale but driver optimisations are not infinite, eventually they'll have wrung everything out of GCN they possibly can.
> 
> And as you noted, I've had my hardware for quite a while, having two kids in 4.5 years'll do that to you, other priorities take over. I guess I bought in to the hype leading up to Fiji's release, expecting them to knock it out of the park like they did with the 7970 (which as you noted, is still a strong card today) but unfortunately that doesn't appear to be the case. I've waited that long to upgrade now I suppose I may as well wait for a node shrink, unless AMD prices tank like they did 12-18 months into the 290X. If AMD have improved the card significantly by then, then I may still purchase one. Who knows.


My apologies then, I must have missed that part. And I know about priorities: I have a 4-year-old and another on the way. This 980 cost me $80 more after selling my 780, otherwise I would be rocking a 780 right now. Other than that, spare cash has kinda been going to the rig instead of into an upgrade fund like I used to have. I do have a little money left over from my bonus last year, and I have that pesky upgrade itch. I'm not quite satisfied with the Fury X, but I am hoping a custom Fury card will be awesome. Not that I need to upgrade, I just wanted to play on the red side for a while.

Anyway, sorry again; you just seemed to constantly refute this being a driver issue and it got under my skin. Let's be fair and give it a few more weeks is all I am saying.


----------



## curlyp

Quote:


> Originally Posted by *iLeakStuff*
> 
> More people getting high pitched noise from the Fury X.
> Guess AMD didnt fix them all before shipping to customers
> That sound would drive me crazy :/
> 
> https://www.youtube.com/watch?v=XfyQzroYnrI
> 
> https://www.youtube.com/watch?v=yLW7cZPW2fA
> 
> https://www.youtube.com/watch?v=DKoNa1OXnQA
> 
> http://forums.anandtech.com/showpost.php?s=457cd00d38a211191c6a9e81a27161c6&p=37512130&postcount=1
> 
> Reviewer: https://www.youtube.com/watch?v=iEwLtqbBw90


Very interesting. I posted mine earlier and I don't have any noise; she is quiet as a mouse! I guess I must have been one of the lucky ones. Now, if I can get around the locked voltage, I would like to overclock her!

R9 Fury X First Run

Edit: I will unplug my fans and take another video. Will post later.


----------



## harney

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Don't post that in here.. The shills will say you're spreading fud
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> at that one vid. "All this padding and this ***** still screams"


That padding inside will melt at 104°C







1st vid


----------



## criminal

Quote:


> Originally Posted by *curlyp*
> 
> Very interesting. I posted mine earlier and I don't have any sound. She is quiet as a mouse! I guess I must have been a lucky one. Now, if I can get around the lock voltage, I would like to overclock her!
> 
> R9 Fury X First Run


Looks sweet.









I really want one just to play with, but I am going to wait for the 14th. Would love to get my hands on a Hawk or Lightning Fury.


----------



## Schmuckley

Quote:


> Originally Posted by *Ganf*
> 
> The only thing that's 100% in this thread is my manly beard. Extrapolate from there.


Truest post in this whole thread


----------



## p4inkill3r

Quote:


> Originally Posted by *GorillaSceptre*
> 
> In your case? I guess only the positive ones.


"Lots of people" aren't having issues. I guess we'll find out sooner rather than later how widespread this issue is.


----------



## Orivaa

Quote:


> Originally Posted by *curlyp*
> 
> Very interesting. I posted mine earlier and I don't have any sound. She is quiet as a mouse! I guess I must have been a lucky one. Now, if I can get around the lock voltage, I would like to overclock her!
> 
> R9 Fury X First Run
> 
> Edit: I will unplug my fans and take another video. Will post later.


Nah, those people were just unlucky.


----------



## Thoth420

^ This


----------



## Kane2207

Quote:


> Originally Posted by *criminal*
> 
> My apologizes then, I must have missed that part. And I know about priorities. I have a 4 year old and another on the way. This 980 cost me $80 more after selling my 780, otherwise I would be rocking a 780 right now. Other than that, been kinda spare cash going to the rig instead of an upgrade fund like I use to have. I do have a little money left over my bonus last year and I have that pesky upgrade itch. Not quite satisfied with the Fury X, but I am hoping a custom Fury card will be awesome. Not that I need to upgrade, just wanted to play on the red side for a while.
> 
> Anyway, sorry again, you just seemed to constantly refute this being a driver issue and it got under my skin. Let's just be fair and give it a few more weeks is all I am saying.


Ha, no worries, and I know all about the bonus situation - mine's annual around Feb. Right at the wrong time for 980ti's and Fury X's









After buying a Titan, then Nvidia releasing the 780 Ti, Titan Black, etc., and AMD matching Titan performance for half the cash, you can be damn sure I was waiting with bated breath this time for both companies to release their cards before I'm ready to lay down the cash.

I'm now faced with a 980ti which Nvidia might choose to not optimise for newer games once Pascal drops, or purchasing AMD and hoping they find performance via drivers in a timely manner.

Having only two players in the market sucks!!!!

(I'm now going back to Yoshi's Woolly World for a bit - a <£200 machine with a game developed by a competent publisher who actually gives a damn about the quality of their output.... Us PC gamers are all mugs!!!!







)


----------



## NuclearPeace

My 750 Ti FTW had coil whine for about half a day and then it suddenly fixed itself. It's an extremely annoying sound that can pretty much pierce through noise dampening material. My case, the PS07B, is basically a "silent" version of the TJ08B-E, and I could still hear the coils whining through the case and through my HD 439 headphones. My case of coil whine was also on the mild side considering how bad some of the 970s were.

Is coil whine acceptable in a year-old $130ish card? Maybe. I definitely would not expect or tolerate coil/pump whine on a brand spanking new $650 card made to give off a premium feel. I would return it before it even made it through its rounds of stress testing.


----------



## extracrunchy

Has anyone tried the hacked windows 10 drivers on this yet, or do they work?


----------



## FallenFaux

Quote:


> Originally Posted by *NuclearPeace*
> 
> My 750 Ti FTW had coil whine for about half a day and then it suddenly fixed itself. Its an extremely annoying sound that can pretty much pierce though noise dampening material. My case, the PS07B, is basically a "silent" version of the TJ08B-E and I could still very well hear the coils whining through the case and through my HD 439 headphones. My case of coil whine was also more on the mild side considering how bad some of the 970s were.
> 
> Is coil whine acceptable in a yaer old $130ish card? Maybe. I definately would not expect or tolerate coil/pump whine or noise on a brand spanking new $650 card made to give off a premium feel. I would return it before even making it go through its rounds of stress testing.


I fixed coil whine on a GTX 480 and a 290X by running them at full load for a while. It can go away on its own just from using the card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *FallenFaux*
> 
> I fixed coil whine on a GTX480 and 290x by just running them at full load for a while. It can just go away on its own by just using the card.


I think I had some with my 290X, but after using it for a while, no more. People are trying really hard to put this card down.


----------



## mav451

Quote:


> Originally Posted by *Kane2207*
> 
> (I'm now going back to Yoshi's Wooly World for a bit - <£200 machine with a game developed by a competent publisher who actually gives a damn about the quality of their output.... Us PC gamers are all mugs!!!!
> 
> 
> 
> 
> 
> 
> 
> )


Still a 4 month wait in NA. I guess AUS/EU wins this round


----------



## curlyp

Quote:


> Originally Posted by *criminal*
> 
> Looks sweet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I really want one just to play with, but I am going to wait for the 14th. Would love to get my hands on a Hawk or Lightning Fury.


Thanks! Is MSI coming out with a Lightning Fury?


----------



## harney

Quote:


> Originally Posted by *rv8000*
> 
> In the past year and a half I've had 3x290s, 3x780s, 2x970s, 2x290x, and 2 PSU's, I re-wired the wall sockets in my room, I have 3 fan controllers in my pc, sound dampening foam, about 4 different types of expensive fans multiples of each, all because of coil whine, bearing whine, and electrical noise. I SURE AS HELL know what the noises sound like. Go put your ear next to virtually any CLC be it corsair, nzxt, coolermaster and so on, you are going to hear the pump whine, sure it might not be as loud as *some*, but the noise is there and most of the time it is just as loud as the general cases being shown here.


High-frequency vibration... I suppose the electric current switches that fast through the components, causing high-pitched vibes.

There is a scientific name for it, the piezoelectric effect or something it's called... again, I am no high-frequency expert.









http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67822-graphics-card-coil-whine-investigation.html

http://www.ukgamingcomputers.co.uk/capacitor-squeal-coil-whine-explained-a-63.html
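As those links explain, the whine is components physically vibrating at frequencies tied to the power circuitry's switching and load patterns; it only becomes audible when those frequencies land in the human hearing range. A hedged sketch of that band check (the example frequencies are illustrative, not measurements from any card):

```python
# Whine is only heard when an electrically-induced vibration falls in
# the audible band (roughly 20 Hz to 20 kHz).
AUDIBLE_LOW_HZ = 20.0
AUDIBLE_HIGH_HZ = 20_000.0

def is_audible(freq_hz):
    """Return True if a vibration at freq_hz lands in the audible band."""
    return AUDIBLE_LOW_HZ <= freq_hz <= AUDIBLE_HIGH_HZ

# Illustrative numbers: a VRM switching at 300 kHz is ultrasonic on its
# own, but load modulation at a few kHz lands squarely in hearing range.
print(is_audible(300_000))   # switching frequency itself
print(is_audible(5_000))     # a hypothetical 5 kHz load-modulation tone
```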


----------



## pengs

Same, I just let it be and it went away in about a week.


----------



## Blameless

Quote:


> Originally Posted by *Bartouille*
> 
> I never thought I would say this but I'm more exited about the 390x than the fury x even tho it's a rebrand lol.


The X isn't worth the price premium, but the 390 non-X is going to be the best deal around once all the cheap 290X parts dry up, which is already happening.


----------



## Thoth420

Coil whine for me always came down to dirty power or a crap PSU. I use an AVR UPS and an EVGA G2; no coil whine since. The be quiet! Dark Power Pro is also a great PSU, though with a very high price tag; it defeated my coil whine as well.


----------



## rdr09

Quote:


> Originally Posted by *Blameless*
> 
> The X isn't worth the price premium, but the 390 non-X is going to be the best deal around once all the cheap 290X parts dry up, which is already happening.


Methinks it's the Fury (non-X).


----------



## HanSomPa

Well it's not absolute garbage, but it's not worth my money either way. Hardcore enthusiasts are more likely to prefer custom water cooling, and for everyone else a non-reference 980TI will do the trick.


----------



## DMatthewStewart

My only real question at this point is...

Since I have a 1920x1080 (144Hz) monitor, and it's fairly new, is it even worth getting this card? I know AMD says it can render at 4K and then scale it down to your native res, but I can't imagine that would matter all that much.

I was trying to get at least another year out of this monitor.


----------



## p4inkill3r

Quote:


> Originally Posted by *DMatthewStewart*
> 
> My only real question at this point is...
> 
> Since I have a 1920x1080 (144Hz) monitor, and it's fairly new, is it even worth getting this card? I know AMD says it can render at 4K and then scale it down to your native res, but I can't imagine that would matter all that much.
> 
> I was trying to get at least another year out of this monitor.


Probably not.


----------



## rv8000

Quote:


> Originally Posted by *harney*
> 
> High-frequency vibration... I suppose the electric current runs that fast into the components, causing high-pitched vibes.
> 
> There is a scientific name for it - the piezoelectric effect or something it's called... again, I am no high-frequency expert.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67822-graphics-card-coil-whine-investigation.html
> 
> http://www.ukgamingcomputers.co.uk/capacitor-squeal-coil-whine-explained-a-63.html


Resonant frequencies of components, plus vibrations, oscillations, etc. I've also asked several of my physics professors about the topic many times; I know about as much about it as I can.


----------



## DMatthewStewart

Quote:


> Originally Posted by *p4inkill3r*
> 
> Probably not.


Thank you. It was a tad rhetorical to even ask but I figured that I may not know as much as I think I do (which happens a lot).


----------



## NuclearPeace

Interesting. I did reduce my overclock to 1400 core, which allowed me to drop the voltage back to stock (down 6mV) for a while, hoping that would make it go away. It did recede, but it sometimes came back for a few seconds and then stopped. Since coming back from my vacation, it seems to be completely gone. I'm back up to my original 1450 core OC.

My PSU is the XFX TS 550W, which might be only 80+ Bronze but is made by Seasonic and is fairly heavy. It's also a tier-two PSU, so I don't think it was the source of the problem.

Anyway, after that experience I'm going to avoid coil whine like the plague. I think I'm only buying special/high-end editions of graphics cards from here on out.

Edit: Way better sentence syntax


----------



## toncij

Quote:


> Originally Posted by *DMatthewStewart*
> 
> My only real question at this point is...
> 
> Since I have a 1920x1080 (144hz) monitor, and its fairly new, is it even worth getting this card? I know AMD says it can render 4k and then scale it down to your native res, but I cant imagine that that would matter all that much.
> 
> I was trying to get away with at least another year out of this monitor


I'd rather move to 1440@144 or 21:9 1080@144 - 4K isn't there yet. I expect dual Pascal/next Fury X will handle 4K fine, but 144Hz is 144Hz...


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> Probably not.


Great Avatar!

I also agree - 1440 or 4K (lol) for this card. I am going to be using a 1440p 60Hz Dell with the Fury X while awaiting some new FreeSync displays and news about fixes in the meantime.
Quote:


> Originally Posted by *NuclearPeace*
> 
> Interesting. I did reduce my overclock to 1400 core and that allowed me to reduce voltage back to stock (down 6mV) for a while, hoping that it would make it go away. It did recede but it sometimes came back up for a few seconds and stopped. From coming back from my vacation, it seems to be completely gone now. I'm back up to my original 1450 core OC.
> 
> My PSU is the XFX TS 550W, which might be only 80+ Bronze but is made by Seasonic and its fairly heavy. Its also a tier two PSU so I don't think it was the source of conflict.
> 
> Anyway, after that experience I'm going to avoid coil whine like the plague. I think I'm only buying special/high-end editions of graphics cards from here on out.
> 
> Edit: Way better sentence syntax


What res and refresh are you using? Are you hearing it all the time - even on the desktop? Does it vary based on what you are doing? I find the best way to expose any GPU-related coil whine is to boot Hitman: Absolution with V-Sync off and let it sit at the menu. It has no FPS cap, so it brings out coil whine (or cap whine, whatever people want to call it) at its worst.

Also, I'm not trying to trash-talk your PSU - I love XFX; I chose them for my Fury X and loved my XFX 6970, which never had any whine. But I think your PSU is right on the line for this GPU; it is not bad quality, but it is definitely at high load with OCs. I strongly recommend the PSU in my sig - 850 watts is plenty, and I can vouch that it shows no noticeable whine with any of the components I have used it with, some of which suffered whine before on PSUs considered very high quality. The OEM of the G2 is Super Flower, and they are great!


----------



## NuclearPeace

I'm at 1920x1080 with a 60Hz monitor. I usually limit all of my games to a maximum of 60FPS, since I'm sensitive to screen tearing and extremely high FPS can cause coil whine. I'm not getting any noises while playing BF4, so that's good enough for me.


----------



## Thoth420

Quote:


> Originally Posted by *NuclearPeace*
> 
> Interesting. I did reduce my overclock to 1400 core and that allowed me to reduce voltage back to stock (down 6mV) for a while, hoping that it would make it go away. It did recede but it sometimes came back up for a few seconds and stopped. From coming back from my vacation, it seems to be completely gone now. I'm back up to my original 1450 core OC.
> 
> My PSU is the XFX TS 550W, which might be only 80+ Bronze but is made by Seasonic and its fairly heavy. Its also a tier two PSU so I don't think it was the source of conflict.
> 
> Anyway, after that experience I'm going to avoid coil whine like the plague. I think I'm only buying special/high-end editions of graphics cards from here on out.
> 
> Edit: Way better sentence syntax


Quote:


> Originally Posted by *NuclearPeace*
> 
> I'm at 1920x1080 with a 60Hz monitor. I usually limit all of my games to a maximum of 60FPS since i'm sensitive to screen tearing and extremely high FPS can cause coil whine. I'm not getting any noises while playing BF4, so that's good enough for me.


I agree - if you were running 144Hz and getting high frames it would probably be even louder. I haven't gotten to building my system yet (still waiting on some more hardware), but I hope mine doesn't have any sound issues; it's a huge pet peeve of mine. I will report back on acoustics when I have mine up and running.

I also edited my last post a bit... brain diarrhea.

But yeah, the Hitman: Absolution menu with no V-Sync and no FPS cap is the best way to test for it, even more so on high-refresh panels. If you don't hear it there, you probably won't hear it anywhere.


----------



## DMatthewStewart

Quote:


> Originally Posted by *toncij*
> 
> I'd rather move to 1440@144 or 21:9 1080@144 - 4K isn't there yet. I expect dual Pascal/next Fury X will handle 4K fine, but 144Hz is 144Hz...


Right, and I like the 144Hz refresh, so it's going to be hard to part with that. But since I want to get some more time out of this monitor anyway, it looks like there is no conflict [for me]. Refresh rates on 4K will probably get higher by the time I'm ready for a newer, higher-resolution monitor. Is there even a 120Hz 4K monitor yet? As you can probably tell, I don't follow the monitor market closely. I don't even have time to figure out my own hardware, let alone read up on new things - except the Fury X; that I have been trying to keep up on.


----------



## Thoth420

Quote:


> Originally Posted by *DMatthewStewart*
> 
> Right, and I like the 144Hz refresh, so it's going to be hard to part with that. But since I want to get some more time out of this monitor anyway, it looks like there is no conflict [for me]. Refresh rates on 4K will probably get higher by the time I'm ready for a newer, higher-resolution monitor. Is there even a 120Hz 4K monitor yet? As you can probably tell, I don't follow the monitor market closely. I don't even have time to figure out my own hardware, let alone read up on new things - except the Fury X; that I have been trying to keep up on.


I am also finding it hard to go back to 60Hz for games I play with mouse and keyboard. For controller games I just turn on V-Sync and it's not so bad; it would be worse if I had to drop back to 1080p, at least for me.


----------



## iLeakStuff

Quote:


> Originally Posted by *DMatthewStewart*
> 
> My only real question at this point is...
> 
> Since I have a 1920x1080 (144hz) monitor, and its fairly new, is it even worth getting this card? I know AMD says it can render 4k and then scale it down to your native res, but I cant imagine that that would matter all that much.
> 
> I was trying to get away with at least another year out of this monitor


Take your pick


----------



## toncij

Quote:


> Originally Posted by *DMatthewStewart*
> 
> Right, and I like the 144Hz refresh, so it's going to be hard to part with that. But since I want to get some more time out of this monitor anyway, it looks like there is no conflict [for me]. Refresh rates on 4K will probably get higher by the time I'm ready for a newer, higher-resolution monitor. Is there even a 120Hz 4K monitor yet? As you can probably tell, I don't follow the monitor market closely. I don't even have time to figure out my own hardware, let alone read up on new things - except the Fury X; that I have been trying to keep up on.


I own a 5K too now. It is nice, very nice. But you need 4x Titan X (a Fury won't do) to run The Witcher 3 at 60-ish FPS. 5K euros of cards for a 5K monitor seems fun.

Anyway, I don't game at 5K unless I'm playing Diablo 3 or World of Warships, since I have only one Titan X.

You could make a 4K panel run at 144Hz the same way they made 5K run at 60Hz - with dual DP cables - but nobody has made one yet. Such a device would probably cost 3-4K.

You do have 21:9 1080 at 144Hz and 21:9 1440 at 75Hz, but those arrive by the end of the year.


----------



## TopicClocker

This thing is priced way too close to the 980 Ti to have 2GB less VRAM and be noticeably slower than it. I know it has HBM and all, but damn, it has insane competition in the 980 Ti.

Especially at 4K: I can't see 4GB holding up well in the latest, more intensive games with graphical settings high.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TopicClocker*
> 
> This thing is priced way too close to the 980 Ti to have 2GB less VRAM and be noticeably slower than it, I know it has HBM and all but damn, it has insane competition against it with the 980 Ti.
> 
> Especially at 4K, I can't see 4GB being too great for games at 4K with graphical settings high in the latest more intensive games.


You may not see it, but it happens. The GTX 780 Ti was slightly faster than the R9 290X, had 1GB less VRAM, and was $150 more. Talk about value.


----------



## CasualCat

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You may not see it, but it happens. The GTX 780 Ti was slightly faster than the R9 290X, had 1GB less VRAM, and *was $150 more*. Talk about value.


Well, at least until you factored in the mining markup in the US. 290/290Xs were going for stupidly high prices; I wanted one back then and didn't get one because I didn't want to pay above MSRP.


----------



## DividebyZERO

Quote:


> Originally Posted by *iLeakStuff*
> 
> Take your pick


The first thing I thought when I saw this is that only two titles in that list would do any good at 144Hz. So I guess that means relying on SLI/CF?


----------



## BoredErica

Quote:


> Originally Posted by *DividebyZERO*
> 
> First thing i thought when i saw this is only two titles in that list would do any good with 144hz. So i guess that means relying on sli cf?


Or turning down MSAA...


----------



## Kaltenbrunner

Nvidia seems to have had better and more consistent frametimes for two gens, IIRC... probably related to why I don't trust CrossFire now, after two rounds of CF failing me.


----------



## Remij

If you can talk about value, then I don't understand why you can't talk about the card's possible effect on AMD in these review threads. Many reviews reflect on the state of AMD GPUs and their potential in the market. I've always thought of these review threads as sort of 'official' threads for discussing the cards and their effect on the market, whereas the 'Owners Club' threads are for personal experience, help, and general discussion about said cards.

That's just me though.


----------



## curlyp

I really want to give AMD a fair shot and use their card through a full iteration until a new line comes out; however, every time I buy an AMD card and try, I have issues. The issues could be on my end, but having problems with different series of cards over the course of several years leads me to believe it is AMD.

First issue: every time my monitor goes to sleep (after 15 minutes), I get multi-colored vertical and horizontal lines. This only happens with AMD cards (currently the R9 Fury X); my GTX 970 lets it stay on a black screen. See pictures below of the lines and the power settings.

Second issue: AMD cards never register correctly in my Device Manager, and the Catalyst software seems buggy for me. Under Display adapters (see picture below) it shows AMD Radeon R9 series and Intel(R) HD Graphics 4000. I've made sure I have the latest AMD driver and Catalyst version, and all my Nvidia drivers are uninstalled. I ran Fire Strike about 20 minutes ago and the 'run details' show both the Radeon and Intel GPUs (see pictures below).

Lastly, I don't even see an increase in GW2 FPS. I am getting 55 FPS with the R9 Fury X, which is the same as my GTX 970! Check out the picture below.

I will give it a couple more days, but I may just return the card to Microcenter or sell it online or to someone in the OC community.









edit: grammatical
edit#2: I forgot to add my monitor is a Dell U2711 2560 x 1440


----------



## azanimefan

Quote:


> Originally Posted by *hamzta09*
> 
> I love how everyone assumes 1080p means <60fps.
> 1080p gaming today is all about 120-144hz meaning 120-144fps.
> 
> Thus 280X is farrrrrrrrrrrrrrrrrrrrr from enough.
> Heck a 980 Ti isnt even enough.
> 
> Why anyone wants to play on a 60hz monitor with that amount of ghosting, motionblur and what not.. is beyond me.


I don't really play FPS; ghosting and motion blur don't exist in my world.

I play on 3x 1080p monitors with one GTX 970; it is MORE than enough for that.


----------



## p4inkill3r

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Sure it is, the reviews are obviously having an effect on their stock. That is very relevant and on topic.


LOL, come on. You did see that Micron lost almost 20% today, right? Intel was down 3% and nvidia was down 2%.

I know your AMD short must be burning a hole in your margin account right now, but it is possible that the sector overall was down today:


Spoiler: Warning: Spoiler!



Micron blamed the results almost exclusively on "weakness in the PC sector," prompting a large number of Wall Street brokerages to take a scalpel to their 12-month price targets on the stock and issue notes to clients expressing concern about the company's growth plans.


----------



## FallenFaux

Quote:


> Originally Posted by *p4inkill3r*
> 
> LOL, come on. You did see that Micron lost almost 20% today, right? Intel was down 3% and nvidia was down 2%.
> 
> I know your AMD short must be burning a hole in your margin account right now, but it is possible that the sector overall was down today:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Micron blamed the results almost exclusively on "weakness in the PC sector," prompting a large number of Wall Street brokerages to take a scalpel to their 12-month price targets on the stock and issue notes to clients expressing concern about the company's growth plans.


Micron had a competing memory standard called HMC (Hybrid Memory Cube) that lost out to HBM... I'm sure none of this is related.


----------



## kckyle

I haven't been following this, so excuse my silly question: how is the Fury X, with 4GB of RAM, keeping up with the 980 Ti and Titan X in 4K benchmarks? Are these games not using more than 4GB, or something?


----------



## FallenFaux

Quote:


> Originally Posted by *kckyle*
> 
> i haven't been following this, so excuse me for my silly question. but how is the fury x with 4gb ram keeping up with 980 ti and titan x in 4k bench? are these games not using more than 4gb or something?


The 295X2, with 4GB of VRAM, beats the Titan X in most 4K tests too, which would indicate that they don't need more than 4GB. I think you do start to run out once you tack on MSAA, though.


----------



## rdr09

Quote:


> Originally Posted by *curlyp*
> 
> I really want to give AMD a fair shot and use their card through a full iteration until a new line comes out; however, every time I buy an AMD card and try, I have issues. The issues could be on my end, but having problems with different series of cards over the course of several years leads me to believe it is AMD.
> 
> First issue - Every time my monitor goes to sleep (after 15 minutes) I receive multi-colored vertical and horizontal lines. This only happens with AMD cards (currently the R9 Fury X). My GTX 970, allows it to stay on the black screen. See pictures below of the lines and the power settings.
> 
> Second issue - AMD cards never registers correctly in my Device Manager as well as the Catalyst software seems to be buggy for me. Under Display adapters (see picture below) it shows AMD Radeon R9 series and Intel(R) HD Graphics 4000. I've made sure I have the latest AMD driver and Catalyst version and all my Nvidia drivers are uninstalled. I ran Fire Strike about 20 minutes ago and the 'run details' show both Radeon and Intel GPUs (see pictures below).
> 
> Lastly, I do not even see an increase in GW2 FPS. I am getting 55 FPS with the R9 Fury X which is the same as my GTX 970! Check out the picture below.
> 
> I will give it a couple more days, but I may just return the card back to Microcenter or sell it online or to someone in the OC community.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: grammatical
> edit#2: I forgot to add my monitor is a Dell U2711 2560 x 1440


Two things:

1. Applications such as 3DMark might not have been updated to recognize the GPU.

2. Your iGPU might still be enabled.


----------



## hamzta09

Quote:


> Originally Posted by *azanimefan*
> 
> I don't really play FPS, ghosting and motion blur don't exist in my world.
> 
> I play on 3x 1080p monitors with one gtx970; it is MORE then enough for that.


Sure, if you enjoy lowering settings a lot, or running below 60FPS...

Ghosting and motion blur aren't tied to FPS games; just scrolling in Chrome/FF has blur/ghosting. Moving a window around has blur/ghosting.


----------



## speedyeggtart

Quote:


> Originally Posted by *47 Knucklehead*
> 
> Looks like the word is out on the reviews.
> 
> AMD's stock is taking a beating ... down $0.11 for the day, after taking a $0.17 hit at today's lowest point. At least it's back up to the amazing $2.47 a share price.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nVidia's stock remains the same as it opened today at $21.00 per share in after hours trading.


Quote:


> Originally Posted by *47 Knucklehead*
> 
> I don't know why not. All the reviews are linked to on page one, they have been talked about to death for the past 45 pages. Why not talk about the economic ramifications that AMD is now going to have to deal with for releasing an entire line of rebrands and their flag ship card being 3% slower than nVidia's SECOND fastest card for the same price?


A re-brand that can do 10-bit color and performs better than its predecessor? If it were a straight re-brand, it would not have new features or perform better.

The flagship card has new tech (HBM) and is water cooled, which adds cost. I'm pretty sure that if it were not water cooled it would cost $50 to $100 less.

Anyhow, I work in the financial industry. I love day traders because they provide liquidity for the stock market... and I love people shorting stocks because they provide an insurance (options) market to hedge my investments...


----------



## provost

Quote:


> Originally Posted by *speedyeggtart*
> 
> A re-brand that can do 10bit + color and performs better than the previous? If it is a straight re-brand it would not have new features and not perform better.
> 
> Flag Ship card has new tech (HBM) and it is water cooled = costs. I'm pretty sure if it was not water cooled it would cost $50 to $100 less.
> 
> Anyhow I work in the financial industry - I love day traders because they provide liquidity for the stock market.... and I love people shorting stocks because they provides an insurance options market to hedge my investments...


I don't know much about trading and all that, but do day traders really provide a lot of liquidity?.. hmmm
I guess it depends on what you mean by "day traders"... Lol


----------



## speedyeggtart

Quote:


> Originally Posted by *Darkwizzie*
> 
> Ingles por favor.


Basically, he said a wise long-term investor doesn't care about these things. Both AMD and Nvidia are mature companies that do not provide substantial stock growth unless they do something really new, win back lost market share in their respective specialties, or change the game in the industry.

Nvidia is trying to do this by going mobile (Tegra processors), targeting living-room gaming vs. consoles (Nvidia Shield), entering the automotive industry (GPUs for in-car graphic interfaces/GPS systems), and entering the smartphone industry (where they failed and are now exiting).

AMD is trying to do this by winning back its server and enterprise business, desktop CPUs, laptop CPUs, console CPUs, and HBM.

Nvidia is entering territories/markets it is not familiar with and fighting behemoths - that's why it failed and exited the mobile smartphone market. If its other projects succeed they will significantly boost its stock and bottom line; otherwise they can crash and burn R&D/marketing costs.

AMD is fighting in areas it is familiar with and already has market experience in; if successful, it can win back market share.


----------



## speedyeggtart

Quote:


> Originally Posted by *provost*
> 
> I don't much about trading and all that, but do day traders provide a lot of liquidity?.. hmmm
> I guess it depends on what you mean by "day traders"... Lol


Liquidity = easy to convert to cash or sell = what day traders do, buy and sell stocks daily...


----------



## provost

Quote:


> Originally Posted by *speedyeggtart*
> 
> Liquidity = easy to convert to cash or sell = what day traders do, buy and sell stocks daily...


I think my emphasis was on " a lot" of liquidity... But it got lost in my Ingles... Lol

But, I see what you mean.









However, if it's OK, I have no comment on the translation of my Spanish above. It was gibberish on my part anyway.


----------



## Noufel

Quote:


> Originally Posted by *szeged*
> 
> if it was air cooled it would cost more after the inevitable house fire damages.
> 
> sorry had to


FirefuryX


----------



## blue1512

Quote:


> Originally Posted by *szeged*
> 
> if it was air cooled it would cost more after the inevitable house fire damages.
> 
> sorry had to


http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-7.html
Max power is 450W, but the *average* is 220W in the gaming loop.
Most reviews only show the peak number, but the *average* consumption is what really matters.
FYI, the 980 Ti has a 295W max but a *235W average*:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-7.html

Your sarcasm seems off.
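The peak-versus-average distinction is easy to see with a toy power trace. Every number below is made up purely for illustration; none of them are measurements of any actual card:

```python
# Hypothetical 1-second power trace sampled every 10 ms:
# the card draws around 220 W most of the time, with a few brief 450 W spikes.
samples_w = [220] * 95 + [450] * 5

peak_w = max(samples_w)                      # what "max power" headlines report
average_w = sum(samples_w) / len(samples_w)  # what sustained draw actually looks like

print(f"peak: {peak_w} W, average: {average_w:.1f} W")  # peak: 450 W, average: 231.5 W
```

Millisecond spikes drive the headline maximum, while the average reflects what the card pulls over a whole gaming session.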


----------



## szeged

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html

your reading seems off.


----------



## Siezureboy

Quote:


> Originally Posted by *szeged*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> your reading seems off.


The compact water cooler does a good job maintaining frosty temperatures; we recorded 54 °C during the gaming loop (221W) and 64 °C during the stress test (347W). (Source: http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html)


----------



## Standards

Quote:


> Originally Posted by *szeged*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> your reading seems off.


You.. do realize that what you linked just confirmed what he said, correct?
Quote:


> Originally Posted by *blue1512*
> 
> Max power 450W, but *average* 220W in gaming loop.


What you both linked to says "we recorded 54 °C during the gaming loop (221W)"


----------



## bfedorov11

Quote:


> Originally Posted by *Thoth420*
> 
> Coil whine for me always came down to dirty power or a crap PSU. I use an AVR USPS and a EVGA G2...no coil whine since. Also the BeQuiet! Dark Power Pro is a great PSU but very high price tag...defeated my coil whine as well.


I was just about to say that... I wonder how large of a role the PSU plays in coil whine. I've pushed 1.45V through 780 Tis, and the only time they made noise was at the start or end of a run. Never even had it with older AMD cards. Always had a quality PSU - EVGA 1300W G2 here.


----------



## hamzta09

Quote:


> Originally Posted by *bfedorov11*
> 
> I was right about to say that.. I wonder how large of a roll the psu plays with coil whine. I've pushed 1.45v through 780tis and the only time they made noise was at the start or end of a run. Never even had it with older amd cards. Always had a quality psu. evga 1300w g2 here


Start and end? Guess what - super-high FPS = coil whine.


----------



## Cool Mike

Received my Fury X today (XFX version). I wanted to go with a smaller mid-size case, so I moved all my hardware over to the Corsair Obsidian 450D. My goal was to mount the Fury's radiator in the front of the case, and this worked out very well; I also used the 140mm fan that came with the case in a pull configuration. Ran Valley and 3DMark Fire Strike to lock in an overclock of 1125MHz on the core. No voltage control yet, but I suspect with a bump in core voltage I could hit 1175-1200MHz. Haven't run any games yet, but I did a few Fire Strike Ultra (4K) runs. Yes, it's all true: at 1080p the 980 Ti will beat the Fury all day, every day. But 4K is the Fury's game, and after a few runs of Fire Strike Ultra it was clear I was in Titan X territory; looking at Titan X 4K scores, the Fury is on par with it in many cases. My Ultra score was 4224. So far I am really enjoying the Fury. Being a hardware lover, I really like the red LED bar graph that indicates GPU load.


----------



## Casey Ryback

Quote:


> Originally Posted by *blue1512*
> 
> Your sarcasm seems off.


According to AnandTech (which I believe to be accurate),

the 980 Ti uses around 20W less during gaming (Crysis 3):

408W vs. 388W.

People will always refer to AMD as burning things down and chewing excessive power.

Many people just get used to certain things, or get stuck in the past and can't let go; this is the way of OCN.

Just let them have their fun - they'll never change.


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> According to anandtech (which I believe to be true)
> 
> The 980ti uses around 20W less during gaming (crysis 3)
> 
> 408W vs 388W.
> 
> People will always refer to AMD as burning things down and chewing excessive power.
> 
> Many people just get used to certain things, or stuck in the past and can't let them go, this is the way of OCN.
> 
> Just let them have their fun they'll never change


550W in Tomb Raider @ 4K.


----------



## rv8000

Quote:


> Originally Posted by *curlyp*
> 
> I really want to give AMD a fair shot and use their card through a full iteration until a new line comes out; however, every time I buy an AMD card and try, I have issues. The issues could be on my end, but having problems with different series of cards over the course of several years leads me to believe it is AMD.
> 
> First issue - Every time my monitor goes to sleep (after 15 minutes) I receive multi-colored vertical and horizontal lines. This only happens with AMD cards (currently the R9 Fury X). My GTX 970, allows it to stay on the black screen. See pictures below of the lines and the power settings.
> 
> Second issue - AMD cards never registers correctly in my Device Manager as well as the Catalyst software seems to be buggy for me. Under Display adapters (see picture below) it shows AMD Radeon R9 series and Intel(R) HD Graphics 4000. I've made sure I have the latest AMD driver and Catalyst version and all my Nvidia drivers are uninstalled. I ran Fire Strike about 20 minutes ago and the 'run details' show both Radeon and Intel GPUs (see pictures below).
> 
> Lastly, I do not even see an increase in GW2 FPS. I am getting 55 FPS with the R9 Fury X which is the same as my GTX 970! Check out the picture below.
> 
> I will give it a couple more days, but I may just return the card back to Microcenter or sell it online or to someone in the OC community.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: grammatical
> edit#2: I forgot to add my monitor is a Dell U2711 2560 x 1440


GW2 is SO CPU-bottlenecked it isn't even funny; expect no FPS increase in cities and around large groups of people. Go to the Shiverpeaks and enjoy 200+ FPS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rv8000*
> 
> GW2 is SO cpu bottlenecked it isn't even funny, expect no fps increase in cities and around large groups of people. Go to the shiverpeaks and enjoy like 200+ fps


Yeah, lol. I have upgraded many GPUs since GW2 came out and I get the same FPS, lol.


----------



## Assirra

MMOs are always more CPU-limited.
They honestly should never be counted for GPU testing.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> According to anandtech (which I believe to be true)
> 
> The 980ti uses around 20W less during gaming (crysis 3)
> 
> 408W vs 388W.
> 
> People will always refer to AMD as burning things down and chewing excessive power.
> 
> Many people just get used to certain things, or stuck in the past and can't let them go, this is the way of OCN.
> 
> Just let them have their fun they'll never change


http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html

The Fury X is still hotter than the others without its cooler, and the VRMs are boiling even with it.


----------



## criminal

Quote:


> Originally Posted by *Cool Mike*
> 
> Received my Fury X today (XFX version). I wanted to go with a smaller mid-size case, so I moved all my hardware over to the Corsair Obsidian 450D. My goal was to mount the Fury's radiator in the front of the case, and this worked out very well; I also used the 140mm fan that came with the case in a pull configuration. Ran Valley and 3DMark Fire Strike to lock in an overclock of 1125MHz on the core. No voltage control yet, but I suspect with a bump in core voltage I could hit 1175-1200MHz. Haven't run any games yet, but I did a few Fire Strike Ultra (4K) runs. Yes, it's all true: at 1080p the 980 Ti will beat the Fury all day, every day. But 4K is the Fury's game, and after a few runs of Fire Strike Ultra it was clear I was in Titan X territory; looking at Titan X 4K scores, the Fury is on par with it in many cases. My Ultra score was 4224. So far I am really enjoying the Fury. Being a hardware lover, I really like the red LED bar graph that indicates GPU load.


Looks good.


----------



## Lshuman

People, please: the 980 Ti has been out for a couple of weeks now, which has given Nvidia time for a couple of driver updates. My view is, give AMD a couple of weeks with their card (the Fury X) and updated drivers before we start bashing its performance. Let's not forget the Titan X and the 980 Ti share the same architecture, so those drivers were even more mature than at the 980 Ti's initial launch.


----------



## curlyp

Quote:


> Originally Posted by *rv8000*
> 
> GW2 is SO cpu bottlenecked it isn't even funny, expect no fps increase in cities and around large groups of people. Go to the shiverpeaks and enjoy like 200+ fps


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah lol. I have upgrade many GPU since GW2 came out and get same fps lol.


Thanks for the info.


----------



## Exilon

Quote:


> Originally Posted by *Sashimi*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> Fury X is still hotter than the others without the cooler, and the VRM is boiling even with it.


Fury X also has the benefit of running at 50C, which lowers leakage power by ~30W compared to a chip running at 80C. It'll be interesting to see how well air-cooled Fiji fares against GM200.


----------



## looniam

my thoughts after reading 2254 posts:

i have no life.


----------



## ladcrooks

Quote:


> Originally Posted by *Lshuman*
> 
> People, please: the 980 Ti has been out for a couple of weeks now, so Nvidia has had time for a couple of driver updates. My thought is, give AMD a couple of weeks with their card (the Fury X) and updated drivers before we start bashing the performance. Let's not forget the Titan X and the 980 Ti share the same architecture, so the drivers were even more mature than at the initial 980 Ti launch.


Exactly - I keep coming here to see more of what's been said. Sometimes I jump to the end of the posts; so much has accumulated here (what, 2254 posts?) that I'm going to miss a few.

The point I wanted to make, which I believe I have made here already, is that it's new and AMD got here first. The results speak for themselves; the card is brilliant. Knock the price down and it will be a winner. So many moaners on here.

If this was Nvidia wearing the same shoes, my views would be exactly the same - new tech is always a blessing.

So *criminal* - are you happy with the results so far?


----------



## pengs

Quote:


> Originally Posted by *Sashimi*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> Fury X is still hotter than the others without the cooler, and the VRM is boiling even with it.


70°C isn't bad, nor is it bad for a VRM; it's actually well within acceptable limits. 90°C is getting there, still acceptable, but I guess it's something to consider if you play FurMark all day long.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> Fury X is still hotter than the others without the cooler, and the VRM is boiling even with it.


No, the Fury X is not hotter.

The cores are cooler; the VRMs are getting hot because there's no airflow over them. This is the trade-off with the Fury.

The Fury cards with air coolers made by all the big brands will have core and VRM temps similar to the 980 Ti's.


----------



## Casey Ryback

Quote:


> Originally Posted by *pengs*
> 
> 70°C isn't bad, nor is it bad for a VRM; it's actually well within acceptable limits. 90°C is getting there, still acceptable, but I guess it's something to consider if you play FurMark all day long.


The MSI GTX 970's VRMs get to 90°C under FurMark too.

But of course it's AMD that runs way hotter than anything else.

Seriously, people need to take their green goggles off.


----------



## Rei86

Just saying man.

Also I don't get why they couldn't have done a hybrid like cooler that EVGA did or even the 295x2. Keeping some airflow to the other components on the card would've been a blessing.


----------



## Casey Ryback

Quote:


> Originally Posted by *Rei86*
> 
> Just saying man.
> 
> Also I don't get why they couldn't have done a hybrid like cooler that EVGA did or even the 295x2. Keeping some airflow to the other components on the card would've been a blessing.


It doesn't matter; the VRMs are within safe parameters.

The hybrid EVGA is more expensive than the AMD card, so they've done an OK job with it.

While you say it's not going to be a Fury X, we've seen how good a slightly lower-spec card can be with the 980 Ti vs the Titan X.

Here's a 980 Ti in a gaming loop. 85°C, who cares.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-8.html

The EVGA 980 Ti SC ACX gets too hot.

http://forums.evga.com/980Ti-SC-ACX-20-High-temps-at-stock-speeds-m2349755.aspx

Yet AMD gets a little hot and uh oh, better go and make sure everyone knows how bad the AMD cards are.

90% of the people excessively bagging the Fury have Nvidia cards; it's no coincidence.


----------



## iCrap

Gonna have to skip this one after reading the reviews, looking at benchmarks and hearing about the coil whine / pump whine. Just not that great.


----------



## pengs

Oh hai, a Titan X in a _gaming loop_



looks for nearest exit


----------



## Casey Ryback

Quote:


> Originally Posted by *pengs*
> 
> Oh hai, a Titan X in a _gaming loop_
> 
> 
> 
> looks for nearest exit


But.....but....but AMD is bad....mmmkay.


----------



## Standards

People on the internet that shill for Nvidia (specifically OCN, LTT, etc) are like abused dogs that come back and cuddle with their owners at night. It's very sad to watch.


----------



## gigatiger

Quote:


> Originally Posted by *Casey Ryback*
> 
> But.....but....but AMD is bad....mmmkay.


Sorry, but the Titan X goes to 100°C...? So what type of cooling does it need, Iceman from X-Men?

Don't go crazy with video cards. I believe with two-way SLI or two-way CrossFire you are excellent, BUT above all else is cooling.


----------



## looniam

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pengs*
> 
> Oh hai, a Titan X in a _gaming loop_
> 
> 
> 
> looks for nearest exit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But.....but....but AMD is bad....mmmkay.
Click to expand...

point made, but at what fan speed?


oh, ~60%

I have no idea why Tom's used the Titan X's RPMs in the 980 Ti review, but it's the same here:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-6.html
I mean, looking at temps without knowing fan speeds is like looking at gaming benches without knowing core clocks... amirite?


----------



## Rei86

Quote:


> Originally Posted by *Casey Ryback*
> 
> It doesn't matter; the VRMs are within safe parameters.
> 
> The hybrid EVGA is more expensive than the AMD card, so they've done an OK job with it.
> 
> While you say it's not going to be a Fury X, we've seen how good a slightly lower-spec card can be with the 980 Ti vs the Titan X.
> 
> Here's a 980 Ti in a gaming loop. 85°C, who cares.
> 
> http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-8.html
> 
> The EVGA 980 Ti SC ACX gets too hot.
> 
> http://forums.evga.com/980Ti-SC-ACX-20-High-temps-at-stock-speeds-m2349755.aspx
> 
> Yet AMD gets a little hot and uh oh, better go and make sure everyone knows how bad the AMD cards are.
> 
> 90% of the people excessively bagging the Fury have Nvidia cards; it's no coincidence.


The point is, we've seen these hybrid-style coolers coming onto the market, and before that some people did the DIY version. We've seen people fry their VRAM because all they cooled was the GPU. Yes, I know we're talking about the VRM here, but having some airflow over it would have helped.
Quote:


> Originally Posted by *Standards*
> 
> People on the internet that shill for Nvidia (specifically OCN, LTT, etc) are like abused dogs that come back and cuddle with their owners at night. It's very sad to watch.


AMD apologists are just as annoying.


----------



## speedyeggtart

Quote:


> Originally Posted by *Sashimi*
> 
> Great humor!!


Just thought I'd post something to calm the unnecessary Nvidia vs AMD flaming going on in this thread... at the end of the day we are all PC gaming fans/enthusiasts, and we can unite for a good cause!


----------



## Klocek001

Apart from the Fury X test-drivers argument, what the deuce is wrong with these reviewers?
TechPowerUp shows the 295X2 behind the 980 Ti and almost equal to the 390X in FC4 at 1440p. Tom's Hardware shows the 295X2 miles ahead of all the competition in the same game, same resolution, with a 30 fps difference over what TechPowerUp measured. CF either works or it doesn't; it's not weather dependent.
And if I were determined to stay on the red side, after those reviews I'd be sure to stay away from the Fury and get a 295X2. The "get what is best at this very moment" argument for the 980 Ti just doesn't speak to me. Look at the 290X vs the Titan (the original one) now.


----------



## Casey Ryback

Quote:


> Originally Posted by *Klocek001*
> 
> Apart from the Fury X test-drivers argument, what the deuce is wrong with these reviewers?
> TechPowerUp shows the 295X2 behind the 980 Ti and almost equal to the 390X in FC4 at 1440p. Tom's Hardware shows the 295X2 miles ahead of all the competition in the same game, same resolution, with a 30 fps difference over what TechPowerUp measured. CF either works or it doesn't; it's not weather dependent.


The discrepancies between reviews are really interesting.

Some of the sites are using different versions of Windows, along with different systems of course. Not sure how that causes so much difference, though.


----------



## Mtom

Quote:


> Originally Posted by *Sashimi*
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-8.html
> 
> Fury X is still hotter than the others without the cooler, and the VRM is boiling even with it.


The card is below 50°C in gaming and the VRM is around 60°C; how is that boiling?
Or do you play FurMark? Because that's where the VRM hits 100°C.

I suggest checking hardware.fr.
They measured the same on the Fury X with a thermal cam... and in the same test they measured 96°C on the 980 Ti's VRM.


----------



## Klocek001

measuring card's temps while running furmark is as informative as measuring a car's fuel consumption while burning rubber on handbrake.


----------



## Noufel

Quote:


> Originally Posted by *Klocek001*
> 
> measuring card's temps while running furmark is as informative as measuring a car's fuel consumption while burning rubber on handbrake.


You didn't know? FurMark is an excellent game that most people play hours and hours all day long; it's the first game you have to use to bench a GPU.


----------



## Klocek001

there's one thing I learnt from furmark - use it if you wanna break your card.


----------



## mark_thaddeus

I was hoping the Fury X or the 980 Ti would be able to hold 60 fps, and even exceed it, at 1440p (Ultra settings and all), but looking at the benchmarks, neither can really do it in all games.

I guess I'll wait for Pascal or the AMD counterpart before upgrading!


----------



## michaelius

Quote:


> Originally Posted by *Klocek001*
> 
> Apart from the Fury X test-drivers argument, what the deuce is wrong with these reviewers?
> TechPowerUp shows the 295X2 behind the 980 Ti and almost equal to the 390X in FC4 at 1440p. Tom's Hardware shows the 295X2 miles ahead of all the competition in the same game, same resolution, with a 30 fps difference over what TechPowerUp measured. CF either works or it doesn't; it's not weather dependent.
> And if I were determined to stay on the red side, after those reviews I'd be sure to stay away from the Fury and get a 295X2. The "get what is best at this very moment" argument for the 980 Ti just doesn't speak to me. Look at the 290X vs the Titan (the original one) now.


Nothing weird in that - as long as those sites don't use canned benchmarks, they will have different test locations. Throw even a slightly CPU-demanding spot into the benchmark routine and CF starts to lose its magic.


----------



## Blameless

Quote:


> Originally Posted by *Klocek001*
> 
> there's one thing I learnt from furmark - use it if you wanna break your card.


No non-defective part will be broken by FurMark when run anywhere near spec.


----------



## Klocek001

Quote:


> Originally Posted by *Blameless*
> 
> No non-defective part will be broken by FurMark when run anywhere near spec.


Mr. Omniscience speaking here. How can you know that for sure?
Quote:


> Originally Posted by *michaelius*
> 
> Nothing weird in that - as long as those sites don't use canned benchmarks, they will have different test locations. Throw even a slightly CPU-demanding spot into the benchmark routine and CF starts to lose its magic.


all the other cards tested get very close results in both benchmarks.


----------



## Tivan

Quote:


> Originally Posted by *Blameless*
> 
> No non-defective part will be broken by FurMark when run anywhere near spec.


Running FurMark near spec is impossible without throttling, and running it while throttling defeats the purpose of using it, as I see it.

Raising the power limit by 50% to increase the max wattage drawn 50% beyond spec is not running 'near spec'.

Not saying that FurMark is worthless, but it's not even all that good for stability testing sometimes. Just run a game that knows how to use some extra power c;


----------



## Kane2207

FurMark is pretty worthless; both AMD and Nvidia have recognised the binary at the driver level and throttled the card since forever.


----------



## harney

Quote:


> Originally Posted by *curlyp*
> 
> I really want to give AMD a fair shot and use there card through an iteration until a new line comes out; however, every time I buy an AMD card and attempt to try, I have issues. The issues could be on my end, but attempting different series of cards over the course of several years points me to believe it is AMD.
> 
> First issue - Every time my monitor goes to sleep (after 15 minutes) I receive multi-colored vertical and horizontal lines. This only happens with AMD cards (currently the R9 Fury X). My GTX 970, allows it to stay on the black screen. See pictures below of the lines and the power settings.
> 
> Second issue - AMD cards never register correctly in my Device Manager, and the Catalyst software seems to be buggy for me. Under Display adapters (see picture below) it shows AMD Radeon R9 series and Intel(R) HD Graphics 4000. I've made sure I have the latest AMD driver and Catalyst version, and all my Nvidia drivers are uninstalled. I ran Fire Strike about 20 minutes ago and the 'run details' show both Radeon and Intel GPUs (see pictures below).
> 
> Lastly, I do not even see an increase in GW2 FPS. I am getting 55 FPS with the R9 Fury X which is the same as my GTX 970! Check out the picture below.
> 
> I will give it a couple more days, but I may just return the card back to Microcenter or sell it online or to someone in the OC community.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: grammatical
> edit#2: I forgot to add my monitor is a Dell U2711 2560 x 1440


Maybe disable the Intel graphics in the BIOS and see if that helps the AMD card with sleep.


----------



## youra6

Quote:


> Originally Posted by *Klocek001*
> 
> Mr. Omniscience speaking here. How can you know that for sure?
> all the other cards tested get very close results in both benchmarks.


Basically, you can't overload a GPU like you can a CPU. Normally, when you run a game or a synthetic benchmark like Heaven, the GPU is never utilized 100% (disregard what software monitoring tells you).

However, the case is different for FurMark. When you run it, the GPU is being pounded with data to process all the time. Your VRMs get fed more power than your GPU is "comfortable with," and with sustained use I can see cards going bye-bye.

It's definitely not fiction. On a side note, my R9 290X just DIED like 2 minutes ago.


----------



## iLeakStuff

Quote:


> Originally Posted by *pengs*
> 
> Oh hai, a Titan X in a _gaming loop_
> 
> 
> 
> looks for nearest exit


Fury X got that beat. I'm extremely surprised that a water-cooled card can get hotter than an air-cooled Titan. Maybe that is why full Fiji is locked to water?


----------



## tx12

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


It's better to have 100+ °C on the VRM than on the VRAM; hot VRAM chips are much worse.
VRM components are rated up to 150°C, but VRAM should stay at 85°C, or 105°C max.


----------



## flopper

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


It's water-cooled to keep it cool and silent.
Not hot and noisy like the 980 Ti.


----------



## blue1512

Quote:


> Originally Posted by *youra6*
> 
> Basically, you can't overload a GPU like you can a CPU. Normally, when you run a game or a synthetic benchmark like Heaven, the GPU is never utilized 100% (disregard what software monitoring tells you).
> 
> However, the case is different for FurMark. When you run it, the GPU is being pounded with data to process all the time. Your VRMs get fed more power than your GPU is "comfortable with," and with sustained use I can see cards going bye-bye.
> 
> It's definitely not fiction. On a side note, my R9 290X just DIED like 2 minutes ago.


That doesn't apply to nVidia cards. Their driver specifically restricts them from pulling too much power in FurMark.


----------



## blue1512

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


They removed the backplate in that test. A simple Fujipoly treatment in the VRM zone to transfer heat to the backplate would solve it.


----------



## gigatiger

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


both are too hot, way too hot...


----------



## delboy67

Quote:


> Originally Posted by *flopper*
> 
> It's water-cooled to keep it cool and silent.
> Not hot and noisy like the 980 Ti.


I wish I had copied and saved some of the posts on here from around Xmas 2013. Back then everything on OCN was different: extra VRAM didn't matter, a few percent of performance didn't matter, and a few extra dollars didn't matter. It was all about the premium-quality cooler, heat, and noise.


----------



## blue1512

Quote:


> Originally Posted by *delboy67*
> 
> I wish I had copied and saved some of the posts on here from around Xmas 2013. Back then everything on OCN was different: extra VRAM didn't matter, a few percent of performance didn't matter, and a few extra dollars didn't matter. It was all about the premium-quality cooler, heat, and noise.


It's the 290X vs 780 Ti, right? You know why: nVidia has more customers, and people normally defend their decision, no matter whether it's right or wrong.


----------



## delboy67

Quote:


> Originally Posted by *blue1512*
> 
> It's the 290X vs 780 Ti, right? You know why: nVidia has more customers, and people normally defend their decision, no matter whether it's right or wrong.


Yes, the 290X. It was far too hot and loud to consider over the competition even in quiet mode, the reference cooler was too cheap and plastic-y, and the extra VRAM did nothing, almost a gimmick for bragging rights. Hmm.


----------



## Tivan

Quote:


> Originally Posted by *curlyp*
> 
> Lastly, I do not even see an increase in GW2 FPS. I am getting 55 FPS with the R9 Fury X which is the same as my GTX 970!


MMOs tend to be CPU/API limited, so wait for Vulkan/DX12 or get a better CPU c: (though of course, even the best CPU is limiting in an MMO with a bunch of players around.)


----------



## BigTree

I'm still wondering what Joe Macri meant when he said:
Quote:


> You'll be able to overclock this thing like no tomorrow.
> This is an overclocker's dream.


Does AMD have something up its sleeve, or is it just hot air?


----------



## Klocek001

Quote:


> Originally Posted by *youra6*
> 
> It's definitely not fiction. On a side note, my R9 290X just DIED like 2 minutes ago.


This is why I'm on the prowl for a GPU; my 290 Trix is slowly running out of warranty. Shame on Sapphire for only 24 months.


----------



## MadRabbit

Quote:


> Originally Posted by *BigTree*
> 
> I'm still wondering what Joe Macri meant when he said:
> Does AMD have something up its sleeve, or is it just hot air?


If it had been Taylor saying that, it would be just hot air in and out, but Macri...


----------



## blue1512

Quote:


> Originally Posted by *BigTree*
> 
> I'm still wondering what Joe Macri meant when he said:
> Does AMD have something up its sleeve, or is it just hot air?


Voltage control is coming in the next Afterburner, Trixx, or GPU Tweak. It's the same IR controller as on the 290X, after all.

And HardOCP achieved 37°C at 1140 MHz core when turning the fan up to 100% (acceptable noise). Fury X will not be held back by temps, for sure.

Furthermore, it has an 8+8-pin power setup, so it will have a lot of juice to play with.

To be honest, it has many features of an overclocker's dream. Let's hope the Fiji silicon won't let us down.


----------



## The Stilt

Quote:


> Originally Posted by *tx12*
> 
> VRM components are rated up to 150°C, but VRAM should stay at 85°C, or 105°C max.


Yeah, the FETs used on the Fury X indeed have a TjMax of 150°C... at zero load.

IRF6894M derating curve:

25°C = 160A
50°C = 150A
75°C = 130A
100°C = 108A
125°C = 75A
150°C = 0A

The VRM controller will trip the overheating protection at 127°C and force shutdown at 134°C.
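Those derating figures lend themselves to a quick estimate. A minimal sketch - not a datasheet lookup, just linear interpolation over the numbers quoted above - of the allowed FET current at a given temperature:

```python
# Linear interpolation over the IRF6894M derating figures quoted above.
# Illustrative only; a real derating curve comes from the datasheet.

DERATING = [(25, 160), (50, 150), (75, 130), (100, 108), (125, 75), (150, 0)]

def max_current(temp_c):
    """Estimate allowed current (A) at a given FET temperature (deg C)."""
    if temp_c <= DERATING[0][0]:
        return DERATING[0][1]
    if temp_c >= DERATING[-1][0]:
        return DERATING[-1][1]
    for (t0, a0), (t1, a1) in zip(DERATING, DERATING[1:]):
        if t0 <= temp_c <= t1:
            # interpolate between the two bracketing points
            return a0 + (temp_c - t0) / (t1 - t0) * (a1 - a0)

print(max_current(90))    # ~117 A
print(max_current(127))   # ~69 A, right at the overheat-protection trip point
```

By that reading, a FET sitting near the 127°C trip point can only deliver less than half the current it could at 50°C, which is why the protection kicks in well before TjMax.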


----------



## Themisseble

https://www.youtube.com/watch?v=jMeZUNuaBVI
It's Dying Light, and the R9 Fury should be slower than the GTX 980 Ti.
Please look at VRAM usage!


----------



## curlyp

Quote:


> Originally Posted by *youra6*
> 
> It's definitely not fiction. On a side note, my R9 290X just DIED like 2 minutes ago.


That sucks







R.I.P
Quote:


> Originally Posted by *Klocek001*
> 
> This is why I'm on the prowl for a GPU; my 290 Trix is slowly running out of warranty. Shame on Sapphire for only 24 months.


That's why I use my American Express to purchase graphics cards (or anything I want covered): it extends the warranty by one year after the manufacturer's warranty ends.


----------



## XxOsurfer3xX

Quote:


> Originally Posted by *Themisseble*
> 
> https://www.youtube.com/watch?v=jMeZUNuaBVI
> It's Dying Light, and the R9 Fury should be slower than the GTX 980 Ti.
> Please look at VRAM usage!


That is one of my main concerns about this card: 4K + 4GB is not great for really demanding games. That Dying Light framerate is worse than my overclocked 980's.


----------



## criminal

Quote:


> Originally Posted by *Themisseble*
> 
> https://www.youtube.com/watch?v=jMeZUNuaBVI
> It's Dying Light, and the R9 Fury should be slower than the GTX 980 Ti.
> Please look at VRAM usage!


That is interesting. VRAM usage was not bad at all.


----------



## redshoulder

I have 6950 CF and 780 SLI setups; my thought on this card is that 4GB of VRAM is a bit on the low side.

I got burnt getting the 780 SLI - its VRAM is also low (at the time people said 3GB was plenty), and 6 months later the Ti version was released.

It seems that waiting for the 290X was the better option for me: more VRAM, and the cards last longer because of the whole Kepler driver update fiasco.


----------



## Mopar63

Quote:


> Originally Posted by *harney*
> 
> Go with 3440x1440 ultra-wide: great for gaming, excellent for movies in 2.35+ aspect ratios... I had my doubts at first, until I tried it; now I will never look back. I recommend the Dell curved one.


I have made the move purely to ultra-wide and think 4K is just lame for gaming immersion by comparison. While I LOVE my LG 3440x1440 curved, I find my gaming experience better on my LG 2560x1080 Freesync. The resolution makes it fairly easy to get a single card frame rate above 40 and thus utilizing Freesync and everything is butter smooth.

Now get me a 3440x1440 IPS that is curved with Freesync and I am so there!!!!
Quote:


> Originally Posted by *redshoulder*
> 
> Have 6950cf and 780sli, my thoughts with this card are that 4gb vram is a bit on the low side.


The lower VRAM is something to consider, but testing is showing it might not be as big a deal as it's being made out to be. Under high VRAM usage, the speed of the HBM architecture seems to mitigate the smaller capacity.


----------



## Devnant

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


Seems like a defective product? According to Guru 3D temps are not nearly that high:
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


----------



## Cool Mike

I agree. One or two driver updates will bump the performance for sure.


----------



## Kylar182

I would love to see this in Quadfire with a 5960x and voltage unlocks. I run 7680x1440p and I'm so ready for DX12 or OpenGL 4.5+ to be the norm. Even the most optimized games at the moment crush at least one core at max settings.


----------



## Boomstick727

Quote:


> Originally Posted by *Devnant*
> 
> Seems like a defective product? According to Guru 3D temps are not nearly that high:
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


Agreed, and they had removed the backplate as well. Obviously whatever they did affected temps because, like you said, the temps in other reviews were fine.


----------



## SpeedyVT

Quote:


> Originally Posted by *Boomstick727*
> 
> Agreed, they had removed the backplate as well. Obviously whatever they did afftected temps because like you said in other reviews were fine.


I'm thinking they blocked the circulation of the rad's airflow.


----------



## iLeakStuff

Quote:


> Originally Posted by *blue1512*
> 
> They removed the backplate in that test. A simple Fujipoly treatment in the VRM zone to transfer heat to the backplate will solve it


Quote:


> Originally Posted by *Devnant*
> 
> Seems like a defective product? According to Guru 3D temps are not nearly that high:
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


You guys are absolutely right. Hardware Canucks also got like 60°C max with an infrared camera.


----------



## provost

Ok, this one is for the gpu gurus/computer engineers/uber geeks out there.









Can someone please explain to me the correlation between gpu cores and performance?

So for example, 290x has 2816 gpu cores and it is now nearing Titan Black's performance which makes sense. 980 has 2048 cores vs Titan Black's 2880 cores, but it performs better than Titan Black. I don't know how many cores 970 has, but that also performs better than Titan Black.

Furyx has an astounding 4096 cores vs Titan x 3072 cores, but it barely matches it.

How much of the performance that the end user sees depends on software, bios, drivers vs raw hardware power?

And, please no vague black box answer such as "architectural efficiency"... Lol

And, if that is the answer, how does it breakdown between software and hardware performance?

Tks in advance .


----------



## Piddeman

AMD has really good drivers now; I hope they can get better performance out of the Fury X drivers too. Right now it seems I made the right choice: I bought a new Asus R9 290X for 159 USD.


----------



## SpeedyVT

Quote:


> Originally Posted by *provost*
> 
> Ok, this one is for the gpu gurus/computer engineers/uber geeks out there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone please explain to me the correlation between gpu cores and performance?
> 
> So for example, 290x has 2816 gpu cores and it is now nearing Titan Black's performance which makes sense. 980 has 2048 cores vs Titan Black's 2880 cores, but it performs better than Titan Black. I don't know how many cores 970 has, but that also performs better than Titan Black.
> 
> Furyx has an astounding 4096 cores vs Titan x 3072 cores, but it barely matches it.
> 
> How much of the performance that the end user sees depends on software, bios, drivers vs raw hardware power?
> 
> And, please no vague black box answer such as "architectural efficiency"... Lol
> 
> And, if that is the answer, how does it breakdown between software and hardware performance?
> 
> Tks in advance .


Cores differ between architectures, so raw core counts are not a good measure of how powerful a card is; too few cores does hurt, though, as each core is responsible for feeding data within its cluster.

AMD's performance issues are not core related, but driver related. They need to work overtime!
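To put a rough number on why core counts mislead: a common back-of-the-envelope figure is peak FP32 throughput = shaders × 2 ops per clock (an FMA counts as two) × clock. The sketch below uses approximate reference clocks (my assumption), and peak TFLOPS ignores ROPs, geometry, and driver overhead, which is exactly why it doesn't map 1:1 onto game performance:

```python
# Back-of-the-envelope peak FP32 throughput. Clock speeds are
# approximate reference clocks (assumption), not measured boost clocks.

def peak_tflops(shaders, clock_ghz):
    """Peak FP32 TFLOPS: shaders x 2 ops/clock (FMA) x clock in GHz."""
    return shaders * 2 * clock_ghz / 1000

cards = {
    "R9 Fury X": (4096, 1.05),
    "Titan X":   (3072, 1.00),
    "GTX 980":   (2048, 1.22),
}

for name, (shaders, ghz) in cards.items():
    print(f"{name}: {peak_tflops(shaders, ghz):.1f} TFLOPS peak FP32")
```

On paper the Fury X has roughly 40% more peak throughput than the Titan X, yet in games it barely matches it - the gap is eaten elsewhere in the pipeline and in the driver.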


----------



## harney

Quote:


> Originally Posted by *iLeakStuff*
> 
> You guys are absolutely right. Hardware Canucks also got like 60°C max with an infrared camera.


I am more than sure the photos of the hot ones came from running some form of FurMark... even with the plates on, the heat would show up at the top if it were the same.


----------



## flopper

Quote:


> Originally Posted by *harney*
> 
> i am more than sure the photos of the hot ones is they where running some form of fur mark .....even with the plates on heat would show up at the top if it where the same


That French site is pro-Nvidia beyond belief; they actively try to sabotage anything regarding AMD in any way they can.
But sure, there is no conspiracy.


----------



## CrazyElf

Quote:


> Originally Posted by *provost*
> 
> Ok, this one is for the gpu gurus/computer engineers/uber geeks out there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone please explain to me the correlation between gpu cores and performance?
> 
> So for example, 290x has 2816 gpu cores and it is now nearing Titan Black's performance which makes sense. 980 has 2048 cores vs Titan Black's 2880 cores, but it performs better than Titan Black. I don't know how many cores 970 has, but that also performs better than Titan Black.
> 
> Furyx has an astounding 4096 cores vs Titan x 3072 cores, but it barely matches it.
> 
> How much of the performance that the end user sees depends on software, bios, drivers vs raw hardware power?
> 
> And, please no vague black box answer such as "architectural efficiency"... Lol
> 
> And, if that is the answer, how does it breakdown between software and hardware performance?
> 
> Tks in advance .


There isn't as strong a correlation because the cores are not the bottleneck here.

Basically it comes down to:

- They increased the shaders and TMUs by about 45%, but kept the ROPs at 64 for both the 290X and the Fury X, suggesting the card is ROP bottlenecked; the ACE count was also kept at 8 (perhaps they should have had 12), the same as the 290X.
- It's not clear whether the card is 1/4 FP64 or 1/16 FP64, but FP64 hardware uses power (Maxwell is 1/32 FP64, which is partly why it is more power efficient than Kepler - the original Titan was effectively an entry-level compute card).
- The other problem is that Nvidia made big power-efficiency improvements transitioning from SMX to SMM (they split the crossbar and clearly have much better power gating). AMD did realize some power savings from HBM and from GCN 1.0 to 1.2, but not enough to match.

The end result is a card with less performance per watt and, worse, less performance per mm².

I've got a thread on it:
http://www.overclock.net/t/1562121/could-amd-have-made-a-faster-gpu-with-96-rop-no-fp64-and-perhaps-8gb-of-vram/0_100
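The ROP point is easy to see in numbers. A hedged sketch, using the unit counts from this thread and approximate reference clocks (my assumption): shader throughput scales with shaders × clock, while pixel (ROP) throughput scales with ROPs × clock.

```python
# Compare per-unit throughput scaling from the 290X to the Fury X.
# Clocks are approximate reference clocks (assumption); the point is
# the ratio, not the absolute figures.

r9_290x = {"shaders": 2816, "rops": 64, "ghz": 1.00}
fury_x  = {"shaders": 4096, "rops": 64, "ghz": 1.05}

def gains(new, old):
    """Return (shader gain, ROP gain) as multipliers over the old card."""
    shader = (new["shaders"] * new["ghz"]) / (old["shaders"] * old["ghz"])
    rop = (new["rops"] * new["ghz"]) / (old["rops"] * old["ghz"])
    return shader, rop

shader_gain, rop_gain = gains(fury_x, r9_290x)
print(f"shader throughput: +{(shader_gain - 1) * 100:.0f}%")  # ~+53%
print(f"ROP throughput:    +{(rop_gain - 1) * 100:.0f}%")     # ~+5%
```

If a scene is fillrate (ROP) bound, that ~5% is roughly the ceiling on the gain over a 290X, no matter how many extra shaders sit idle.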

Quote:


> Originally Posted by *flopper*
> 
> That French site is pro-Nvidia beyond belief; they actively try to sabotage anything regarding AMD in any way they can.
> But sure, there is no conspiracy.


You know, I've been called an AMD fanboy, but that's no conspiracy. That is a very hot running VRM.

There are several sites confirming this. Hardware.fr is a pretty solid site when it comes to reviews.
Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


The scary thing is, they took the temperature behind the PCB. What are the actual MOSFET temperatures? As I noted earlier, perhaps 20°C higher.

The other question I have is, for those of you advocating for voltage unlocks to overclock, do you really think that you'll get much headroom with the VRMs overheating? You'll need a full coverage waterblock, or at least some custom VRM cooling.

Quote:


> Originally Posted by *youra6*
> 
> Basically, you can't overload a GPU like you can a CPU. Normally when you run a game or a synthetic benchmark like Heaven the GPU, it's never utilized 100% (disregard what software monitoring tells you).
> 
> However, the case is different for Furmark. When you run it, the GPU is being pounded with data to process all the time. Your VRMs gets feed more power than what your GPU is "comfortable with." And with sustained use, I can see cards going bye bye.
> 
> Its definitely not fiction. On a side note, my r9 290X just DIED like 2 minutes ago.


Yeah I agree. Furmark and OCCT can indeed fry cards. See the above on VRMs - they're running dangerously hot as is.


----------



## nakano2k1

Quote:


> Originally Posted by *CrazyElf*
> 
> There are several sites confirming this. Hardware.fr is a pretty solid site when it comes to reviews.
> The scary thing is, they took the temperature behind the PCB. What are the actual Mosfet temperatures running? As I noted earlier, perhaps 20C higher.
> 
> The other question I have is, for those of you advocating for voltage unlocks to overclock, do you really think that you'll get much headroom with the VRMs overheating? You'll need a full coverage waterblock, or at least some custom VRM cooling.
> Yeah I agree. Furmark and OCCT can indeed fry cards. See the above on VRMs - they're running dangerously hot as is.


Wait... There was a backplate to cool the card, right? Why would you take it off and then complain about the temps??


----------



## Tivan

Quote:


> Originally Posted by *nakano2k1*
> 
> Wait... There was a backplate to cool the card, right?


Nope. It's for looks - there are no thermal pads or fins or anything.

edit: I do think, however, that if a game could load the card as heavily as was needed to get it that hot, the card would be seriously faster than any other card around.


----------



## looniam

Quote:


> Originally Posted by *nakano2k1*
> 
> Wait... There was a backplate to cool the card, right? Why would you take it off and then complain about the temps??


because you can't get an accurate vrm temp reading with a backplate on. i would think there ought to be a hot spot on the backplate where the vrms are.

so i wonder: if having the backplate on shows ~50c but ~100c with it off, is the backplate actually cooling anything? is the copper tube over the vrms better than adequate? would some thermal tape be a wise consideration?


----------



## Blameless

Quote:


> Originally Posted by *Tivan*
> 
> Raising power limit by 50% to increase max wattage drawn by 50% beyond specs is not running 'near spec'


Indeed.

Still, most cards will handle unthrottled FurMark at stock clocks and voltages if cooling is allowed to ramp up, and I'm highly disdainful of parts that cannot.

Contrary to popular belief, GPUs, even consumer ones, aren't just toys only suitable for playing games. I wouldn't accept a CPU that could not run any code it was capable of executing 24/7 at stock speeds, and I won't accept less from a GPU.

We've certainly had some parts that weren't engineered sufficiently to handle peak loads, but the overall failure rate on those must have been terrible, because succeeding generations of GPUs have generally had more robust VRMs.

If it ends up taking less than fifteen thousand hours of unthrottled Furmark to kill a typical Fury X, the card isn't worth the PCB it's printed on.
Quote:


> Originally Posted by *Tivan*
> 
> Not saying that Furmark is worthless, but it's not even all that good for stability testing sometimes. Just run a game that knows to utilize some extra power c;


FurMark isn't a good stability test because it doesn't have an artifact checker. It's still a damn good way to find out whether a part has a non-defective VRM and sufficient VRM cooling.
Quote:


> Originally Posted by *Kane2207*
> 
> Furmark is pretty worthless, both AMD and Nvidia recognise the binary at driver level and throttle the card since forever.


FurMark hasn't been specifically throttled by AMD drivers in quite some time. Just altering power limit is enough to prevent most AMD GPUs from throttling in FurMark.
Quote:


> Originally Posted by *youra6*
> 
> Basically, you can't overload a GPU like you can a CPU. Normally when you run a game or a synthetic benchmark like Heaven the GPU, it's never utilized 100% (disregard what software monitoring tells you).
> 
> However, the case is different for Furmark. When you run it, the GPU is being pounded with data to process all the time. Your VRMs gets feed more power than what your GPU is "comfortable with." And with sustained use, I can see cards going bye bye.


I've killed a few cards with FurMark and OCCT (which is even hotter than FurMark, often by a lot), but modern reference GPUs, if uncapped in power and fan speed limits, can almost always handle them indefinitely.

Indeed, a long OCCT or FurMark run is one of the first things I do on any new GPU to see if it's defective or not.
Quote:


> Originally Posted by *The Stilt*
> 
> Yeah, the fets used on Fury X indeed have TjMax of 150°C... with zero load.
> 
> IRF6894M derating curve:
> 
> 25°C = 160A
> 50°C = 150A
> 75°C = 130A
> 100°C = 108A
> 125°C = 75A
> 150°C = 0A
> 
> The VRM controller will trip the overheating protection at 127°C and force shutdown at 134°C.


Yep, which means that once people start really OCing these things, VRM temps are going to need to be kept around 75C, or below, for reliable operation.


----------



## rv8000

Quote:


> Originally Posted by *Blameless*
> 
> Indeed.
> 
> Still, most cards will handle unthrottled FurMark at stock clocks and voltages if cooling is allowed to ram up, and I'm highly disdainful of parts that cannot.
> 
> Contrary to popular belief, GPUs, even consumer ones, aren't just toys only suitable for playing games. I wouldn't accept a CPU that could not run any code it was capable of executing 24/7 at stock speeds, and I won't accept less from a GPU.
> 
> If certainly had some parts that weren't engineered sufficiently to handle peak loads, but the overall failure rate on those must have been terrible, because succeeding generations of GPUs have generally had more robust VRMs.
> 
> If it ends up taking less than fifteen thousand hours of unthrottled Furmark to kill a typical Fury X, the card isn't worth the PCB it's printed on.
> FurMark isn't a good stability test because it doesn't have an artifact checker. It's still a damn good way to find out if a part has non-defective VRM and sufficent VRM cooling.
> FurMark hasn't been specifically throttled by AMD drivers in quite some time. Just altering power limit is enough to prevent most AMD GPUs from throttling in FurMark.
> I've killed a few cards with FurMark and OCCT (which is even hotter than FurMark, often by a lot), but modern reference GPUs, if uncapped in power and fan speed limits, can almost always handle them indefinitely.
> 
> Indeed, a long OCCT or FurMark run is one of the first things I do on any new GPU to see if it's defective or not.
> Yep, which means that once people start really OCing these things, VRM temps are going to need to be kept around 75C, or below, for reliable operation.


While you generally make very good points, why would you purposely put a GPU in scenarios it clearly isn't meant for? The Fury X is not a workstation card, a server-farm card, or anything of the sort. That's like buying a cellphone, throwing it in a bathtub, and seeing how long it takes to die; it's not meant to be used in those operating conditions.

On a side note, the longer this thread goes on, the more carried away people get with this temperature thing. Every thermal-imaging picture being used as an example is under FurMark or OCCT, not loads this card is meant to run 24/7; under normal operating conditions the card and VRMs are anywhere from 30-40°C cooler at max temps. Even with moderate OCs, these extreme circumstances would not apply.


----------



## provost

Quote:


> Originally Posted by *CrazyElf*
> 
> There isn't as strong a correlation because the cores are not the bottleneck here.
> 
> Basically it comes down to:
> 
> They increased the shaders and TMUs by about 45%, but they kept the ROPs at 64 for both the 290X and Fury X, suggesting that the card is ROP bottlenecked; 8 ACEs too were kept (perhaps they should have had 12)


Thanks. I guess the question is how important ROPs are, given that they bumped up every other hardware performance feature such as shaders, TMUs, and bandwidth? Maybe there are early teething issues with memory management, since ROPs tie into memory management?
Again, I have absolutely zero knowledge of these matters, so these may be elementary questions, but I am still curious about the performance difference between the Fury X and the Titan X, given that the Fury X looks much better on paper. The power efficiency of Maxwell is not a big concern for me as a desktop PC gamer, though I think I understand that better efficiency leads to higher clocks, which also factors into the TMU/shader/fill-rate/ROP equation. Of course, better efficiency is better... lol


----------



## iinversion

Quote:


> Originally Posted by *provost*
> 
> Ok, this one is for the gpu gurus/computer engineers/uber geeks out there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone please explain to me the correlation between gpu cores and performance?
> 
> So for example, 290x has 2816 gpu cores and it is now nearing Titan Black's performance which makes sense. 980 has 2048 cores vs Titan Black's 2880 cores, but it performs better than Titan Black. I don't know how many cores 970 has, but that also performs better than Titan Black.
> 
> Furyx has an astounding 4096 cores vs Titan x 3072 cores, but it barely matches it.
> 
> How much of the performance that the end user sees depends on software, bios, drivers vs raw hardware power?
> 
> And, please no vague black box answer such as "architectural efficiency"... Lol
> 
> And, if that is the answer, how does it breakdown between software and hardware performance?
> 
> Tks in advance .


In addition to what people already have told you:

AMD cards are also gimped by their drivers. If they had less CPU overhead, the Fury X would likely outperform the Titan X based on current numbers.

Essentially, AMD cards perform worse in CPU-bound situations no matter the CPU because of the driver scheduler. More driver overhead = the CPU has to work harder.

http://forums.guru3d.com/showthread.php?t=398858

The above is testing in a few games with the same CPUs. It shows the FPS you get with an i7-4790K and then an i3, with both an Nvidia GPU and an AMD GPU.



Notice how the AMD GPU severely tanks with the i3 but the Nvidia GPU only loses a few FPS. This is because the i3 can't overcome the driver overhead for the AMD GPU. This is only one picture pulled from the article, you can check the rest yourself.

This is exactly why you see Nvidia cards performing WAY better in certain games as they are usually CPU bound games where AMD drivers just have too much overhead and cause the cards to fall way behind. WoW is a good example. If you take a look at recent WoW benchmarks for WoD expansion you will see how hard it hits the AMD cards. It's not because WoW runs better on an Nvidia GPU, but because AMD drivers have much more CPU overhead.

AMD GPU's themselves are good products. I really hope they can catch up in terms of drivers sometime..

Things may get better for AMD in this aspect with DX12, but no one can say for sure.
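The effect described above can be captured by a toy frame-time model: frame rate is limited by whichever of GPU time or CPU time (game work plus driver work) is longer, so a "thick" driver costs more on a slow CPU. All millisecond figures below are invented purely for illustration, not measurements of any real driver:

```python
# Toy model: frame rate = 1000 ms / max(GPU time, CPU game time + driver time).

def fps(gpu_ms, cpu_game_ms, driver_ms):
    frame_ms = max(gpu_ms, cpu_game_ms + driver_ms)
    return 1000.0 / frame_ms

GPU_MS = 10.0            # hypothetical GPU render time (100 fps ceiling)
LOW_OVERHEAD = 2.0       # hypothetical "thin" driver
HIGH_OVERHEAD = 6.0      # hypothetical "thick" driver

for label, cpu_ms in [("fast i7-like", 6.0), ("slow i3-like", 10.0)]:
    thin = fps(GPU_MS, cpu_ms, LOW_OVERHEAD)
    thick = fps(GPU_MS, cpu_ms, HIGH_OVERHEAD)
    print(f"{label}: thin driver {thin:.1f} fps, thick driver {thick:.1f} fps")
```

With the fast CPU the thick driver drops 100 → ~83 fps, but with the slow CPU it drops ~83 → 62.5 fps - the slower CPU "can't overcome the driver overhead," matching the i3 behaviour in the linked benchmarks.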


----------



## The Stilt

Quote:


> Originally Posted by *Blameless*
> 
> Yep, which means that once people start really OCing these things, VRM temps are going to need to be kept around 75C, or below, for reliable operation.


Not really, as there are six phases for the GPU, each capable of supplying that amount.
So 450A at 125°C in theory.
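That 450A figure follows directly from the derating table quoted upthread (75A per FET at 125°C × six phases). A quick sketch that linearly interpolates the table, purely as an illustration and not a substitute for the datasheet:

```python
# Total current the six GPU phases could supply at a given VRM temperature,
# linearly interpolating The Stilt's IRF6894M derating points.

DERATING = [(25, 160), (50, 150), (75, 130), (100, 108), (125, 75), (150, 0)]
PHASES = 6

def per_fet_amps(temp_c):
    # Walk consecutive table points and interpolate within the matching segment.
    for (t0, a0), (t1, a1) in zip(DERATING, DERATING[1:]):
        if t0 <= temp_c <= t1:
            return a0 + (a1 - a0) * (temp_c - t0) / (t1 - t0)
    raise ValueError("temperature outside derating table")

def total_amps(temp_c):
    return PHASES * per_fet_amps(temp_c)

print(total_amps(125))  # 450.0 - matches The Stilt's figure
print(total_amps(75))   # 780.0 at Blameless's suggested 75°C ceiling
```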


----------



## harney

Quote:


> Originally Posted by *iinversion*
> 
> In addition to what people already have told you:
> 
> AMD cards are also gimped by their drivers. If they had less CPU overhead they the Fury X would likely outperform the Titan X based on current numbers.
> 
> Essentially AMD cards perform worse no matter the CPU when presented with a CPU bound situation because of the driver scheduler. More driver overhead = CPU has to work harder
> 
> http://forums.guru3d.com/showthread.php?t=398858
> 
> The above is testing in a few games with the same CPU's. It shows the FPS you get with a i7-4790K and then an i3 with both an Nvidia GPU and an AMD GPU.
> 
> 
> 
> Notice how the AMD GPU severely tanks with the i3 but the Nvidia GPU only loses a few FPS. This is because the i3 can't overcome the driver overhead for the AMD GPU. This is only one picture pulled from the article, you can check the rest yourself.
> 
> This is exactly why you see Nvidia cards performing WAY better in certain games as they are usually CPU bound games where AMD drivers just have too much overhead and cause the cards to fall way behind. WoW is a good example. If you take a look at recent WoW benchmarks for WoD expansion you will see how hard it hits the AMD cards. It's not because WoW runs better on an Nvidia GPU, but because AMD drivers have much more CPU overhead.
> 
> AMD GPU's themselves are good products. I really hope they can catch up in terms of drivers sometime..
> 
> Things may get better for AMD in this aspect with DX12, but no one can say for sure.


Did I miss something? The video shows a 4970K. Where is my 4970K? I have a 4790K. Maybe some parallel-worlds internet is getting crossed here









Or it could be them damn Chinese with their Intel counterfeits

http://www.pcpowerplay.com.au/review/intel-i74970k,391334


----------



## Blameless

Quote:


> Originally Posted by *rv8000*
> 
> While you generally make very good points, why would you purposely put a GPU in scenarios it clearly isn't meant for.


Several reasons:

1. I have the (not always fulfilled) expectation that when performance figures are advertised, running code that approaches those figures won't immolate the part.

2. I'm not a typical user, and I do things other than game on my cards. There is no disclaimer about what I am or am not allowed to run on these parts.

3. The standards for reliability of my other consumer-grade hardware, except perhaps for some shoddy HDDs, have no such compromises. A CPU that cannot execute any code it's capable of running, 24/7, 365 days a year, for at least the entire warranty period is defective. GPUs should be no different, especially with both AMD and NVIDIA pushing GPGPU for the last decade.
Quote:


> Originally Posted by *The Stilt*
> 
> Not really as there are six phases for the GPU, each capable supplying that amount.
> So 450A at 125°C in theory.


Ah, I had thought you were referring to the VRM as a whole, which did seem a bit low.

Guess I should have Googled the part number first!


----------



## iinversion

Quote:


> Originally Posted by *harney*
> 
> Did i miss some thing the video shows 4970k where is my 4970K i have a 4790K maybe some parallel worlds internet is getting crossed here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn Chinese with there Intel counterfeits
> 
> http://www.pcpowerplay.com.au/review/intel-i74970k,391334


Lol think it was just a typo


----------



## pengs

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


The problem is that that was a Titan X through a gaming loop. Where is the source for this picture, and is it a stress test or a normal load?


----------



## Blameless

Quote:


> Originally Posted by *pengs*
> 
> Where is the source to this picture and is it stress test or gaming loop?


It's a dedicated stress test roughly equivalent to FurMark, so not comparable to gaming loads.


----------



## tx12

Quote:


> Originally Posted by *The Stilt*
> 
> Fury X has throttling temperature limit of 75°C so it might limit maximum OC when voltage is adjusted.
> OTP limit for the GPU is 79°C.


How do they plan to release an air-cooled Fury with a 75°C limit? An extra 15°C of headroom, or even more, is needed for air cooling.
If they're forced to keep the GPU at 75-80°C, the Fury Pro is going to be the noisiest card ever made.


----------



## Forceman

Quote:


> Originally Posted by *Devnant*
> 
> Seems like a defective product? According to Guru 3D temps are not nearly that high:
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


I'm looking at that picture again - why is nothing in that shot hot? Nothing on the motherboard, not the tubing, not the cables, nothing.

Edit: oh, that's the front of the card. Did they post a picture of the back?


----------



## toncij

Quote:


> Originally Posted by *provost*
> 
> Ok, this one is for the gpu gurus/computer engineers/uber geeks out there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone please explain to me the correlation between gpu cores and performance?
> 
> So for example, 290x has 2816 gpu cores and it is now nearing Titan Black's performance which makes sense. 980 has 2048 cores vs Titan Black's 2880 cores, but it performs better than Titan Black. I don't know how many cores 970 has, but that also performs better than Titan Black.
> 
> Furyx has an astounding 4096 cores vs Titan x 3072 cores, but it barely matches it.
> 
> How much of the performance that the end user sees depends on software, bios, drivers vs raw hardware power?
> 
> And, please no vague black box answer such as "architectural efficiency"... Lol
> 
> And, if that is the answer, how does it breakdown between software and hardware performance?
> 
> Tks in advance .


I'm a CS/software engineer and can try to explain, but I'll keep it simple:

The rendering pipeline is a complex thing, and the underlying hardware architecture is similar but not identical between Nvidia and AMD. We have vertex shaders, geometry shaders, compute, pixel shaders, etc. All of these run on the unified shader cores and differ in the shader code used. The more shader cores, the more matrix calculations can be performed at a time. The performance of this part depends heavily on how those tasks are scheduled, how the architecture in question performs operations, what the latency is, and how often cores sit idle (a core waiting on the previous command before it can continue to the next), but also on how the cores and core groups are designed and grouped with other units, how the SIMD ALUs are designed, fetch/decode, etc.

GPUs have multiple hardware units. ROPs (Raster Operations) perform the final pipeline stage, rasterization and blending, processing the frame buffer, which consists of multiple buffers such as the z-buffer, the stencil buffer, and other pipeline-end buffers whose task is to render primitives on the screen with regard to their position, alpha values, stencil mask, etc. In essence, you need more ROPs if you have more pixels to blend, which is mostly driven by post-processing effects like blur, motion blur, sharpening, SSAO, stencil shadow calculations, and anti-aliasing.

TL;DR: different architectures perform rather differently on different tasks. That is why AMD may win at high resolutions and Nvidia at lower ones. In OpenCL compute, for example, the AMD architecture is vastly superior and obliterates Nvidia... it all depends on how well each task fits the underlying architecture.

It is really a case-by-case thing and depends heavily on how complex the shaders are, how they're arranged, and what type of shaders are executed and when.

Quote:


> Originally Posted by *iLeakStuff*
> 
> Fury X got that beat, Im extremely surprised that a water cooled card can get hotter than air cooled Titan. Maybe that is why full Fiji is locked to water?


This looks like nothing short of a malicious or really incompetent testing result. You can't make the card that hot by mistake....


----------



## rv8000

Quote:


> Originally Posted by *Blameless*
> 
> Several reasons:
> 
> 1. I have the (not always fulfilled) expectation that when performance figures are advertised that running code that approaches those figures won't immolate the part.
> 
> 2. I'm not a typical user, and I do things other than game on my cards. There is no disclaimer about what I am or am not allowed to run on these parts.
> 
> 3. The standards for reliability of my other consumer grade hardware, except perhaps for some shoddy HDDs, have no such compromises. A CPU that cannot execute any code cable of running on it, 24/7, 365.25 for at least the entire warranty period is defective. GPUs should not be so different, especially with both AMD and NVIDIA pushing GPGPU for the last decade.
> Ah, I had thought you were referring to the VRM as a whole, which did seem a bit low.
> 
> Guess I should have Googled the part number first!


Do most GPGPU-based programs create this kind of load? That is, the ones actually used for content creation, research, etc., not just those hammering the cards as hard as possible?

While you're not the average user, forums like this and review sites will be the first things to pop up in Google searches for "average" people who may be looking to buy this hardware. Without disclaimers, and with everyone making a huge deal of this heat situation, we're creating a gross misrepresentation of heat during a true gaming load. And let's be honest, a lot of people take things at face value and don't apply any critical thinking, especially on the internet; I see it happen every day.


----------



## Casey Ryback

Quote:


> Originally Posted by *toncij*
> 
> This looks nothing short of a malicious or really incompetent testing result. You can't make the card that hot by mistake....


They probably ran FurMark for an extended period of time.

The top cards from both teams just run hot.

Here's the 980 Ti review; it shows 85°C+ on a 10-minute gaming loop.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-8.html

They get 83°C on the core (through the software reading, I assume).

These are reference designs, though.


----------



## looniam

Quote:


> Originally Posted by *Forceman*
> 
> I'm looking at that picture again - why is nothing in that shot hot? Nothing on the motherboard, not the tubing, not the cables, nothing.
> 
> Edit: oh, that's the front of the card. Did they post a picture of the back?


the cpu vrms, x99 chipset, top of the card and tubes seem to have some 40-44ish temps. maybe a "blown up" shot?


i'm sure your screen is somewhat calibrated . .no?









edit: nvm just saw your edit.











http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,12.html


----------



## toncij

Quote:


> Originally Posted by *iinversion*
> 
> In addition to what people already have told you:
> 
> AMD cards are also gimped by their drivers. If they had less CPU overhead they the Fury X would likely outperform the Titan X based on current numbers.
> 
> Essentially AMD cards perform worse no matter the CPU when presented with a CPU bound situation because of the driver scheduler. More driver overhead = CPU has to work harder
> 
> http://forums.guru3d.com/showthread.php?t=398858
> 
> The above is testing in a few games with the same CPU's. It shows the FPS you get with a i7-4790K and then an i3 with both an Nvidia GPU and an AMD GPU.
> 
> 
> 
> Notice how the AMD GPU severely tanks with the i3 but the Nvidia GPU only loses a few FPS. This is because the i3 can't overcome the driver overhead for the AMD GPU. This is only one picture pulled from the article, you can check the rest yourself.
> 
> This is exactly why you see Nvidia cards performing WAY better in certain games as they are usually CPU bound games where AMD drivers just have too much overhead and cause the cards to fall way behind. WoW is a good example. If you take a look at recent WoW benchmarks for WoD expansion you will see how hard it hits the AMD cards. It's not because WoW runs better on an Nvidia GPU, but because AMD drivers have much more CPU overhead.
> 
> AMD GPU's themselves are good products. I really hope they can catch up in terms of drivers sometime..
> 
> Things may get better for AMD in this aspect with DX12, but no one can say for sure.


Driver and API overhead really is one of the things holding AMD back, and it should get much better with DX12 and Vulkan; you can already see the difference with Mantle in BF4 and Thief. In general, what you see there, and even a tiny bit more, should be realistic for AMD in the future.

The CPU itself is not the only problem. Sometimes even the fastest CPU available can't overcome a monstrous driver/API overhead. Anyway, as he told you, CPU speed can really help. This is the article I wrote just yesterday on part of the problem, the draw-call issue and the CPU's relation to it: https://medium.com/@toncijukic/draw-calls-in-a-nutshell-597330a85381

Unfortunately for AMD, the new APIs will help Nvidia too. I'm not sure by how much; we need to wait for the final Windows 10, final drivers from AMD and Nvidia for that operating system, and a good synthetic benchmark designed to exercise the most modern graphics features on those new APIs.

What is certain is that the new APIs will help AMD more than they help Nvidia, and in that specific part of the performance equation practically level them. It depends on how much of AMD's deficit versus Nvidia (if any!) is down to API overhead. If a lot, then AMD will get a big boost. If a small amount, then a small boost it will be.

The catch is that to actually gain anything, you need games that actually use the new APIs. You won't see much performance gain, if any, in DX11 games.


----------



## toncij

Quote:


> Originally Posted by *Blameless*
> 
> Several reasons:
> 
> 1. I have the (not always fulfilled) expectation that when performance figures are advertised that running code that approaches those figures won't immolate the part.
> 
> 2. I'm not a typical user, and I do things other than game on my cards. There is no disclaimer about what I am or am not allowed to run on these parts.
> 
> 3. The standards for reliability of my other consumer grade hardware, except perhaps for some shoddy HDDs, have no such compromises. A CPU that cannot execute any code cable of running on it, 24/7, 365.25 for at least the entire warranty period is defective. GPUs should not be so different, especially with both AMD and NVIDIA pushing GPGPU for the last decade.
> Ah, I had thought you were referring to the VRM as a whole, which did seem a bit low.
> 
> Guess I should have Googled the part number first!


Usually only games push GPUs to 99% load. You rarely get that with video processing (I can't get more than 50% on my Titan X when working on 4K video). But even with perfectly optimized GPGPU work (fluid sims, for example), all these cards should work perfectly stably 24/7 at stock clocks. Temperatures of 105°C are fine; most components don't shut down or fail before 120°C.


----------



## Forceman

You see a lot of posts talking about how AMD's DX11 overhead kills performance, but you also see people talking about how AMD's Windows 10 drivers greatly improve the overhead in DX11. Is it not possible to test some DX11 games on the Windows 10 preview?


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> You see a lot of posts talking about how AMD's DX11 overhead kills performance, but you also see people talking about how AMD's Windows 10 drivers greatly improve the overhead in DX11. Is it not possible to test some DX11 games on the Windows 10 preview?


You can install these drivers on win7


----------



## provost

Quote:


> Originally Posted by *CrazyElf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> There isn't as strong a correlation because the cores are not the bottleneck here.
> 
> Basically it comes down to:
> 
> They increased the shaders and TMUs by about 45%, but they kept the ROPs at 64 for both the 290X and Fury X, suggesting that the card is ROP bottlenecked; 8 ACEs too were kept (perhaps they should have had 12)
> It's not clear whether or not the card is 1/4 FP64 or 1/16 FP64, but that uses power (Maxwell is 1/32 FP64, which is partly why it is more power efficient than Kepler - the Titan was effectively an entry level compute card)
> The other problem is that Nvidia made big power efficiency improvements transitioning from the SMX to SMM (they split the crossbar and clearly have a much better power gating). AMD did realize some power savings from HBM and from GCN 1.0 to 1.2, but not enough to match.
> The end result has been a card that has less performance per watt, and worse, less performance per mm^2.
> 
> I've got a thread on it:
> http://www.overclock.net/t/1562121/could-amd-have-made-a-faster-gpu-with-96-rop-no-fp64-and-perhaps-8gb-of-vram/0_100
> You know, I've been called an AMD fanboy, but that's no conspiracy. That is a very hot running VRM.
> 
> There are several sites confirming this. Hardware.fr is a pretty solid site when it comes to reviews.
> The scary thing is, they took the temperature behind the PCB. What are the actual Mosfet temperatures running? As I noted earlier, perhaps 20C higher.
> 
> The other question I have is, for those of you advocating for voltage unlocks to overclock, do you really think that you'll get much headroom with the VRMs overheating? You'll need a full coverage waterblock, or at least some custom VRM cooling.
> Yeah I agree. Furmark and OCCT can indeed fry cards. See the above on VRMs - they're running dangerously hot as is.


Quote:


> Originally Posted by *iinversion*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> In addition to what people already have told you:
> 
> AMD cards are also gimped by their drivers. If they had less CPU overhead they the Fury X would likely outperform the Titan X based on current numbers.
> 
> Essentially AMD cards perform worse no matter the CPU when presented with a CPU bound situation because of the driver scheduler. More driver overhead = CPU has to work harder
> 
> http://forums.guru3d.com/showthread.php?t=398858
> 
> The above is testing in a few games with the same CPU's. It shows the FPS you get with a i7-4790K and then an i3 with both an Nvidia GPU and an AMD GPU.
> 
> 
> 
> Notice how the AMD GPU severely tanks with the i3 but the Nvidia GPU only loses a few FPS. This is because the i3 can't overcome the driver overhead for the AMD GPU. This is only one picture pulled from the article, you can check the rest yourself.
> 
> This is exactly why you see Nvidia cards performing WAY better in certain games as they are usually CPU bound games where AMD drivers just have too much overhead and cause the cards to fall way behind. WoW is a good example. If you take a look at recent WoW benchmarks for WoD expansion you will see how hard it hits the AMD cards. It's not because WoW runs better on an Nvidia GPU, but because AMD drivers have much more CPU overhead.
> 
> AMD GPU's themselves are good products. I really hope they can catch up in terms of drivers sometime..
> 
> Things may get better for AMD in this aspect with DX12, but no one can say for sure.


Quote:


> Originally Posted by *toncij*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I'm a cs/sw/ engineer and can try to explain, but I'll keep it simple:
> 
> Something called rendering pipeline is a complex thing and underlying hw architecture is similar, but different from nvidia to amd. We have vertex, geometry shaders, computing, pixel etc. All those are used on hw unified shader cores and differ in shader code used. The higher the number of shader cores, more matrix calculations can be performed at a time. The performance of this part heavily depends on how those tasks are scheduled and how the architecture in question performs operations, what is the latency and how much those are in idle state (core waiting for command previous to continue to command next) but also on how the cores and core groups are designed and grouped with other units, how SIMD ALUs are designed, fetch/decodes, etc.
> 
> GPUs have multiple hw units and ROPs are Raster Operations that perform final pipeline stage called rasterization or blending operations of processing frame buffer which consists of multiple buffers like z-buffer, stencil buffer and other pipeline end buffers that have a task of rendering primitives on the screen in regard to their position, alpha values, mask in stencil, etc. In essence you need more ROPs if you have more pixels to blend, mostly affected by post-processing effects like blur, motion blur, sharpening, SSAO, stencil shadow calculations, anti-aliasing.
> 
> TL;DR: different architectures perform rather differently on different tasks. That is why at high resolutions AMD may win, at lower Nvidia. In OpenCL computing AMD architecture is vastly superior and obliterates Nvidia, for example... it all depends on how each task is fit for the underlying architecture.
> 
> It is really on case to case basis and heavily depends on how complex, how arranged and what type of shaders are executed and when.
> This looks nothing short of a malicious or really incompetent testing result. You can't make the card that hot by mistake....


Thanks to all of you for taking the time to explain this to me. I still don't think I have a clue about what you all are talking about, but that's on me, not you..









My simple layman question boils down to this:

How can Nvidia get more performance out of less hardware, thereby spending less money, than AMD?
Clearly, it's costing more for AMD to put all that hardware stuff (that I don't understand, so I call it hardware stuff







) on the PCB, thereby resulting in a lower $ variable CM per unit, compared to Nvidia, which can do more with less? Does this make any sense to you smart folks?


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> You can install these drivers on win7


So why hasn't anyone else done it and reported results, if it makes such a difference?


----------



## Thoth420

Hey all....a few pages back there was some talk of max power draw of the Fury X. Am I correct that, not overclocked, a single Fury X consumes a max of 375 watts (75 from the PCI-E slot and 150 from each 8-pin, which totals 375)? I am just wondering about a gaming environment @ 1440 reso, not benchmark programs etc.

I am just wondering if I should consider a new PSU since I tend to like them to be a bit overkill and run at around 50 to 60% load. I think this EVGA G2 850 would be running a bit past that while gaming and definitely once I get to OC'ing.

The reason I ask this potentially stupidly obvious question is because I read 450 watts a few pages back and my eyes went a bit wide.
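The connector math in the question above can be sketched quickly. These are the PCIe spec budgets, not measured draws; as later posts in the thread point out, cards can pull more than the spec numbers.

```python
# PCIe spec power budgets (per the PCI-SIG CEM spec), not measured draws.
PCIE_SLOT_W = 75    # x16 slot, per spec
EIGHT_PIN_W = 150   # each 8-pin PEG connector, per spec

# Fury X board layout: one slot + two 8-pin connectors
spec_limit = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(spec_limit)  # 375
```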


----------



## DividebyZERO

Quote:


> Originally Posted by *Forceman*
> 
> So why hasn't anyone else done it and reported results, if it makes such a difference?


Because the Fury is a Myth. I think maybe 6 people own them now?


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> So why hasn't anyone else done it and reported results, if it makes such a difference?


They are kinda unstable. And there is a whole thread about performance increases.

Check here:

http://forums.guru3d.com/showthread.php?t=399956


----------



## The Stilt

Quote:


> Originally Posted by *tx12*
> 
> How do they plan to release air cooled fury with 75 deg. C limit? Extra 15 deg. C or even more is needed for air cooling.
> In the case they'll be forced to keep 75-80 deg. C on GPU Fury pro is going to be the noisiest card ever made.


Obviously it's not going to be 75°C for Fury and Fury Nano.
The 295X2 also has a 75°C limit, while the 290 and 290X cards have 95°C.

Most likely the temperature limit is caused by the pump specifications.
Most water pumps have quite strict temperature limits.

Depending on cooling solution vendor the temperature limit is either 60 or 67°C.
Wonder if there is a difference between the two different cooling solutions on Fury X.


----------



## Forceman

Quote:


> Originally Posted by *DividebyZERO*
> 
> Because the Fury is a Myth. I think maybe 6 people own them now?


I mean test them in general, not just on the Fury. I just want to see some numbers behind the "wait for Win 10" mantra.
Quote:


> Originally Posted by *sugarhell*
> 
> They are kinda unstable. And there is a whole thread about performance increases.
> 
> Check here:
> 
> http://forums.guru3d.com/showthread.php?t=399956


Not seeing the massive gains people are implying will come with Win 10 there.


----------



## Thoth420

Quote:


> Originally Posted by *DividebyZERO*
> 
> Because the Fury is a Myth. I think maybe 6 people own them now?


I own one, it just hasn't arrived yet....Monday! I will try and give a review ASAP about the sound with the rad installed *correctly*, wondering if some of the complaints are PEBKAC or if people got those bad samples the review sites mentioned. I sure won't accept any sound outside the scope of a normal AIO cooler, and I have played with 5 different ones so I know what to expect.


----------



## DividebyZERO

Quote:


> Originally Posted by *sugarhell*
> 
> They are kinda unstable. And there is a whole thread about performance increases.
> 
> Check here:
> 
> http://forums.guru3d.com/showthread.php?t=399956


Hacked drivers?


----------



## sugarhell

Quote:


> Originally Posted by *DividebyZERO*
> 
> Hacked drivers?


.1040 are the latest windows 10 drivers. And they are hacked to work with win7/8 too


----------



## Shogon

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all....a few pages back there was some talk of max power draw of the Fury X. I am I correct that not overclocked a single Fury X consumes a max of 375 watts (75 from the PCI-E lane and 150 from each 8 pin which total 375)? I am just wondering about in a gaming enviroment @ 1440 reso not benchmark programs etc.
> 
> I am just wondering if I should consider a new PSU since I tend to like them to be a bit overkill and run at around 50 to 60% load. I think this EVGA G2 850 would be running a bit past that while gaming and definitely once I get to OC'ing.
> 
> The reason I ask this potentially stupidly obvious question is because I read 450watt a few pages back and my eyes went a bit wide.


With a single Fury X, that 850W is more than enough power. Unless the Fury X is a power-hungry monster, that is, but I doubt it, honestly. Even when voltage is unlocked and you flash a custom BIOS on the thing, it shouldn't cause you any issues. Maybe if you consider CrossFire, but even then an 850W unit like that shouldn't have any issues, even with CFX. I'm running my Titan X on a 3-year-old 650W power supply, and even using a BIOS that increases the voltage to 1.255v - 1.274v, I barely hit 530W total power usage while gaming or benchmarking. Average has been under 500W, and typically in the mid-to-low 400s according to my UPS.


----------



## Thoth420

Quote:


> Originally Posted by *Shogon*
> 
> With a single Fury X that 850W is more than enough power. Unless the Fury X is a power hungry monster that is, but I doubt it all honestly. Even when voltage is unlocked and you flash a custom bios on the thing it shouldn't cause you any issues. Maybe if you consider CrossFire that is, but even then an 850W unit like that shouldn't have any issues even with CF X. I'm running my Titan X on a 3 year old 650W power supply and even with using a bios that increases the voltage to 1.255v - 1.274v I barely hit 530W total power usage while gaming or benchmarking. Average has been under 500W and typically in the mid to low 400's according to my UPS.


OK thanks, it has always been a great unit; I'm just overworrying....I tend to do that with a new build. Cheers


----------



## RagingCain

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> Because the Fury is a Myth. I think maybe 6 people own them now?
> 
> 
> 
> I mean test them in general, not just on the Fury. I just want to see some numbers behind the "wait for Win 10" mantra.
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> They are kinda unstable. And there is a whole thread about performance increases.
> 
> Check here:
> 
> http://forums.guru3d.com/showthread.php?t=399956
> 
> Click to expand...
> 
> Not seeing the massive gains people are implying will come with Win 10 there.
Click to expand...

Buying a video card now, for hypothetical gains later?

How about not buying the card now, and if hypothetical gains appear, purchasing the card after AMD's prices drop? It's literally a win-win for a consumer.

Anybody who buys now on that premise is making an emotional decision, not a logical one.


----------



## mav451

Quote:


> Originally Posted by *MadRabbit*
> 
> If it would have been Taylor saying that it would be just hot air in and out but Macri...


Taylor huh


----------



## Shogon

Quote:


> Originally Posted by *Thoth420*
> 
> OK thanks, it has always been a very great unit I am just overworrying....I tend to do that with a new build. Cheers


I can understand worrying about it, but at least you have what I consider a proper power supply, not something that was "cheap" and affordable. I did that in the past, and now I only choose brands based on quality platforms like Super Flower, Seasonic, or the occasional Flextronics. I've thought about putting my Corsair AX860 in my PC just because it's relatively new and isn't as old as this Kingwin. I've been too lazy to switch them, though, and even with custom BIOSes I don't appear to have many issues as far as power goes. If I had a custom 980 Ti, or any card with a dual-BIOS switch, I would certainly change the power supplies out







.


----------



## gamervivek

Quote:


> Originally Posted by *sugalumps*
> 
> Well if the peoples predictions for dx12 gains are the same as their predictions for the fury "destroying" the ti then I am not to hopefull


Quote:


> Originally Posted by *Thoth420*
> 
> Hey all....a few pages back there was some talk of max power draw of the Fury X. I am I correct that not overclocked a single Fury X consumes a max of 375 watts (75 from the PCI-E lane and 150 from each 8 pin which total 375)? I am just wondering about in a gaming enviroment @ 1440 reso not benchmark programs etc.
> 
> I am just wondering if I should consider a new PSU since I tend to like them to be a bit overkill and run at around 50 to 60% load. I think this EVGA G2 850 would be running a bit past that while gaming and definitely once I get to OC'ing.
> 
> The reason I ask this potentially stupidly obvious question is because I read 450watt a few pages back and my eyes went a bit wide.


It doesn't work like that; PCIe specs them at that level, but that doesn't mean they can't draw a fair bit more. That's why the 295X2 can use up to 500W with only two 8-pins.

450W for what exactly?


----------



## Tivan

Quote:


> Originally Posted by *RagingCain*
> 
> Buying a video card now, for hypothetical gains later?
> 
> How about not buying the card now, and if hypothetical gains appear, purchase the card after AMD prices drop. Its literally a win win for a consumer.
> 
> Anybody who buys now on that premise is making an emotional decision, not a logical one.


It is not illogical to buy a slightly inferior product on the performance scale if you value other factors more highly than performance. As for making an emotional decision: some factors that aren't performance-based are emotional, but that doesn't make the decision as a whole illogical, as long as enough factors are considered.

Logic includes attributing value to variables such as emotional gratification.

Of course there's also people who buy FuryX cards over 980ti's, who are neutral to the performance difference, or favoring Nvidia slightly due to it, as well as not particularly attached to either camp.

Too many factors to consider.

As for strictly the question you raised at the start of your post. Again, there's multiple factors involved, but emotional attachment to AMD does not magically make a card perform better. So I wouldn't say that making that speculation is necessarily an emotional one. Emotional attachment to AMD makes you buy a card and not mind that it is weaker all factors considered. Justifying your purchase by saying that it will be better later, can be motivated by factors that aren't emotional. Regardless of how right or wrong they are.

Sure, if you're emotionally extremely attached to AMD, and want to get the best performance, you might resort to that argument with no good basis for it. But I just don't see people making that argument like that.

Anyway, just wanted to clarify. Since calling things logical or illogical that aren't exactly like that, can grind some people's gears. = D
I'm neutral on this 'the last 2 AMD flagships have seen massive gains in performance over driver iterations, so the FuryX must follow' (or whatever you want to add to it) point. It might just be a dud, who knows. I'm not buying GPUs for over 200 bucks anyway.


----------



## Thoth420

Quote:


> Originally Posted by *gamervivek*
> 
> It doesn't work like that, PCIE specs them at that level, doesn't mean that they can't draw a fair bit more. That's why 295x2 can use upto 500W using only 2 8-pin.
> 
> 450W for what exactly?


Thank you for that explanation. Edit: Re-read and realized...derp, you already answered that.

I just meant: without overclocking, and while gaming (not benching or folding etc.), would it ever draw 450W? If so, why? What parameters would cause that?

I have been waiting for an excuse to buy the 1000 watt be quiet! Dark Power Pro, so I'm kinda looking for one...an excuse


----------



## Kane2207

Quote:


> Originally Posted by *Forceman*
> 
> So why hasn't anyone else done it and reported results, if it makes such a difference?


Because it's all dependent on DX12 games, of which there are none.


----------



## sugarhell

Quote:


> Originally Posted by *Kane2207*
> 
> Because it's all dependent on DX12 games, of which there are none.


Not really. WDDM 2.0 will give some performance increase with DX11. Also, the new Win10 drivers are quite a bit faster than 15.15


----------



## Forceman

Quote:


> Originally Posted by *Kane2207*
> 
> Because it's all dependent on DX12 games, of which there are none.


No, I'm talking about the DX11 overhead reduction that the Win 10 drivers supposedly contain. People have posted 3DMark API test results to show how the draw call limit is improved, and you see vague posts along the lines of "wait for Win 10 because the overhead reduction will allow the AMD cards to outperform" but I haven't seen any games results. Do games not work in Win 10 right now? It's still free right?
Quote:


> Originally Posted by *sugarhell*
> 
> Not really. Wddm 2.0 will give some performance increase with dx11. *Also new win10 drivers vs 15.15 are quite a bit faster*


Exactly what I mean. So where is the evidence?


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> No, I'm talking about the DX11 overhead reduction that the Win 10 drivers supposedly contain. People have posted 3DMark API test results to show how the draw call limit is improved, and you see vague posts along the lines of "wait for Win 10 because the overhead reduction will allow the AMD cards to outperform" but I haven't seen any games results. Do games not work in Win 10 right now? It's still free right?
> Exactly what I mean. So where is the evidence?


What? Did you even read the thread? The overhead increase vs 15.5 is over 300k draw calls. Even AMD CPUs are close to 1 million, up from 0.6-0.7 million.

We are talking about a general performance increase, not per-game.

Also, this is the latest Win10 driver. You need to check the other threads comparing 15.5 with the Win10 drivers; this one compares the latest Win10 driver to the previous one, so you will not see a big performance increase there.

Also

https://www.youtube.com/watch?v=XzFe5OOHZko
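The figures quoted above (0.6-0.7 million draw calls per second on 15.5 vs roughly 1 million on the Win10 driver, per the thread, not my own measurements) work out to a sizeable synthetic gain:

```python
# Back-of-envelope on the draw-call numbers claimed in the post above.
old_calls = 0.65e6   # midpoint of the 0.6-0.7M/s range claimed for the 15.5 driver
new_calls = 1.0e6    # figure claimed for the Windows 10 driver

gain = (new_calls - old_calls) / old_calls
print(f"{gain:.0%}")  # 54%
```

Whether a ~54% synthetic draw-call increase translates into in-game FPS is exactly the open question being argued here.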


----------



## Blameless

Quote:


> Originally Posted by *rv8000*
> 
> Do most GPGPU base programs create this load? That is those that are actually used for content creation, research etc, and not just hammering the cards as much as possible?


Really depends on the program and how it utilizes the card.

Folding on a GPU tends to be more load intensive than games, and hashing, though not quite at FurMark levels, is more demanding still, assuming you are tuning your settings for peak performance. This is especially true on high bandwidth parts with shader heavy architectures.
Quote:


> Originally Posted by *rv8000*
> 
> And let's be honest, a lot of people take things at face value and do not apply any sort of critical thinking


True, but that's more an issue with the observer than the presenter of information.
Quote:


> Originally Posted by *sugarhell*
> 
> You can install these drivers on win7


Windows 7, Windows 8.1, and Windows 10 all use different WDDM revisions and this has an impact on overhead even with the same driver version.


----------



## provost

Quote:


> Originally Posted by *gamervivek*
> 
> It doesn't work like that, PCIE specs them at that level, doesn't mean that they can't draw a fair bit more. That's why 295x2 can use upto 500W using only 2 8-pin.
> 
> 450W for what exactly?


OT
How many Furyx can I power with these:









Spoiler: Warning: Spoiler!



http://s1364.photobucket.com/user/p...f-4488-a32a-7199f7eb16f3_zpspbe7ipaa.jpg.html

Ax 1500i and Seasonic 550 not shown, as they are currently busy... lol


----------



## Kane2207

Quote:


> Originally Posted by *sugarhell*
> 
> Not really. Wddm 2.0 will give some performance increase with dx11. Also new win10 drivers vs 15.15 are quite a bit faster


Ah, I stand corrected then, thanks









Anyone benched any games in Win 10 then? I thought AMD and Nvidia both had drivers out, surely some site has covered this?


----------



## toncij

Even if it's possible (it may be, but also might not be; it depends on the WDDM differences), the DirectX 11 driver on Windows 10 can't be faster than the one on Windows 8 or 7, any more than it will be once the same optimizations are implemented everywhere.

The WDDM platform might give some advantage, but that will probably be minimal. The new WDDM mostly enables DirectX 12 and a new driver.

The real advantage of Windows 10 is DirectX 12, because of drastic changes in the API and the driver that runs it.

A DirectX 11 game is not a DirectX 12 game. It will not magically start using the newer API and become faster just because DirectX 12 is on the system.

Sent from my iPhone using Tapatalk


----------



## Redwoodz

Quote:


> Originally Posted by *RagingCain*
> 
> Buying a video card now, for hypothetical gains later?
> 
> How about not buying the card now, and if hypothetical gains appear, purchase the card after AMD prices drop. Its literally a win win for a consumer.
> 
> Anybody who buys now on that premise is making an emotional decision, not a logical one.


Buying a card within +/- 2% of the top GPU in performance, one that has possible future potential, is not. Funny how much effort some are willing to put into altering someone's purchase decisions.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Redwoodz*
> 
> Buying a card within the top +/- 2% of any gpu in performance that has possible future potential is not. Funny how much effort some are willing to go to alter someone's purchase decisions.


2%? The Fury X gets beaten pretty handily at every res but 4K, and no single GPU is enough for 4K anyway, so that res is a bit of a moot point. When you bring overclocking into it, there isn't even a contest.

Everyone where i live is out of stock of Ti's/ FX's, so i have no choice but to wait a while. Hopefully there will be newer drivers and unlocked voltage soon, then we can see what the best card really is


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> What? Did you even read the thread? The overhead increase vs 15.5 is over 300 k draw calls. Even with amd cpus are close to 1 million from 0,6-0,7.
> 
> We are talking about in general performance increase not per game.
> 
> Also this is the last win10 driver. You need to check the other threads comparing the 15.5 with the win10 drivers. This one is the latest win10 driver vs the previous one. SO you will not see a big performance increase.
> 
> Also
> 
> https://www.youtube.com/watch?v=XzFe5OOHZko


And where does any testing show that draw call increase translating into gaming improvements? That's what I'm looking for, and that I haven't seen. Why "wait for Win 10" if the only advantage is a synthetic draw call increase?
Quote:


> Originally Posted by *Kane2207*
> 
> Anyone benched any games in Win 10 then? I thought AMD and Nvidia both had drivers out, surely some site has covered this?


That's what I'm looking for also. Where do these new driver improvements translate into improved gaming performance, and why has no one tested it?


----------



## Blameless

Quote:


> Originally Posted by *toncij*
> 
> Even if possible (probably may be, but also might not, depends on WDM differences) the DirectX 11 driver on Windows 10 can't be faster than that one on Windows 8 or 7 any more than it will be with same optimizations when those are implemented everywhere.
> 
> WDM platform might give some advantage, but that will probably be minimal. The new WDM just enables DirectX 12 and a new driver.
> 
> Real advantage of Windows 10 is DirectX 12 because of drastic changes in the API and driver that runs it.
> 
> DirectX 11 game is not a DirectX 12 game. It will not magically start using newer API and become faster just because DirectX 12 is on the system.


Significant improvements in overhead of AMD drivers have been demonstrated in Windows 10 in some D3D*11* games.

The newest Windows 7 and 8.1 drivers do not show similar results. I don't know if the same improvements are possible in these older versions of Windows, or if AMD will ever get around to enabling them.

Advantages of WDDM 2.0 are not entirely limited to DX12. Indeed, many of them shouldn't be limited to DirectX at all. It's a new driver model with far-reaching impacts. DX12 is designed around WDDM 2.0 and requires WDDM 2.0, but they are not the same thing.


----------



## Forceman

Quote:


> Originally Posted by *Blameless*
> 
> Significant improvements in overhead of AMD drivers have been demonstrated in Windows 10 in some D3D*11* games.


You have any links? That's exactly what I'm looking for.


----------



## CrazyElf

Quote:


> Originally Posted by *Blameless*
> 
> Several reasons:
> 
> 1. I have the (not always fulfilled) expectation that when performance figures are advertised that running code that approaches those figures won't immolate the part.
> 
> 2. I'm not a typical user, and I do things other than game on my cards. There is no disclaimer about what I am or am not allowed to run on these parts.
> 
> 3. The standards for reliability of my other consumer-grade hardware, except perhaps for some shoddy HDDs, have no such compromises. A CPU that cannot execute any code capable of running on it, 24/7, 365.25 days a year, for at least the entire warranty period is defective. GPUs should not be so different, especially with both AMD and NVIDIA pushing GPGPU for the last decade.
> Ah, I had thought you were referring to the VRM as a whole, which did seem a bit low.
> 
> Guess I should have Googled the part number first!


I think it's still best to keep the VRMs below 75C though, if only to lower the risk of failure.

But yeah I agree with the assessment that at least several hours (say 12 hours) of OCCT with artifact checking is the correct approach. There are other uses as you've noted - Mining was an example of one in the past.

They do have 6x IR6811/IR6894. To their credit, AMD did include adequate VRM capacity on this PCB. I'm more worried about the probability of failure in a couple of years down the line.

Quote:


> Originally Posted by *provost*
> 
> Thanks to all of you for taking the time to explain this to me. I don't think I still got a clue about what you all are talking about, but that's on me, not you..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My simple layman question boils down to this:
> 
> How can Nvidia get more performance out of less hardware, thereby spending less money, than AMD?
> Clearly, its costing more for AMD to put all that hardware stuff (that I don't understand, so I call it hardware stuff
> 
> 
> 
> 
> 
> 
> 
> ) on the pcb, and thereby resulting in lower $ variable CM per unit , compared to Nvidia that can do more with less? Does this make any sense to you smart folks?


Basically it comes down to this: AMD did not address the bottlenecks, and as a result the performance of this card got penalized.

Nvidia has however spent a lot more on R&D, and sadly, it does show (check the power efficiency in terms of performance per watt from Kepler to Maxwell). I'm not saying AMD did not improve (they actually did considerably and in some ways, their engineers are to be commended for achieving so much on a smaller budget), but they didn't improve as much as Nvidia.

Quote:


> Originally Posted by *Redwoodz*
> 
> Buying a card within the top +/- 2% of any gpu in performance that has possible future potential is not. Funny how much effort some are willing to go to alter someone's purchase decisions.


Like it or not, emotions tend to dominate. It's a tribal mentality I am afraid. You can see fans on both sides trying to defend their "side".

Quote:


> Originally Posted by *toncij*
> 
> This looks nothing short of a malicious or really incompetent testing result. You can't make the card that hot by mistake....


As alarming as this may sound to you, this is common.
http://www.legitreviews.com/xfx-radeon-r9-290-double-dissipation-video-card-review_138612/12



The XFX DD coolers have become notorious for their inadequate VRM cooling. _Appallingly, this was while gaming!_ Anything more intense and the GPU would have shut down.

Personally, I think whoever designed and shipped this needs their head checked, but this is on an open bench at room temperature ambient we are talking about.


----------



## SpeedyVT

Quote:


> Originally Posted by *CrazyElf*
> 
> There isn't as strong a correlation because the cores are not the bottleneck here.
> 
> Basically it comes down to:
> 
> They increased the shaders and TMUs by about 45%, but they kept the ROPs at 64 for both the 290X and Fury X, suggesting that the card is ROP bottlenecked; 8 ACEs too were kept (perhaps they should have had 12), which is the same number as the 290X
> It's not clear whether or not the card is 1/4 FP64 or 1/16 FP64, but that uses power (Maxwell is 1/32 FP64, which is partly why it is more power efficient than Kepler - the Titan was effectively an entry level compute card)
> The other problem is that Nvidia made big power efficiency improvements transitioning from the SMX to SMM (they split the crossbar and clearly have a much better power gating). AMD did realize some power savings from HBM and from GCN 1.0 to 1.2, but not enough to match.
> The end result has been a card that has less performance per watt, and worse, less performance per mm^2.
> 
> I've got a thread on it:
> http://www.overclock.net/t/1562121/could-amd-have-made-a-faster-gpu-with-96-rop-no-fp64-and-perhaps-8gb-of-vram/0_100
> You know, I've been called an AMD fanboy, but that's no conspiracy. That is a very hot running VRM.
> 
> There are several sites confirming this. Hardware.fr is a pretty solid site when it comes to reviews.
> The scary thing is, they took the temperature behind the PCB. What are the actual Mosfet temperatures running? As I noted earlier, perhaps 20C higher.
> 
> The other question I have is, for those of you advocating for voltage unlocks to overclock, do you really think that you'll get much headroom with the VRMs overheating? You'll need a full coverage waterblock, or at least some custom VRM cooling.
> Yeah I agree. Furmark and OCCT can indeed fry cards. See the above on VRMs - they're running dangerously hot as is.


The temperature thing was proven false, or it was a bad GPU.

The core configuration 4096:256:*64* is the biggest core config of the set. It doesn't exceed the ROP cycles. 96 ROPs could've alleviated some stress; however, it would require a rework of the current design under GCN 1.2. 4096/64 = 64. If the card were any bigger, the ROPs would have to match it.

Oddly, when I took Nvidia's core count and divided it from its highest denominator down to its lowest, 2816:176:96 failed to produce a whole number, which means the specs are not right.... (980 Ti). However, the 980 is correctly divisible by its lowest. It typically should produce a whole number

Unified Shaders (A), Texture Mapping Units (B), Render Output Units (C)

A:B:C
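The divisibility check described above can be reproduced directly. Whether the shader count "should" divide evenly by the ROP count is the poster's claim, not a hardware rule; this just redoes the arithmetic, with the GTX 980's published 2048:128:64 config added for comparison.

```python
# Shader:TMU:ROP configs quoted in the thread.
configs = {
    "Fury X":  (4096, 256, 64),
    "980 Ti":  (2816, 176, 96),
    "GTX 980": (2048, 128, 64),
}

# For each card, check whether shaders divide evenly by ROPs.
for name, (shaders, tmus, rops) in configs.items():
    whole = shaders % rops == 0
    print(f"{name}: {shaders}/{rops} = {shaders / rops:g} (whole: {whole})")
```

As the post says, 4096/64 = 64 exactly, while 2816/96 is not a whole number.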


----------



## toncij

Quote:


> Originally Posted by *Blameless*
> 
> Significant improvements in overhead of AMD drivers have been demonstrated in Windows 10 in some D3D*11* games.
> 
> The newest Windows 7 and 8.1 drivers do not show similar results. I don't know if the same improvements are possible in these older versions of Windows, or if AMD will ever get around to enabling them.
> 
> Advantages of WDDM 2.0 are not entirely limited to DX12. Indeed, many of them shouldn't be limited to DirectX at all. It's a new driver model, with far reaching impacts. DX12 is designed around WDDM 2.0 and required WDDM 2.0, but they are not the same thing.


I'm not really sure at what moment during reading my comments you could even remotely read that I think it is the same thing.

But, unlike what you can read from my posts, I see that you're not really clear on what WDDM is or what DirectX is. One does not simply "design DirectX around WDDM", let alone Direct3D; rather, one designs WDDM in a way that makes Direct3D/DX12 possible.

While your wild claims are here, you show no proof of that whatsoever...
Quote:


> Originally Posted by *Forceman*
> 
> You have any links? That's exactly what I'm looking for.


I doubt he has, but I would love to see those. My own tests show zero (0) improvement in several cases.


----------



## RagingCain

Quote:


> Originally Posted by *Tivan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> Buying a video card now, for hypothetical gains later?
> 
> How about not buying the card now, and if hypothetical gains appear, purchase the card after AMD prices drop. Its literally a win win for a consumer.
> 
> Anybody who buys now on that premise is making an emotional decision, not a logical one.
> 
> 
> 
> It is not illogical to buy a slightly inferior product on the performance scale, if you value other values higher than performance. As for making an emotional decision, some factors that aren't performance based, are emotional. But that doesn't make the decision as a whole illogical, as long as enough factors are considered.
> 
> Logic includes attributing value to variables such as emotional gratification.
> 
> Of course there's also people who buy FuryX cards over 980ti's, who are neutral to the performance difference, or favoring Nvidia slightly due to it, as well as not particularly attached to either camp.
> 
> Too many factors to consider.
> 
> As for strictly the question you raised at the start of your post. Again, there's multiple factors involved, but emotional attachment to AMD does not magically make a card perform better. So I wouldn't say that making that speculation is necessarily an emotional one. Emotional attachment to AMD makes you buy a card and not mind that it is weaker all factors considered. Justifying your purchase by saying that it will be better later, can be motivated by factors that aren't emotional. Regardless of how right or wrong they are.
> 
> Sure, if you're emotionally extremely attached to AMD, and want to get the best performance, you might resort to that argument with no good basis for it. But I just don't see people making that argument like that.
> 
> Anyway, just wanted to clarify. Since calling things logical or illogical that aren't exactly like that, can grind some people's gears. = D
> I'm neutral on this 'the last 2 AMD flagships have seen massive gains in performance over driver iterations, so the FuryX must follow' (or whatever you want to add to it) point. It might just be a dud, who knows. I'm not buying GPUs for over 200 bucks anyway.
Click to expand...

Quote:


> Originally Posted by *Redwoodz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> Buying a video card now, for hypothetical gains later?
> 
> How about not buying the card now, and if hypothetical gains appear, purchase the card after AMD prices drop. Its literally a win win for a consumer.
> 
> Anybody who buys now on that premise is making an emotional decision, not a logical one.
> 
> 
> 
> Buying a card within the top +/- 2% of any gpu in performance that has possible future potential is not. Funny how much effort some are willing to go to alter someone's purchase decisions.
Click to expand...

You guys obviously need some clarification.

A.) If you buy a card based on facts, i.e. how it runs right now, that is a decision at least based on logic.

B.) If you buy a card based on magical fairy dust performance at some later date, on some future OS with some future driver, you are not basing your decision on logic.

I merely suggested if you used B to solely justify your purchase, you should wait to see if it pans out before buying it, that way you can save money by getting it cheaper at that future date. There might even be a more powerful card with more VRAM in the future.

@Redwoodz
Keep on fighting that good fight against anyone who disagrees with your world views on GPU purchases. Took me all of a minute to suggest people be logical with their purchases. I nearly sprained a finger exerting all that effort. <- I can't stress this enough, if I was any more sarcastic I would pee blood.


----------



## 2010rig

Quote:


> Originally Posted by *Forceman*
> 
> You have any links? That's exactly what I'm looking for.


Just take AMD's word for it, they would never mislead anyone, ever.
Quote:


> Originally Posted by *RagingCain*
> 
> @Redwoodz
> Keep on fighting that good fight against anyone who disagrees with your world views on GPU purchases. Took me all of a minute to suggest people be logical with their purchases. I nearly sprained a finger exerting all that effort. <- I can't stress this enough, if I was any more sarcastic I would pee blood.


For someone who supposedly runs a computer store, you would think he'd be less biased.

Besides, GCN is a pretty old architecture; how much more can AMD possibly squeeze out of it? All past optimizations should already be in place, right?


----------



## Tivan

Quote:


> Originally Posted by *RagingCain*
> 
> I merely suggested if you used B to solely justify your purchase, you should wait to see if it pans out before buying it


Okay then, I fully agree with that, though it wasn't fully obvious to me from the wording used c:

Sorry about that!


----------



## Yungbenny911

I am disappoint...









Not bad, but that's it... "It's not a bad card".


----------



## Noufel

Quote:


> Originally Posted by *Yungbenny911*
> 
> I am disappoint...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not bad, but that's it... "It's not a bad card".


If only it was $550, or even $600, I could have said "it's not a bad card". But not for $650, no.


----------



## 2010rig

Quote:


> Originally Posted by *Noufel*
> 
> if only it was 550$ even 600 $ i could have said " it's not a bad card" but not for 650$ no


Exactly, but that big chip, HBM, and AIO aren't cheap.


----------



## curlyp

Since we are on the topic of DX12, how does it work for cards that do not support it? For example, in the benchmarks with the Fury X, 980 Ti, Titan X, and 295X2, the 295X2 is mostly on top. When Win10 is released with DX12, will the cards that support it get a big enough boost to beat the 295X2?

The reason I am asking is that I always see deals on the Sapphire Radeon R9 295X2, and I'm wondering if it would be better for me to return the Fury X and pick up that card instead. I am thinking it will perform better with the new Samsung U28E590D 4K FreeSync monitor. Any tips or suggestions?

Thanks!

edit: grammar


----------



## iinversion

Quote:


> Originally Posted by *curlyp*
> 
> Since we are on the topic of DX12, how does it work for cards that do not support it? For example, on the benchmarks with the Fury X, 980TI, Titan X, and 295X2, the 295X2 is mainly on top. When Win10 is released with DX12, will the cards that support it obtain a significant boost to out beat the 295x2?
> 
> Reason why I am asking, is I always see deals on the Sapphire Radeon R9 295X2 and wondering if it would be better for me to return the Fury X and pick up the card? I am thinking it will perform better with the new Samsung U28E590D 4k FreeSync monitor. Any tips or suggestions?
> 
> Thanks!
> 
> edit: grammar


The 295x2 is only on top because it is two 290X's on a single PCB and not a single GPU. With dual GPUs comes the typical dual GPU problems.

You should always go for the fastest single card before resorting to more than one.

Maxwell is more compliant with DX12 than GCN is. There were some charts showing the differences somewhere.. which of those features actually matter I don't know.


----------



## ZealotKi11er

Quote:


> Originally Posted by *iinversion*
> 
> The 295x2 is only on top because it is two 290X's on a single PCB and not a single GPU. With dual GPUs comes the typical dual GPU problems.
> 
> You should always go for the fastest single card before reverting to getting more than one.
> 
> Maxwell is more compliant with DX12 than GCN is. There were some charts showing the differences somewhere.. which of those features actually matter I don't know.


GCN is more compliant with DX12 while Maxwell has extra feature level 12.1. In reality Maxwell has nothing that really matters.


----------



## Neon Lights

Does anybody know of any news concerning the Catalyst Omega 2 release supposedly on July 7th? I read that somewhere without any other info.


----------



## blue1512

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GCN is more compliant with DX12 while Maxwell has extra feature level 12.1. In reality Maxwell has nothing that really matters.


Exactly
Quote:


> Direct3D 12 requires graphics hardware conforming to feature levels 11_0 and 11_1 which supports virtual memory address translations. It introduces a revamped resource binding model, which allows explicit control of memory using descriptor heaps and tables. This model is supported on majority of existing desktop GPU architectures and requires WDDM 2.0 drivers. Supported hardware is divided into three Resource Binding tiers, which define maximum numbers for descriptor heaps used for CBV (constant buffer view), SRV (shader resource view) and UAV (unordered access view); CBVs and SRVs per pipeline stage; UAVs for all pipeline stages; Samplers per stage; and SRV descriptor tables. *Tier 3 hardware such as AMD GCN has no limitations, allowing fully bindless resources managed through dynamic memory heap, while Tier 1 (Nvidia Fermi, Intel Haswell/Broadwell) and Tier 2 (Nvidia Kepler/Maxwell, Intel Skylake) hardware impose some limits on the number of these resources*.
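
For quick reference, the architecture-to-tier grouping in the quoted text can be written as a small lookup (just a sketch; the tier assignments are taken verbatim from the summary above):

```python
# Resource Binding tiers per GPU architecture, per the quoted summary above.
BINDING_TIER = {
    "Fermi": 1, "Haswell": 1, "Broadwell": 1,   # Tier 1: tight resource limits
    "Kepler": 2, "Maxwell": 2, "Skylake": 2,    # Tier 2: some limits remain
    "GCN": 3,                                   # Tier 3: fully bindless
}

def binding_tier(arch: str) -> int:
    """Return the Resource Binding tier for a GPU architecture."""
    return BINDING_TIER[arch]

print(binding_tier("GCN"))      # 3
print(binding_tier("Maxwell"))  # 2
```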


----------



## i7monkey

How many DX12 games are coming out in the next year though?


----------



## blue1512

Quote:


> Originally Posted by *i7monkey*
> 
> How many DX12 games are coming out in the next year though?


Project Cars, Witcher 3, and Batman: AK are supposed to get DX12 patches next month. The first full DX12 title would be Ashes of the Singularity, which released its alpha version on 18/06/2015.

We don't have to wait for next year. M$ is pushing DX12 because they fear the threat of the Vulkan API on SteamOS.

FYI, Vulkan is merged from OpenGL and Mantle. AMD hardware runs even better on that API.


----------



## ZealotKi11er

Quote:


> Originally Posted by *i7monkey*
> 
> How many DX12 games are coming out in the next year though?


Looking at how many games Mantle got while being AMD-only, DX12 will displace all prior versions. Windows 10 is a free upgrade, and four-year-old cards support it, unlike older DX versions where you had to buy a new GPU. With CPUs not getting any faster, DX12 will help devs hit their targets on PC with less draw-call optimization, for the better. If DX12 does not have at least 15 games by the end of 2016 I would be surprised. I see no reason why AAA games coming in 2016 are not DX12. The only problem is that DX is held back by PC-only devs like Blizzard and Valve lol.


----------



## looniam

Quote:


> Originally Posted by *iinversion*
> 
> The 295x2 is only on top because it is two 290X's on a single PCB and not a single GPU. With dual GPUs comes the typical dual GPU problems.
> 
> You should always go for the fastest single card before reverting to getting more than one.
> 
> Maxwell is more compliant with DX12 than GCN is. There were some charts showing the differences somewhere.. which of those features actually matter I don't know.


i'm stealing this post
Quote:


> Not "DX12" tiers, but five separate optional capabilities which have "tiers": resource binding tiers, tiled resource tiers, conservative rasterization tiers, resource heap tiers, and cross-node sharing tiers.
> 
> All GCN and Xbox One are Resource binding Tier 3. Fermi and Haswell/Broadwell are Resource binding Tier 1, everything else is Tier 2 (Skylake, Kepler, Maxwell-1 and Maxwell-2).
> 
> en.wikipedia.org/wiki/Direct3D#Feature_levels
> 
> GCN 1.1/1.2 support feature level 12_0 and GCN 1.0 supports feature level 11_1 - but it also supports two of the three mandatory features of level 12_0 with the exception of Tiled Resources Tier 2, which is really a very minor improvement that can be easily worked around.
> 
> These are not "instructions".
> 
> Feature level 12_1 includes two new pipeline states for the fixed function rasterization hardware, which are currently only supported by Maxwell-2.
> 
> Feature level 12_0 exposes "virtual memory" capabilities (dynamic heaps for resource descriptor tables, UAV descriptor tables for typed texture formats, virtual memory "paging" for Texture2D resources) of the GPU memory management unit which were not supported in earlier versions of DXGI/WDDM.
> 
> None of these affect performance in any significant way (maybe with the exception of ROVs).
> 
> Feature level 12_0 is supported by Iceland (R5 M240/M250/M255 from end of 2014), Bonnaire (R7 260, R9 M280), Hawaii (R9 290), Tonga (R9 285, R9 M295) and Fiji.
> 
> And BTW it's time to stop this "fully compatible" nonsense. We have an new API that offers like several thousand important features vital for getting the most out of advanced graphics hardware that rivals supercomputers of the past. Yet everyone has to whine about 3 (three) minor features that no-one will be using any time soon... beats me completely.
> 
> OK, here is the exact difference between tiled resource tiers 1 and 2 (which is the only feature that differs for older GCN 1.0 parts conforming to level 11_1 and newer GCN 1.1/1.2 parts with level 12_0):
> 
> a) if a virtual memory page is not present in GPU memory, tier 1 returns garbage on reads and writes raise an exception, so the app has to check page residency first; tier 2 returns zero on reads and silently discards any writes;
> b) you can use LOD level clamp with tiled textures on tier 2 and read back tile mapping status; on tier 1, you have to emulate LOD level clamp with shader code;
> c) MIP levels spanning multiple tiles are guaranteed to not use hardware-specific packing formats on Tier 1 if they are an integer multiple of a tile shape, while on Tier 2 this expands to MIPs that can fit in a single tile.
> 
> Now show me one single game that depends on tiled resources.
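
The tier 1 vs. tier 2 read behavior in point (a) can be sketched as a toy model (plain Python purely for illustration; this is not D3D12 API code):

```python
# Toy model of tiled-resource reads, per the tier difference quoted above.
# `resident` maps tile index -> data; a missing tile is "not resident".

resident = {0: b"texels-0", 2: b"texels-2"}  # tiles 0 and 2 are in GPU memory

def read_tier1(tile):
    """Tier 1: reading a non-resident tile is undefined ("garbage"), so the
    app must check residency itself before reading."""
    if tile not in resident:
        raise LookupError("tier 1: check residency first, read is undefined")
    return resident[tile]

def read_tier2(tile):
    """Tier 2: non-resident reads return zero; writes would be silently
    discarded."""
    return resident.get(tile, b"\x00" * 8)

print(read_tier2(1))  # zeroed bytes back, no residency check needed
```

The practical difference is just who does the bookkeeping: on tier 1 the app (or a shader workaround) must track residency, on tier 2 the hardware gives well-defined defaults.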


----------



## Majin SSJ Eric

I seriously wouldn't expect drivers or any software to make a significant difference in the performance results. The card is competitive with a 980 Ti, and that's all it HAD to do. It would have been nice if it outright beat it, but that didn't happen, and no amount of drivers or new APIs is going to change that. I still don't get why the actual performance it has is such a "fail" though? It's basically the same performance the Nvidia fanboys were hailing as "unprecedented" when the 980 Ti showed up just three weeks ago. Guess it just has the wrong name on the box...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I seriously wouldn't expect drivers or any software to make a significant difference in the performance results. The card is competitive with a 980Ti and that's all it HAD to do. Would have been nice if it outright beat it but that didn't happen and no amount of drivers or new API is going to change that. I still don't get why the actual performance it has is such a "fail" though? Its basically the same performance the Nvidia fanboys were hailing as "unprecedented" when the 980Ti showed up just three weeks ago. Guess it just has the wrong name on the box...


I think the card is about as fast. The problem is that at 1080p, and in some cases 1440p, CPU overhead under DX11 makes the GTX 980 Ti faster. Also, we still don't know how much this card overclocks. AMD proved with the HD 7970 and 290X that their cards catch up over time: the 290X is now as fast as the GTX 780 Ti, but at launch the GTX 780 Ti was clearly faster.


----------



## Dhoulmagus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think the card is about as fast. The problem is in 1080 and some cases 1440 CPU DX11 Overhead makes it so GTX980 Ti is faster. Also we still dont know how much this card overclocks. AMD has proven with HD 7970 and 290X that they will beat Nvidia. 290X is as fast as GTX780 Ti. During Launch GTX780 Ti was clearly faster.


That's really what it boils down to, and it's a repeating story at this point; I couldn't believe how well my 7970 would overclock. I threw a 280X in this rig as a holdover card, and while I'm still on 1080p, it's actually plenty for me with an OC. More food for thought: if the GPU in Fury is capable of a large overclock, that translates directly to the increased memory bandwidth being able to spread its wings, does it not? The extra memory bandwidth leaves room for a huge GPU frequency with no bottleneck, which in turn could start spitting out higher frames than a 980 Ti. I just hope they allow memory frequency overclocking; it would help.

But yes, it does seem that anything from AMD that doesn't make Nvidia's current flagship look like a radeon 9000 trying to run Crysis 3 is certified garbage anymore.
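
The bandwidth argument can be put in rough numbers. A napkin-math sketch using the public launch specs (the "GB/s per core MHz" ratio is my own crude proxy for headroom, not an established metric):

```python
# Napkin math on memory-bandwidth headroom (illustrative only; launch specs).

def bandwidth_gbs(bus_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_bits * data_rate_gbps / 8

fury_x_bw = bandwidth_gbs(4096, 1.0)   # HBM1: 4096-bit bus @ 1 Gbps per pin
gtx980ti_bw = bandwidth_gbs(384, 7.0)  # GDDR5: 384-bit bus @ 7 Gbps per pin

fury_x_core = 1050    # MHz
gtx980ti_core = 1000  # MHz base clock (boosts higher in practice)

print(fury_x_bw, gtx980ti_bw)        # 512.0 336.0
print(fury_x_bw / fury_x_core)       # ~0.49 GB/s per core MHz
print(gtx980ti_bw / gtx980ti_core)   # ~0.34 GB/s per core MHz
```

By this crude measure Fiji has noticeably more bandwidth per core clock than GM200, which is the headroom argument above; whether the core can actually use it is exactly what overclocking results will show.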


----------



## DividebyZERO

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think the card is about as fast. The problem is in 1080 and some cases 1440 CPU DX11 Overhead makes it so GTX980 Ti is faster. Also we still dont know how much this card overclocks. AMD has proven with HD 7970 and 290X that they will beat Nvidia. 290X is as fast as GTX780 Ti. During Launch GTX780 Ti was clearly faster.


I dunno, according to this site the 780 Ti is faster than the 980/970/290X/390X, even at 4K.

http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1632&page=5


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AMD has proven with HD 7970 and 290X that they will beat Nvidia. 290X is as fast as GTX780 Ti. During Launch GTX780 Ti was clearly faster.


You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.


----------



## blue1512

Quote:


> Originally Posted by *DividebyZERO*
> 
> I dunno according to this site, 780TI is faster than 980/970/290x/390x even at 4k.
> 
> http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1632&page=5


This site is not so reliable. Those numbers are submitted without any core/mem setup listed, by the way. IRL, a 780 Ti at max OC can beat a 980/970, and AMD cards don't do well in the tessellation-heavy Heaven benchmark.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Serious_Don*
> 
> That's really what it boils down to and it's a repeating story at this point, I couldn't believe how well my 7970 would overclock. I threw in a 280x in this rig as a hold me over card and while I'm still on 1080P it's actually plenty for me with an OC. More food for thought, if the GPU in fury is capable of a large overclock, that translates directly to the increased memory bandwidth being able to spread its wings does it not? The extra memory bandwidth leaves room for a huge GPU frequency with no bottleneck, which in turn could start spitting out higher frames than a 980Ti. I just hope they allow memory frequency overclocking, it would help.
> 
> *But yes, it does seem that anything from AMD that doesn't make Nvidia's current flagship look like a radeon 9000 trying to run Crysis 3 is certified garbage anymore.*


It isn't remotely certified garbage. However, you have to understand that AMD's marketshare is what, around 24%? So someone who already owns an Nvidia card and is looking to upgrade is almost certainly going to look at the Nvidia equivalent, while the AMD person looking to upgrade is going to look at an AMD card. So that keeps the status quo for all of the buyers.

But AMD needs something to sway those Nvidia people who are upgrading their GPU to the latest Nvidia GPU, so they can swing that 24% back up from where it is. Otherwise they just maintain their status quo of miserable marketshare compared to Nvidia. Hence why (and unfortunately for AMD) they have to come up with an extremely compelling reason to choose them over the Nvidia equivalent, or those Nvidia buyers aren't even going to look at their products.

Do you kinda get what I'm trying to say, logic-wise? Because this is kinda Business 101.


----------



## RagingCain

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> AMD has proven with HD 7970 and 290X that they will beat Nvidia. 290X is as fast as GTX780 Ti. During Launch GTX780 Ti was clearly faster.
> 
> 
> 
> You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.
Click to expand...

Well, that's not true; logically, both can co-exist.

However, my thread clearly shows that driver performance increased or remained consistent after Maxwell's release, and only recently stumbled with a 4.09% average performance degradation, which is within the margin of error. Link is in my signature.
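
For what it's worth, an "average degradation" figure like that 4.09% is just the mean of per-title percent changes between two driver versions. A minimal sketch (the FPS numbers are invented for illustration):

```python
# Mean per-title FPS change between two driver versions (made-up numbers).
old = {"BF4": 92.0, "Crysis 3": 61.0, "Tomb Raider": 78.0}  # older driver
new = {"BF4": 89.5, "Crysis 3": 57.8, "Tomb Raider": 76.1}  # newer driver

deltas = [(new[g] - old[g]) / old[g] * 100 for g in old]
avg = sum(deltas) / len(deltas)
print(f"average delta: {avg:.2f}%")  # average delta: -3.47%
```

A swing of a few percent like this is within normal run-to-run variance, which is why single-digit averages need repeated runs before calling them a regression.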


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.


It's not Nvidia gimping Kepler; it's just that they are not putting much optimization into the architecture anymore. GCN has been a slow, ongoing improvement since the HD 7970.


----------



## RagingCain

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.
> 
> 
> 
> It not Nvidia gimping Kepler. It just they are not putting much optimization in terms of architecture. GCN is a ongoing slow improvement since HD 7970.
Click to expand...

Fury X may beg to differ. Kepler is over 3 years old now. Other than new title support, I am not sure what you are going to get out of it besides bug fixes. Speed optimizations do not go on forever.

Nvidia is now supporting three major architectures, with the focus on Maxwell. AMD has one architecture to focus on, but GCN optimizations can't last forever either.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It not Nvidia gimping Kepler. It just they are not putting much optimization in terms of architecture. GCN is a ongoing slow improvement since HD 7970.


You can probably find 10 posts in this thread alone alluding to Kepler gimping, and at least another 10 praising AMD's amazing driver improvement. I'm just pointing out the way that performance delta is portrayed by people with different agendas (or viewpoints, if you want a less prejudicial interpretation).


----------



## RagingCain

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> It not Nvidia gimping Kepler. It just they are not putting much optimization in terms of architecture. GCN is a ongoing slow improvement since HD 7970.
> 
> 
> 
> You can probably find 10 posts in this thread alone alluding to Kepler gimping, and at least another 10 praising AMD's amazing driver improvement. I'm just pointing out the way that performance delta is portrayed by people with different agendas.
Click to expand...

See my previous 2 posts above, Force.

@All
One could, and I am NOT saying this, but one could make the claim that AMD purposely withheld performance in GCN to release it slowly over time, whether accidentally/through incompetence, or to "sell" the R9 390X as a higher-performing card than the R9 290X.


----------



## DividebyZERO

Quote:


> Originally Posted by *blue1512*
> 
> This site is not so reliable. Those number are submitted without any core/mem set up by the way. IRL, 780Ti at max OC can beat 980/970, and AMD cards don't do well in tessellation extreme Heaven.


I would love to agree with this, but that said, look at how the Fury X is compared to the 980 Ti. Everyone only cares about overclocking performance; in fact it probably doesn't even matter if it's stable. Such is the way of gamer vs. OC bencher vs. casual user. No wait, we have to eliminate the casual user, because they don't buy $650 cards. Hmmm..


----------



## ZealotKi11er

Quote:


> Originally Posted by *DividebyZERO*
> 
> I would love to agree with this, but that said look at how FuryX is compared to 980ti. Everyone only cares about overclocking performance. In fact doesn't really probably matter if it's even stable. such is the way of Gamer vs OC bencher vs casual user. Nope wait, we have to eliminate casual user because they don't buy 650$ cards. Hmmm..


Overclocking GPUs is so different than overclocking CPUs, because with a GPU you have temperature effects, ambient differences, and different games handling the overclock differently. A lot of people report their MAX OC, but most MAX OCs are not 24/7/365 stable. Even with water cooling, your MAX OC takes a hit in summer. Even though this is OCN, I know in the real world most people don't OC even high-end GPUs. They buy a high-end GPU so they don't have to OC.


----------



## harney

Quote:


> Originally Posted by *Forceman*
> 
> You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.




Well, surely this can be tested and debunked, to show whether it's Nvidia gimping or AMD's driver magic: take a 780 Ti, install the same drivers that were used back when reviews showed the 780 Ti beating the 290X, then do the same for the 290X with its launch drivers, etc. Make sure you run the same benchmark software, same versions, etc., and take it from there...

Now, if I had a 780 Ti and a 290X I would do this test myself, but unfortunately I do not...

So, anybody out there who has the gear and is willing to try this? Then we would know for sure. SIMPLE.....


----------



## iinversion

Quote:


> Originally Posted by *harney*
> 
> Well surly this can be tested debunked & show if its nvidia gimping or AMD's driver magic ...... take an 780ti get the same drivers that where used at the time when reviews showed the 780 ti beating the 290x then do the same for the 290x same drivers ect.....then make sure you run the same bench software same versions ect and take it from there...
> 
> Now if i had a 780ti and a 290x i would do this test myself but its unfortunate as i do not...
> 
> So any body out there that has the gear & is willing to try this ..then we would know for sure SIMPLE.....


It has already been shown that the 780 Ti, and Kepler in general, did not get worse. It's still performing the same as it did when the 290X came out; the 290X has just improved a little with drivers.

The information is out there for anyone to research themselves. You can go look at credible reviews of the 290X and compare the 780 Ti's frame rates then and now. They haven't changed outside a small margin of error, but the 290X has gotten slightly better.

The whole "Nvidia gimping their Kepler cards" thing is just a myth. As has already been said, Kepler can't keep getting optimized forever, and it's at that point. Kepler didn't get gimped; Maxwell and GCN have gotten better.


----------



## RagingCain

Quote:


> Originally Posted by *harney*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> You know, I was thinking about this earlier. It's funny how the 290X catching up to the 780 Ti is hailed as both evidence of AMD's great driver support, and proof of Nvidia "gimping" Kepler cards. So which is it, because I don't see how it can be both.
> 
> 
> 
> 
> Well surly this can be tested debunked & show if its nvidia gimping or AMD's driver magic ...... take an 780ti get the same drivers that where used at the time when reviews showed the 780 ti beating the 290x then do the same for the 290x same drivers ect.....then make sure you run the same bench software same versions ect and take it from there...
> 
> Now if i had a 780ti and a 290x i would do this test myself but its unfortunate as i do not...
> 
> So any body out there that has the gear & is willing to try this ..then we would know for sure SIMPLE.....
Click to expand...

Quote:


> Originally Posted by *iinversion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *harney*
> 
> Well surly this can be tested debunked & show if its nvidia gimping or AMD's driver magic ...... take an 780ti get the same drivers that where used at the time when reviews showed the 780 ti beating the 290x then do the same for the 290x same drivers ect.....then make sure you run the same bench software same versions ect and take it from there...
> 
> Now if i had a 780ti and a 290x i would do this test myself but its unfortunate as i do not...
> 
> So any body out there that has the gear & is willing to try this ..then we would know for sure SIMPLE.....
> 
> 
> 
> It has already been shown that the 780 Ti or Kepler in general did not get worse. It's still performing the same as it did when the 290X came out, the 290X has just improved a little with drivers.
> 
> The information is out there for anyone to research themselves. You can go look at credible reviews of the 290X and look at the 780 Ti's frame rates then and now. They don't change within a small margin of error, but the 290X has gotten slightly better.
> 
> The whole Nvidia gimping their Kepler cards is just a myth. As it as already been said Kepler can't keep getting optimized and better forever and it's at that point. Kepler didn't get gimped. Maxwell and GCN have gotten better.
Click to expand...




Source: http://www.overclock.net/t/1562094/the-gtx-780-ti-sli-end-of-life-driver-performance-analysis/0_50


----------



## iinversion

Those numbers are pretty close. I think aside from the 347.88 driver, the rest is within the margin of error, and even 347.88 is a fairly small deviation.

There's also a new 353.38 hotfix driver you can try


----------



## toncij

Quote:


> Originally Posted by *curlyp*
> 
> Since we are on the topic of DX12, how does it work for cards that do not support it? For example, on the benchmarks with the Fury X, 980TI, Titan X, and 295X2, the 295X2 is mainly on top. When Win10 is released with DX12, will the cards that support it obtain a significant boost to out beat the 295x2?
> 
> Reason why I am asking, is I always see deals on the Sapphire Radeon R9 295X2 and wondering if it would be better for me to return the Fury X and pick up the card? I am thinking it will perform better with the new Samsung U28E590D 4k FreeSync monitor. Any tips or suggestions?
> 
> Thanks!
> 
> edit: grammar


The 295X2 supports the most important parts of DirectX 12: the new, lighter drivers, better approaches to task scheduling, etc.

The 295X2, where CrossFire is supported, will always be faster. Remember, the 295X2 is identical to two 290X cards in CrossFire; it's just on a single PCB instead of two cards.

Quote:


> Originally Posted by *iinversion*
> 
> Maxwell is more compliant with DX12 than GCN is. There were some charts showing the differences somewhere.. which of those features actually matter I don't know.


Some do, and some of what Maxwell offers goes beyond what GCN does. But the fact is that games are developed for the lowest common feature set, and that will certainly be 12_0. Anything else will be optional.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> GCN is more compliant with DX12 while Maxwell has extra feature level 12.1. In reality Maxwell has nothing that really matters.


I disagree here. Maxwell has the pretty nice 12_1 conservative rasterization (CR) feature, which makes a visual difference.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Looking at how Mantle got so many game out being only AMD, DX12 will stop all prior versions. Windows 10 is free upgrade, 4 years old cards support it unlike older DX where you had to buy a new GPU and with CPUs not getting any faster DX12 will help dev achieve goals in PC with less optimization on draw calls for the better. If DX12 does not have at least 15 games by end of 2016 i would be surprised. I see no reason why games coming in 2016 (AAA) are not DX12. The only problem is that DX is held back by PC only dev like Blizzard, Valve lol.


PC-only devs don't hold back DX. It is the other way around.


----------



## Kuivamaa

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think the card is about as fast. The problem is in 1080 and some cases 1440 CPU DX11 Overhead makes it so GTX980 Ti is faster. Also we still dont know how much this card overclocks. AMD has proven with HD 7970 and 290X that they will beat Nvidia. 290X is as fast as GTX780 Ti. During Launch GTX780 Ti was clearly faster.


1080p is largely irrelevant in the era of DSR/VSR. I suppose competitive CSGO, Dota 2, or LoL players still like this resolution, but I don't think they are the target group of elite GPUs like those two. As for OC, from the reviews that I have read, it gets a higher core clock without a voltage tweak than my 290X gets with some added voltage. This card is a winner in my book; the only problem atm is availability. The standard Fury is near and my local stores still have no Fury X units, or even a set price (which is bound to change). By the time it is readily available, I expect aftermarket Fury model reviews to be out, really.


----------



## Alatar

1080p is definitely still relevant because 120Hz and 144Hz monitors are still a thing that people use.

And honestly even the 980Ti is far from maintaining those framerates at 1080p.


----------



## Kuivamaa

These monitors are mostly relevant in competitive FPS shooters; people have been attaining 120fps there for quite a while by turning detail low, or in the case of CS:GO, triple that amount.


----------



## Smanci

Shooters, Racing, you name it


----------



## Geek Branden

The performance at 4K+ is not too shabby. The 4GB cap, on the other hand, is not attractive. It is possible an 8GB (8x1 dies) version could be introduced in the future.


----------



## provost

Quote:


> Originally Posted by *harney*
> 
> Well, surely this can be tested and debunked to show whether it's Nvidia gimping or AMD's driver magic: take a 780 Ti, get the same drivers that were used at the time when reviews showed the 780 Ti beating the 290X, then do the same for the 290X (same drivers etc.), make sure you run the same bench software (same versions etc.), and take it from there.
> 
> Now if I had a 780 Ti and a 290X I would do this test myself, but unfortunately I do not.
> 
> So if anybody out there has the gear and is willing to try this, then we would know for sure. SIMPLE.


That would be an interesting analysis. My guess would be that the "lowly" 290x would beat out the "mighty" Titan/Titan Black









So how long should Nvidia provide optimization support for its cards? How about longer than a few months, as in the case of the Titan Black and Titan Z, 13 months for the 780 Ti, and 18 months for the OG Titan.
Everyone keeps parroting "Kepler is old" like gospel...








GK110 is not!

Nvidia can call Maxwell "Kepler 2", "Joke is on you", or whatever the heck it wants to call it. It's still on 28nm, and the only way to sell more cards is to re-spin Kepler and stop optimizing for the original. The 970 is inferior to the Titan Black in every hardware feature imaginable, yet it outperforms the Titan. How? Simple: software-controlled performance management to maximize SKU profitability. Business 101... lol

So the question is, what optimization support period do you get when you are spending $700-$1000 on a GPU? If someone likes spending that much money for a few months of performance, that's their business. For me, I see it as a scam, if I'm being candid.

When Nvidia says "we control it all" (meaning performance), they really mean it... lol And I don't mean random posters saying it, or clueless reviewers marching to orders, but the people who actually matter at Nvidia.









Can you imagine what this artificial upgrade cycle would look like if AMD disappeared? 3 months, tops... lol
So here's to hoping that some members don't get their wish of AMD disappearing from the consumer GPU scene... amen.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Geek Branden*
> 
> The performance at 4k+ is not too shabby. 4GB cap on the other hand is not attractive. It is possible an 8GB (8x1 dies) version could be introduced in the future.


The thing is, Fury X is doing better at 4K than at 1440p or 1080p. It should be the other way around. Also, the future is here: you will not see the need for 8GB anytime soon, because we already got hit by the next-gen console vRAM usage. Another thing is that there is no Crysis 4 or any other game we are waiting for that will crush our cards.


----------



## toncij

Well, there are several problems:


- Fury X, Titan X and 980 Ti are high-end cards aimed at enthusiasts.
- Enthusiasts usually go year by year for the latest and greatest; if not, they go for the newest when there is a large performance jump, which leads me to the next point...
- Pascal and the Fury X's successor are going to a 14/16nm process, half of the current one. That will significantly improve their ability to increase performance, which should be north of 50% better.
- 4GB is enough for most games even at 4K, and for many even at 5K.
- Given the expected lifetime of the Fury X, nobody should get too stressed about 4GB. By 2016 we won't see much more hunger for VRAM, and we will see HBM2.


----------



## Ganf

Quote:


> Originally Posted by *toncij*
> 
> Well, there are several problems:
> 
> 
> FuryX, TitanX and 980Ti are high-end cards aimed at enthusiasts
> Enthusiasts usually go year by year for latest and greatest, if not they go for newest when there is a large performance jump, which leads me to the next point...
> ... Pascal and FuryXvNext are going to 14/16nm process which is 1/2 of the current. That will significantly improve their ability to increase performance which should be north of 50% better
> 4GB is enough for most games even at 4K, for many at 5K even
> Given the expected lifetime of FuryX, nobody should get too stressed about 4GB. By 2016. we won't see more hunger for VRAM and we will see HBM2.


Yep. Been saying this for a while. You can't future proof your GPU right now, I don't care if you're buying Titan X SLI, next year is going to put it to shame.


----------



## FreeElectron

True, but enthusiasts don't like having bottlenecks, like a VRAM bottleneck for example.
I am playing GTA V at 1440p (High or Ultra, I don't remember, with 2x AA) and I am getting up to 3700MB of VRAM usage.
Also, enthusiasts may not upgrade yearly; it might stretch a bit.


----------



## Ganf

Quote:


> Originally Posted by *FreeElectron*
> 
> true.
> but enthusiasts don't like having bottlenecks like VRAM bottleneck for example.
> I am playing on 1440P and GTA V (High or Ultra (Don't remember) with 2x aa) and i am getting upto 3700GB VRAM usage.
> Also.. Enthusiasts may not also upgrade yearly as it might stretch a bit.


And 4k benchmarks show that the card doesn't bottleneck in games like GTA V that just cache VRAM for the sake of caching it, so still... No issue.

The card doesn't even have a problem in Shadows of Mordor which legitimately uses 6GB. Sooo... No bottleneck.


----------



## toncij

Quote:


> Originally Posted by *FreeElectron*
> 
> true.
> but enthusiasts don't like having bottlenecks like VRAM bottleneck for example.
> I am playing on 1440P and GTA V (High or Ultra (Don't remember) with 2x aa) and i am getting upto 3700GB VRAM usage.
> Also.. Enthusiasts may not also upgrade yearly as it might stretch a bit.


To be honest, I'm not nearly as enthusiast (or as loaded) as some, and I will upgrade next year. Simply put, the performance difference should be insane.

The problem is, you say 3700MB, and that is still significantly lower than 4GB. GTA V is so nicely tweakable that it can really work nicely on 3GB too.

Any game where you manage to go over 4GB now is probably hitting the GPU performance wall much earlier than the VRAM wall. Unless you go with 4 cards for 5K Witcher 3, but then again you don't care about cost anyway, so 4 Titan Xs it will be...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Ganf*
> 
> Yep. Been saying this for a while. You can't future proof your GPU right now, I don't care if you're buying Titan X SLI, next year is going to put it to shame.


Which is why I'm sticking with my lowly Titans!


----------



## blue1512

4GB this, 4GB that. They even ignore that this is 4GB of HBM and just claim that it works the same as 4GB of GDDR5, while they are different architectures and behave differently.

This is 6GB of GDDR5


And this is 4 GB of HBM


Still want to compare them?


----------



## DividebyZERO

Quote:


> Originally Posted by *blue1512*
> 
> 4 GB this 4 GB that. They even ignored that this is 4 GB of HBM and just claimed that they worked the same as 4GB of GDDR5, while they are different architectures and behave differently.
> 
> This is 6GB of GDDR5
> 
> 
> And this is 4 GB of HBM
> 
> 
> Still want to compare them?


So funnel-mentally they are the same? Except for the blue-dot and red-dot smileys; I didn't know those were in my GPU. Do I need to take some acid first?


----------



## FreeElectron

Quote:


> Originally Posted by *Ganf*
> 
> And 4k benchmarks show that the card doesn't bottleneck in games like GTA V that just cache VRAM for the sake of caching it, so still... No issue.
> 
> The card doesn't even have a problem in Shadows of Mordor which legitimately uses 6GB. Sooo... No bottleneck.


Was it stutter (frametime) tested to show that it has no issue?

Quote:


> Originally Posted by *blue1512*
> 
> 4 GB this 4 GB that. They even ignored that this is 4 GB of HBM and just claimed that they worked the same as 4GB of GDDR5, while they are different architectures and behave differently.
> 
> This is 6GB of GDDR5
> 
> 
> And this is 4 GB of HBM
> 
> 
> Still want to compare them?


lolwat?


----------



## nyxagamemnon

4GB is 4GB; it doesn't matter what the storage medium is. If HBM capacity really weren't comparable in GB (notice I say that because GB is a standard measurement), we would need a new scale comparing HBM units to GB, where one unit of HBM corresponds to some number of GB.

Like: 1 HBM unit has the same capacity as 2GB of GDDR5.

But AMD has 4GB, and the 980 Ti's 6GB > 4GB in capacity, though not in speed.


----------



## toncij

Quote:


> Originally Posted by *blue1512*
> 
> 4 GB this 4 GB that. They even ignored that this is 4 GB of HBM and just claimed that they worked the same as 4GB of GDDR5, while they are different architectures and behave differently.
> 
> This is 6GB of GDDR5
> 
> 
> And this is 4 GB of HBM
> 
> 
> Still want to compare them?


Well... there is a thing they're right about - 4GB of HBM is 4GB exactly the same way 4GB of GDDR5 or GDDR3 is 4GB.









Bandwidth does not replace capacity.

The other thing that is actually the problem is that the GPU-Z sensor showing 4GB used does not necessarily indicate your game needs 4GB of VRAM.
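To put rough numbers on the bandwidth side of this comparison, here is a back-of-the-envelope sketch using the publicly listed specs (Fury X: 4096-bit HBM at 500MHz, double data rate; 980 Ti: 384-bit GDDR5 at 1753MHz, quad-pumped):

```python
# Theoretical peak memory bandwidth: bus width in bytes x effective transfers per second.
def bandwidth_gbs(bus_bits: int, clock_mhz: float, transfers_per_clock: int) -> float:
    return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

hbm = bandwidth_gbs(4096, 500, 2)    # Fury X: 4096-bit HBM, 500 MHz, double data rate
gddr5 = bandwidth_gbs(384, 1753, 4)  # 980 Ti: 384-bit GDDR5, 1753 MHz, quad-pumped

print(f"Fury X HBM:   {hbm:.1f} GB/s")    # 512.0 GB/s
print(f"980 Ti GDDR5: {gddr5:.1f} GB/s")  # 336.6 GB/s
```

Bandwidth and capacity are separate axes, though: 4GB of HBM holds exactly as much as 4GB of GDDR5, it just fills and drains much faster.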


----------



## blue1512

Quote:


> Originally Posted by *toncij*
> 
> Well... there is a thing they're right about - 4GB of HBM is 4GB exactly the same way 4GB of GDDR5 or GDDR3 is 4GB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bandwidth does not replace capacity.
> 
> The other thing that is actually the problem is that indicator of 4GB used on GPU-Z senzor does not nearly indicate your game needs 4GB of VRAM.


The fact is that the games don't request the card to buffer; the DRIVER does. And of course the driver will use 4GB of HBM differently from 4GB of GDDR5. That simple fact alone is enough to nullify any direct comparison between them. Do 4 oranges equal 4 apples?

In the funnel example, with a smaller funnel but a bigger outlet you will rarely see the liquid "buffer". Does that ring any bells?


----------



## Thoth420

Quote:


> Originally Posted by *blue1512*
> 
> The fact is that the games don't request the card to buffer but the DRIVER is. And of course the driver will use 4GB of HBM differently from 4GB of GDDR5. Just that simple fact is enough to nullify any direct comparison between them. Do 4 oranges equal to 4 apples?


----------



## toncij

Quote:


> Originally Posted by *blue1512*
> 
> The fact is that the games don't request the card to buffer but the DRIVER is. And of course the driver will use 4GB of HBM differently from 4GB of GDDR5. Just that simple fact is enough to nullify any direct comparison between them. Do 4 oranges equal to 4 apples?
> 
> In the funnel example, in a smaller one with bigger exhausted you will rarely see the liquid "buffers" Does that ring any bell?


Have you tried it?

Because you manually load resources; it does not happen at random. To saturate 4GB of VRAM you need an insane amount of resources. Drivers do, that is correct, offload VRAM at their convenience, but for most games this will hardly happen in a way that hits your critical resources.

For example, the Titan X offloads VRAM to RAM at ~10.5-11.5GB (it really depends on the resources) and is able to do so only up to about 2-4GB of RAM before it moves to the page file.

I'd love to test that if someone here has access to a Fury X. It would help us understand what happens and when. I can't get one yet myself; the next availability is 3-5 weeks away.


----------



## harney

Quote:


> Originally Posted by *blue1512*
> 
> 4 GB this 4 GB that. They even ignored that this is 4 GB of HBM and just claimed that they worked the same as 4GB of GDDR5, while they are different architectures and behave differently.
> 
> This is 6GB of GDDR5
> 
> 
> And this is 4 GB of HBM
> 
> 
> Still want to compare them?


Well, if my GDDR5 has red smiling faces then I will stick with GDDR, thanks.

Is this the Sesame Street definition of how VRAM works? If so, I think the production team needs sacking for mis-educating kids. The horror.


----------



## rdr09

Quote:


> Originally Posted by *harney*
> 
> Well if my gddr 5 has red smiling faces then i will stick with gddr thx
> 
> Is this sesame street definition of how vram works if so i think the production team needs sacking mis educating kids the horror


You know which manufacturer shipped the first GPU with GDDR5, right?


----------



## Ganf

Quote:


> Originally Posted by *FreeElectron*
> 
> Was it stutter (frametime) tested to show that it has no issue?
> lolwat?


Yeah, they were tested for stutter. Fantastic tests too.







Some of them show frametimings on par with Nvidia, others are completely borked with no rhyme or reason, having some of the Nvidia cards fail miserably in the tests also, and no attempt to try and find where the fault is.

There are maybe 3-4 reviews out there that were done well, and they're all consistent in their results. Every other review has different results and obvious mistakes despite similar hardware (I haven't seen one yet that didn't test on a 5960X with 16GB of DDR4, etc.). It's pathetic.

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html

http://hexus.net/tech/reviews/graphics/84170-amd-radeon-r9-fury-x-4gb/?page=9

Tomshardware is one of the cleaner tests, and comments that the Fury's performance was indistinguishable from the 980ti and Titan X at 4k on SoM. Hexus points out that at those settings all cards have hitching problems, but the Fury X's are a little more noticeable. Hexus speculates that it could be VRAM issues, but when you look at Tom's frametime graph you can see it's just the usual trademark AMD frametime spikes. Nothing to do with the VRAM.


----------



## Kand

Quote:


> Originally Posted by *pengs*
> 
> Oh hai, a Titan X in a _gaming loop_
> 
> 
> 
> looks for nearest exit


That's the RAM hitting 100c, not the VRMs.

Difference.


----------



## blue1512

On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.



And guys at Guru3D are overvolting the core like hell.
So far so good for "overclocker's dream"


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> 
> 
> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"


95MHz over stock isn't really going to set the world on fire is it?


----------



## Blameless

Quote:


> Originally Posted by *Kand*
> 
> That's the RAM hitting 100c, not the VRMs.
> 
> Difference.


This is even worse, as GDDR5 is normally rated for 85-95C, while VRM components are generally rated for 125C+.

Memory ICs are generally easier to cool though.


----------



## Malinkadink

Quote:


> Originally Posted by *Kane2207*
> 
> 95MHz over stock isn't really going to set the world on fire is it?


Being that it's AMD it just might







(couldn't resist)


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> 95MHz over stock isn't really going to set the world on fire is it?


I'm seriously disappointed with some OCN members' reading skills. Didn't I write HBM? Couldn't you see a 20% OC from 500MHz to 600MHz on the memory?


----------



## DFroN

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> 
> 
> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"


Could you link to where people are overvolting please?

Couldn't find it


----------



## harney

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> 
> 
> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"


It's a start at least for the Fury. What is the 3DMark Fire Strike score for a Ti at default and a Ti OC'd, on the same CPU, RAM, etc. as above?


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> 
> 
> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"


I'd like to see a validated version of that score.

And they say they are using CCC to overclock the memory?
Quote:


> Again when writing this review the only means of changing the settings on the Fury X is through the Catalyst Control Center. We can change the frequencies for both GPU and memory as well as the power limit in this software, but it is not possible to change the voltage settings.


Link for the overvolting?


----------



## hamzta09

Quote:


> Originally Posted by *DividebyZERO*
> 
> These monitors are mostly relevant in 'FPS competitive shooters,people have been attaining 120fps there for quite a while by turning detail low. Or in the case of CSGO triple that amount.


They're relevant in ANYTHING, not just FPS games. Simply scrolling a document means you can still read it on a 120Hz monitor, not so much on a 60Hz one.

I can't play any game on a 60Hz monitor due to the EXTREME motion blur/ghosting on them. A "1ms" 60Hz display is complete and utter crap compared to 120Hz in anything. Heck, Minecraft alone is a blurfest on a 60Hz monitor.

http://www.testufo.com/#test=framerates-marquee
http://www.testufo.com/#test=framerates-text

I get a headache looking at this on a 60Hz monitor.
http://www.testufo.com/#test=framerates

At 120Hz with BBR/LightBoost you can follow this guy's eyes with your own eyes. No blur.
http://www.testufo.com/#test=photo&photo=quebec.jpg&pps=960&pursuit=0&height=0

So 120Hz isn't just for FPS gaming; it's for anything that's in motion. Simply moving your mouse pointer around is so much nicer.
Quote:


> Originally Posted by *Alatar*
> 
> 1080p is definitely still relevant because 120Hz and 144Hz monitors are still a thing that people use.
> 
> And honestly even the 980Ti is far from maintaining those framerates at 1080p.


Don't bother; these people think 1080p is for scrubs and they'd rather play at 4K at 60Hz with smearing.
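To put rough numbers on the refresh-rate argument, here is a simple sketch assuming an idealized sample-and-hold display, where perceived smear while eye-tracking scales with how long each frame persists on screen:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time each frame stays on screen on a sample-and-hold display."""
    return 1000.0 / refresh_hz

def blur_width_px(pan_speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate smear width while eye-tracking a moving object."""
    return pan_speed_px_per_s * persistence_ms / 1000.0

# The testufo photo test linked above pans at 960 pixels/second.
for hz in (60, 120, 144):
    print(f"{hz:3d} Hz: {frame_time_ms(hz):5.2f} ms/frame, "
          f"~{blur_width_px(960, frame_time_ms(hz)):.0f} px of smear at 960 px/s")
```

This is also why strobed backlights (LightBoost and the like) help: they cut persistence below the frame time, shrinking the smear further than the refresh rate alone would.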


----------



## Ganf

Quote:


> Originally Posted by *Kand*
> 
> That's the RAM hitting 100c, not the VRMs.
> 
> Difference.


Yeah, the difference is that the typical thermal limit for GDDR5 is below that, whereas VRMs can take it.

http://www.micron.com/products/dram/gddr5
Quote:


> Originally Posted by *Kane2207*
> 
> 95MHz over stock isn't really going to set the world on fire is it?


You missed the important part of that picture.
*The HBM is OC'd by 100MHz.*


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> I'm seriously disappointed with some OCN's members reading skill. Didn't I write HBM? Couldn't you see a 20% OC from 500MHz to 600 MHz on mem?


Huh?
Quote:


> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"rolleyes.gif


I'm sorry, that statement made me think you were referring to overclock performance in general - to which I stated less than 10% over stock isn't something to write home about.

I'll agree with the overclockers dream element when we see a good 200MHz plus on the core.


----------



## hamzta09

Quote:


> Originally Posted by *Ganf*
> 
> Yeah, the difference is your average Thermal Limit for GDDR5 is below that, whereas VRM's can take it.
> 
> http://www.micron.com/products/dram/gddr5


Doesn't the Titan X come with a crappy reference cooler?

Wonder what it would be like with an AIO or triple slot fan.


----------



## Noufel

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> These monitors are mostly relevant in 'FPS competitive shooters,people have been attaining 120fps there for quite a while by turning detail low. Or in the case of CSGO triple that amount.
> 
> 
> 
> They're relevant in ANYTHING. Not just FPS games. Simply scrolling a document means you can read it on a 120hz monitor, not so much on a 60hz
> 
> I cant play any game on a 60hz monitor due to the EXTREME motionblur/ghosting on them. A "1ms" 60hz display is complete and utter crap compared to 120hz in anything. Heck Minecraft alone is a blurfest on a 60hz monitor.
> 
> http://www.testufo.com/#test=framerates-marquee
> http://www.testufo.com/#test=framerates-text
> 
> I get a headache looking at this on a 60hz monitor.
> http://www.testufo.com/#test=framerates
> 
> At 120hz with BBR/Lightboost you can follow this guys eyes with your own eyes. No blur.
> http://www.testufo.com/#test=photo&photo=quebec.jpg&pps=960&pursuit=0&height=0
> 
> So 120hz isnt just for "fps gaming" its for anything thats in motion. Simply moving your mousepointer around is so much nicer.
> Quote:
> 
> 
> 
> Originally Posted by *Alatar*
> 
> 1080p is definitely still relevant because 120Hz and 144Hz monitors are still a thing that people use.
> 
> And honestly even the 980Ti is far from maintaining those framerates at 1080p.
> 
> 
> Dont bother, these people think 1080p is for scrubs and they rather play at 4K with smearing and 60hz.

All this. Once you try 144Hz you simply can't go back to 60Hz.


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> Doesnt Titan X come with a crappy reference cooler?
> 
> Wonder what it would be like with an AIO or triple slot fan.


The EVGA Superclocked 980 Ti was at 75C stock on the back of the PCB; big difference.

275W is 275W, folks. It doesn't matter whose card it is; the heat's gotta go somewhere. None of these cards are gonna be cool to the touch.


----------



## blue1512

Quote:


> Originally Posted by *Forceman*
> 
> I'd like to see a validated version of that score.
> 
> And they say they are using CCC to overclock the memory?
> Link for the overvolting?


Yes, they used the buggy CCC to overclock the memory. AMD didn't allow that in the normal CCC, but from this I believe the next Afterburner will put the HBM to the moon.

As for overvolt, they are using this
http://forums.guru3d.com/showthread.php?t=399542


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> As for overvolt, they are using this
> http://forums.guru3d.com/showthread.php?t=399542


I meant links to overclocking results with the overvolt. How well is it working?


----------



## blue1512

Quote:


> Originally Posted by *Forceman*
> 
> I meant links to overclocking results with the overvolt. How well is it working?


Ask them, I just said that they are overvolting


----------



## sugarhell

Lol, he is lying. No one has overvolted with AB.


----------



## Neon Lights

Quote:


> Originally Posted by *blue1512*
> 
> Ask them, I just said that they are overvolting


Where exactly did you read on the Guru3D forums that someone is overvolting Fury Xs? I have searched those forums and did not find anything.


----------



## blue1512

Quote:


> Originally Posted by *Neon Lights*
> 
> Where exactly did you read on the Guru3d forums that someone is overvolting Fury Xs? I have searched those forums an did not find anything.


*Ask* them, dude. They are a forum, not a secret cult.


----------



## Neon Lights

Quote:


> Originally Posted by *blue1512*
> 
> *Ask* them, dude. They are a forum, not a secret group of cult.


Of the hundreds of members? I mean, you have to know this from somewhere; name your source then.


----------



## Tivan

Quote:


> Originally Posted by *blue1512*
> 
> *Ask* them, dude. They are a forum, not a secret group of cult.


Why would we go make a forum thread on Guru3D to ask if anyone has a working tool that can overvolt the Fury X, when no one anywhere seems to have one?

We're asking you, since you casually implied that someone might have found a way to overvolt, though going by your reaction I'll take it you just guessed (or implied that they're overvolting cards that aren't the Fury X, but we do that too).

Which in turn I'd take to mean that nobody is overvolting Fury Xs yet, at least not with a public tool. (The MSI Afterburner devs and people like that might of course have an experimental build with voltage control that'll lead to a release sometime, but yeah.)

No offense, don't worry about it~


----------



## blue1512

Quote:


> Originally Posted by *Tivan*
> 
> Why would we go make a forum thread on guru3d to ask if there's anyone who has a working tool that can overvolt the FuryX, when no one anywhere seems to have one.
> 
> We're asking you since you casually implied that someone might have found a way to overvolt, though going by your reaction I'll take you just guessed.
> 
> Which in turn I'd take to mean that there's nobody overvolting FuryX's yet, at least not with a public tool. No offense, don't worry about it~


No worry. It will come to light soon, so just take it as a hint first


----------



## 2010rig

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> *And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream*"


Quote:


> Originally Posted by *blue1512*
> 
> *Ask* them, dude. They are a forum, not a secret group of cult.


If they are "overvolting the core like hell", you would think you'd have at least 1 link to back up your statement.

Yet another user not to take seriously.


----------



## Neon Lights

Quote:


> Originally Posted by *Tivan*
> 
> Which in turn I'd take to mean that there's nobody overvolting FuryX's yet, at least not with a public tool. (the MSI AfterBurner devs and people like that might of course have an experimental build with voltage control that'll lead to a release sometime, but yeah)


Does not look good (in regards to Afterburner): http://forums.guru3d.com/showpost.php?p=5106990&postcount=44


----------



## GorillaSceptre

Quote:


> Originally Posted by *blue1512*
> 
> No worry. It will come to light soon, so just take it as a hint first


Wth are you talking about?


----------



## Kylar182

I've looked everywhere and I can't find the max refresh rate output of this card. 980s and Titan Xs are supposedly capped at 240Hz. I have 3 ROG Swift 144Hz 1440p monitors and would swap if it had a higher refresh rate cap.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> No worry. It will come to light soon, so just take it as a hint first


This is the same type of comment made on any Fury X hype thread 10 days ago lol


----------



## szeged

"just wait guys, you wont be dissapointed"


----------



## Neon Lights

Quote:


> Originally Posted by *szeged*
> 
> "just wait guys, you wont be dissapointed"


Have fun with your Titan Xs.


----------



## looniam

Because I'll forget what I am waiting for...


----------



## ZealotKi11er

Quote:


> Originally Posted by *szeged*
> 
> "just wait guys, you wont be dissapointed"


It's not like anything new is going to come out.


----------



## szeged

Quote:


> Originally Posted by *Neon Lights*
> 
> Have fun with your Titan Xs.


I'm having lots of fun, ty.


----------



## ZealotKi11er

Quote:


> Originally Posted by *szeged*
> 
> im having lots of fun ty.


You have been having fun while fanboys are killing each other over the GTX 980 Ti and Fury X.


----------



## Neon Lights

Quote:


> Originally Posted by *szeged*
> 
> im having lots of fun ty.


That is probably why you post in a Fury X thread.


----------



## Kane2207

Quote:


> Originally Posted by *Ganf*
> 
> You missed the important part of that picture.
> *The HBM is OC'd by 100mhz.*


I saw that. My understanding of AMD's implementation of HBM is that third-party applications will let you tinker with the speed, but it doesn't actually apply any change at the base level. Is this no longer the case?

Wasn't there also a massive thread last week with users defending AMD's decision to lock HBM overclocking anyway? Has that now also changed?


----------



## Noufel

Any news on unlocked voltage on the Fury X?


----------



## Thoth420

Quote:


> Originally Posted by *hamzta09*
> 
> They're relevant in ANYTHING. Not just FPS games. Simply scrolling a document means you can read it on a 120hz monitor, not so much on a 60hz
> 
> I cant play any game on a 60hz monitor due to the EXTREME motionblur/ghosting on them. A "1ms" 60hz display is complete and utter crap compared to 120hz in anything. Heck Minecraft alone is a blurfest on a 60hz monitor.
> 
> http://www.testufo.com/#test=framerates-marquee
> http://www.testufo.com/#test=framerates-text
> 
> I get a headache looking at this on a 60hz monitor.
> http://www.testufo.com/#test=framerates
> 
> At 120hz with BBR/Lightboost you can follow this guys eyes with your own eyes. No blur.
> http://www.testufo.com/#test=photo&photo=quebec.jpg&pps=960&pursuit=0&height=0
> 
> So 120hz isnt just for "fps gaming" its for anything thats in motion. Simply moving your mousepointer around is so much nicer.
> Dont bother, these people think 1080p is for scrubs and they rather play at 4K with smearing and 60hz.


I agree. The only game I play at 60Hz is Skyrim; everything else is noticeably better at 120Hz. I notice no real difference between 144 and 120, but 144 seems to be all the rage.

I also can't go back to 1080p, and there are very limited options at the moment for 1440p past 60Hz. If you name it, I have probably already tried it, and keeping my Acer G-Sync monitor without using an Nvidia card is just pointless. More adaptive-sync 1440p 120Hz+ displays need to roll out that don't suffer issues rendering them pointless for their main task: gaming.


----------



## Neon Lights

Quote:


> Originally Posted by *Thoth420*
> 
> I agree. The only game I play at 60hz is Skyrim everything else is noticeably better at 120hz. I notice no real difference between 144 and 120 but 144 seems to be all the rage.
> 
> I also can't go back to 1080 and there are very limited options at the moment for 2k past 60hz. If you name it I probably already tried it and keeping my ACER G Sync without using an Nvidia card is just pointless. More adaptive sync 1440 120hz+ displays need to roll out that don't suffer issues rendering them pointless for their main task..gaming.


I am playing at "240Hz", haha. Only Full HD, but with the second-best contrast ratio of all computer monitors: the Eizo FG2421.


----------



## Ganf

Quote:


> Originally Posted by *Kane2207*
> 
> I saw that, my understanding of AMDs implementation of HBM is that 3rd party applications will let you tinker with the speed but it doesn't actually apply any change at the base level. Is this no longer the case?
> 
> Wasn't there also a massive thread last week with users defending AMDs decision to lock HBM overclocking anyway? Has that now also changed?


It has, if people really are overclocking it by 20% with no noticeable drawbacks and seeing a benefit in frames. However, that pic is from hardware.info's review of the card; I wasn't paying attention to the fact that their logo is in the bottom right. Nothing new has developed on this end. From all the evidence available, it still looks like HBM won't overclock well and provides no benefit when it does.


----------



## Neon Lights

Quote:


> Originally Posted by *Ganf*
> 
> It has if people are overclocking it by 20% with no noticeable drawbacks and seeing a benefit in frames. However that pic is from hardware.info's review of the card. I wasn't paying attention to the fact that their logo is in the bottom right. Nothing new has developed on this end. From all evidence available it still looks like HBM won't overclock well and provide no benefits while doing so.


What is wrong if a hardware.info logo is in the screenshot?


----------



## sugarhell

Quote:


> Originally Posted by *Ganf*
> 
> It has if people are overclocking it by 20% with no noticeable drawbacks and seeing a benefit in frames. However that pic is from hardware.info's review of the card. I wasn't paying attention to the fact that their logo is in the bottom right. Nothing new has developed on this end. From all evidence available it still looks like HBM won't overclock well and provide no benefits while doing so.


HBM's base frequency is 500 MHz, so a 100 MHz OC is not a small amount when you consider we're talking about a 4096-bit bus interface. Still, HBM's clock should max out around 650-700 MHz.
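For context, HBM bandwidth scales linearly with clock, so the back-of-the-envelope math looks like this (a rough sketch in Python; the 4096-bit bus and double data rate are from AMD's published Fiji specs):

```python
# Peak bandwidth of a double-data-rate memory interface:
# clock (MHz) * 2 transfers/clock * bus width (bits) / 8 bits-per-byte
def bandwidth_gbs(clock_mhz: float, bus_bits: int = 4096) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return clock_mhz * 1e6 * 2 * bus_bits / 8 / 1e9

print(bandwidth_gbs(500))  # stock Fury X HBM: 512.0 GB/s
print(bandwidth_gbs(600))  # with the reported +100 MHz OC: 614.4 GB/s
```

So a 20% memory OC is a lot of extra raw bandwidth, which is exactly why people doubt the card is bandwidth-starved enough to benefit.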


----------



## Kane2207

Quote:


> Originally Posted by *Neon Lights*
> 
> What is wrong if a hardware.info logo is in the screenshot?


Nothing, here's the quote from their review:
Quote:


> Officially there is no option to adjust the memory clocks, but (probably due to a bug in the drivers) we did see a slider to adjust memory clock frequency after every other few reboots. Hence, we could also raise the memory clock.


So it looks more like a bug than an actual repeatable overclocking feature. Either way, overclocking HBM appears to bring little or no gain.


----------



## Neon Lights

Quote:


> Originally Posted by *Kane2207*
> 
> Nothing, here's the quote from their review:
> Quote:
> 
> 
> 
> Officially there is no option to adjust the memory clocks, but (probably due to a bug in the drivers) we did see a slider to adjust memory clock frequency after every other few reboots. Hence, we could also raise the memory clock.
> 
> 
> 
> So it looks more likely a bug than an actual repeatable overclock result. Either way, overclocking HBM appears to bring little or no gain.
Click to expand...

Ah, okay. Thank you.
Quote:


> In the end we managed to get a 3DMark Fire Strike score of 16963 points on overclocked settings, a nice increase on the standard 14098 points we achieved.


Why do you say that there is "little or no gain"? More than 2000 points of difference probably would not come from only 95MHz more GPU clock, would it?


----------



## looniam

Quote:


> Originally Posted by *Noufel*
> 
> Any news on unlocked voltage on the furyX?


http://forums.guru3d.com/showpost.php?p=5106990&postcount=44


----------



## Kane2207

Quote:


> Originally Posted by *Neon Lights*
> 
> Why do you say that there is "little or no gain"? More than 2000 points difference probably would not come from only 95MHz more GPU clock, would they not?


I believe the core clock could account for most of that 2000 points; 95MHz is still around 10% over stock.

There were several threads last week where users reported very minimal gains when overclocking the VRAM on Hawaii, and I would assume Fiji is similar seeing as they're both GCN.

The typical approach with VRAM is to go either fast or wide, and HBM is wide.

That's not to say overclocking VRAM brings no benefit. If you're benching, where every frame counts, then adding speed there will no doubt add to the score, but it's a secondary consideration after you've wrung every last drop out of the core.

Even with that extra 100MHz and the 95 on the core, it's worth noting that the 980ti they benched scored around 3000 points more.

A voltage unlock and custom water is probably the way to go with Fiji.
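To put numbers on the "around 10%" point, a quick sketch (Python; this assumes the Fury X's 1050 MHz reference core clock and naive linear scaling, which real benchmarks rarely achieve):

```python
STOCK_CORE_MHZ = 1050  # Fury X reference core clock

def oc_percent(stock_mhz: float, delta_mhz: float) -> float:
    """Overclock delta as a percentage of the stock clock."""
    return delta_mhz / stock_mhz * 100

core_oc = oc_percent(STOCK_CORE_MHZ, 95)
print(f"+95 MHz core = {core_oc:.1f}% OC")  # ~9.0%

# Naive upper bound: if the score scaled linearly with core clock,
# a 14098-point run would gain at most roughly this many points from core alone.
print(f"max core-only gain ~ {14098 * core_oc / 100:.0f} points")
```

Even that optimistic linear estimate doesn't cover the full 2000+ point jump, which is why the combined-score numbers need a closer look.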


----------



## Neon Lights

Quote:


> Originally Posted by *looniam*
> 
> http://forums.guru3d.com/showpost.php?p=5106990&postcount=44


I posted that already. Is there really only one person working on RivaTuner?


----------



## Neon Lights

Quote:


> Originally Posted by *Kane2207*
> 
> A voltage unlock and custom water is probably the way to go with Fiji.


That's what I am going to do.


----------



## Noufel

Quote:


> Originally Posted by *looniam*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> Any news on unlocked voltage on the furyX?
> 
> 
> 
> http://forums.guru3d.com/showpost.php?p=5106990&postcount=44
Click to expand...

Thnx








So we'll see what the Fury X can do with full voltage control on Win 10 in a few weeks.


----------



## toncij

Quote:


> Originally Posted by *Kane2207*
> 
> I believe the core clock could account for most if that 2000 points, 95MHz is still around 10% over stock.
> 
> There were several threads last week where users reported very minimal gains when overclocking the VRAM on Hawaii, I would assume Fiji is similar seeing as they're both GCN.
> 
> The typical thing with VRAM appears to be to go either fast or wide, and HBM is wide.
> 
> That's not to say overclocking VRAM brings no benefits, if you're looking to bench where every frame counts then adding speed there will add to the bench no doubt, it's just a secondary consideration after you've rung every last drop out of the core.
> 
> Even with that extra 100MHz and the 95 on the core, it's worth noting that the 980ti they benched scored around 3000 points more.
> 
> A voltage unlock and custom water is probably the way to go with Fiji.


Well, fast and wide don't necessarily produce the same effect, though they can. But the small difference mostly comes from the fact that we need far more GPU performance than memory bandwidth.


----------



## Thoth420

Quote:


> Originally Posted by *Neon Lights*
> 
> I am playing in "240Hz", haha. Only Full HD but second best contrast ratio of all computer monitors. Eizo FG2421


I considered giving it a try, but after not seeing much difference past 120Hz in actual use I decided not to bother. I would love it if they made a 120Hz 1440p model.


----------



## Forceman

Quote:


> Originally Posted by *Kane2207*
> 
> I believe the core clock could account for most if that 2000 points, 95MHz is still around 10% over stock.


Tough to compare different systems, but here's a Fury X @ 1125 on a 5960X @ 4.4 (compared to the 1145/4.5) that had a graphics score of 16838, versus 19321 for that HBM result, which is about a 15% increase. But I'd still like to see a validated result. You'd think they'd have run a stock-memory pass to compare.

http://www.3dmark.com/fs/5224465
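For anyone checking the math, the graphics-score comparison works out like this (simple Python; the two scores are the ones quoted above and come from different systems, so treat the result as indicative only):

```python
def percent_increase(baseline: float, result: float) -> float:
    """Relative gain of `result` over `baseline`, in percent."""
    return (result / baseline - 1) * 100

# Fury X @ 1125 core with stock HBM vs. the 1145/600 HBM-OC result
gain = percent_increase(16838, 19321)
print(f"{gain:.1f}% higher graphics score")  # ~14.7%
```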


----------



## rusirius

I have read this entire thread. One thing comes to mind: NVIDIA released the 980 Ti intentionally to disrupt the Fury X. There is also a price gap between the GTX 970 and GTX 980, and AMD has introduced the Nano. NVIDIA will respond with a Nano-like card, something along these lines: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125706


----------



## ToTheSun!

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Neon Lights*
> 
> I am playing in "240Hz", haha. Only Full HD but second best contrast ratio of all computer monitors. Eizo FG2421
> 
> 
> 
> I considered giving it a try but after not seeing much difference past 120hz in actual use and decided not to bother. I would love it if they made a 120hz 1440
Click to expand...

Not to mention the FG2421 is not very reliable in terms of QC, and it has an average 90% sRGB coverage after calibration.

It was a neat monitor when it came out, but i think people should let it die a market death.


----------



## harney

Quote:


> Originally Posted by *rusirius*
> 
> I had read this entire thread. one thing comes to mind when NVIDIA released the 980ti was intentionaly to disrupt FURY X. Well there is a price gap between gtx970 and gtx980 and AMD has introduced nano.NVIDIA will respond with a nano like card, something along these lines. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125706


Wow, I'm impressed you read the whole thread.

Yes, there will be more small cards, most likely like the one you posted and maybe more, arriving mainly in the 970/980 range. Personally, I have a hunch NVIDIA is going to go for a left-hook blow on AMD and cut prices across their entire range worldwide, US & EU.

I have started to notice the 970 dropping slowly along with the 980 here in the UK, so I am hoping the Ti will follow shortly too.


----------



## rusirius

Quote:


> Originally Posted by *harney*
> 
> Wow impressed you read the whole thread
> 
> Yes there will be more small cards but most likely what you have posted here and maybe more will arrive mainly 970 ....personally i have a hunch nvidia is going going to go for a left hook blow on amd and reduce all there range US & EU worldwide ..


wrong quote


----------



## HeadlessKnight

From HardOCP review

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11#.VZB54PmqpBc
Quote:


> The 980 Ti is not strong enough to push that many pixels with an acceptable level of image quality, and certainly the Fury X is not. *Only the TITAN X comes the closest as a single-GPU video card to allowing an "OK" 4K gaming experience.*












I find that part pretty funny. They are talking like the Titan X is significantly faster than the 980 Ti, and that a 3% clock-for-clock advantage is going to make a world of difference, transforming your experience from "unacceptable" to "OK". Unless the 980 Ti struggles with VRAM capacity, the maximum playable settings shouldn't change between the two cards. Either Nvidia wrote them a huge cheque under the table, or they are simply kissing Nvidia's ass so they continue to get free GPUs.
It might not be 100% on topic, but it is lol-worthy.


----------



## rusirius

Quote:


> Originally Posted by *HeadlessKnight*
> 
> From HardOCP review
> 
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11#.VZB54PmqpBc
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find that part pretty funny. They are talking like the Titan X is significantly faster than 980 Ti, and that 3% advantage clock for clock is going to make a world of difference that is going to transfer your experience from "unacceptable" to "OK". Unless the 980 Ti struggles with VRAM capacity, the maximum playable settings shouldn't change between the two cards. It is either that Nvidia wrote them a huge cheque under the hood, or they are simply kissing Nvidia's ass, so they continue to get free GPUs.
> It might not be 100% on topic but it is lol worthy.


4K gaming is not there yet on a single card built on 28nm; a smaller node will be the game changer. So they are correct in saying "not yet".


----------



## 2010rig

Quote:


> Originally Posted by *HeadlessKnight*
> 
> From HardOCP review
> 
> http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/11#.VZB54PmqpBc
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find that part pretty funny. They are talking like the Titan X is significantly faster than 980 Ti, and that 3% advantage clock for clock is going to make a world of difference that is going to transfer your experience from "unacceptable" to "OK". Unless the 980 Ti struggles with VRAM capacity, the maximum playable settings shouldn't change between the two cards. It is either that Nvidia wrote them a huge cheque under the hood, or they are simply kissing Nvidia's ass, so they continue to get free GPUs.
> It might not be 100% on topic but it is lol worthy.


You do realize that [H] has a history of being pro AMD, right?

They've even held AMD sponsored events in the past.


----------



## Kane2207

Quote:


> Originally Posted by *Forceman*
> 
> Tough to compare different systems, but here's a Fury X @ 1125 on a 5960X @ 4.4 (compared the 1145/4.5) that had a graphics score of 16838 versus 19321 for that HBM result, which is about a 15% increase. But I'd still like to see a validated result. You'd think they'd have a run a stock memory run to compare.
> 
> http://www.3dmark.com/fs/5224465


Yep, my mistake, I was looking at the combined overclock result on hardware.info rather than the graphics score.

I assume hardware.info's quoted Ti score is on the same setup, and that they're quoting the combined score for the Ti they reference.


----------



## rusirius

Quote:


> Originally Posted by *2010rig*
> 
> You do realize that [H] has a history of being pro AMD, right?
> 
> They've even held AMD sponsored events in the past.


True statement. QFT


----------



## rusirius

Quote:


> Originally Posted by *rusirius*
> 
> QFT


Kyle has been a part of Gaming Evolved presentations. Kyle calling them out is not something to dismiss.


----------



## Neon Lights

Quote:


> Originally Posted by *ToTheSun!*
> 
> Not to mention the FG2421 is not very reliable in terms of QC, and it has an average 90% sRGB coverage after calibration.
> 
> It was a neat monitor when it came out, but i think people should let it die a market death.


When playing video games, color accuracy matters little. If you ask me, apart from the Full HD resolution, it is the best monitor for gaming at the moment.


----------



## Kane2207

Quote:


> Originally Posted by *2010rig*
> 
> You do realize that [H] has a history of being pro AMD, right?
> 
> They've even held AMD sponsored events in the past.


If you search Kitguru you can find a review of the 7970 where they laud it as the undisputed king of single-GPU performance too.

There is a disturbing trend where any review site that doesn't lean positively in AMD's favour is automatically labelled as biased. It's led to many a circular argument on OCN.

The other thread regarding coil whine and pump noise descended into chaos quite quickly, with claims of bias bandied around, which is slightly odd. Those reports come from what I would assume are AMD's most faithful followers, seeing as they purchased the card on day one. Unless we're really claiming that Nvidia fanboys are buying $649 GPUs just to pick a fault with them and paint AMD in a bad light?

That would be quite the conspiracy theory if OCN users believe that is the case.


----------



## rusirius

https://www.youtube.com/watch?v=1Uzhf9lcflg amd with kyle bennett. [H]ocp was the harshest on the web.


----------



## Tivan

Quote:


> Originally Posted by *Kane2207*
> 
> There is a disturbing trend where any review site that doesn't lean positively in AMD's favour is automatically labeled as biased. It's lead to many a circular argument on OCN.


I think there's a trend of hardware news sites jumping on anything they can hype or criticize for clicks, but it really isn't that worrying. It's just what happens when you aren't actually employed on a fixed wage by a site. I don't think this is what leads to circular arguments on OCN, though.
Quote:


> The other thread regarding coil whine and pump noise descended into chaos quite quickly where claims of bias were banded around, which is slightly odd.


I'd blame the fact that the article had a seriously clickbait headline, and that the thread opener was quite biased in his wording, at points not giving AMD basic respect.
Basically, that thread saw 10 times more posts than was good for it, due to its provocative nature and its being posted in the news section, which increased exposure. It just wasn't set up to become an organized effort at verifying who does or doesn't have issues with the acoustics, especially once people started arguing that YouTube videos are factual evidence and that anecdotal evidence from owners is impossible to make any use of.


----------



## Ganf

Quote:


> Originally Posted by *rusirius*
> 
> https://www.youtube.com/watch?v=1Uzhf9lcflg amd with kyle bennett. [H]ocp was the harshest on the web.


Pretty hilarious that people are willing to call reviewers biased without reading the review to find that they're damning the product they're supposed to be biased in favor of.









Lack of effort is how I would label most of the reviews. It's like they can't be bothered to compare their results with their previous work and determine if anything is off, needs to be re-evaluated, or is unusual. You see them screwing up just as many Nvidia benchmarks as AMD ones, but no one cares about the Nvidia screwups because people are buying those cards anyway. Point out where they screwed up an AMD test, though, and you're a shill.

One month a 980 tests out 20% above a 290X, the next month the 290X is within 10%...


----------



## rusirius

Quote:


> Originally Posted by *Ganf*
> 
> Pretty hilarious that people are willing to call reviewers biased without reading the review to find that they're damning the product they're supposed to be biased in favor of.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Lack of effort, is how I would label most of the reviews. It's like they can't be bothered to compare their results with their previous work and determine if anything is off, needs to be re-evaluated or if there is something unusual about the results. You see them screwing up just as many Nvidia benchmarks as you do AMD, but no one cares about the Nvidia screwups because they're buying the cards anyways. Point out where they screwed up an AMD test though and you're a shill.
> 
> One month a 980 is testing out 20% above a 290x, the next month the 290x is within 10%...


Exactly. Kyle called it like he saw it, and he said some harsh things. Kyle won't be getting AMD cards anymore.


----------



## blue1512

Another HBM overclock








http://www.techpowerup.com/gpuz/details.php?id=9pdzh


----------



## Ganf

Quote:


> Originally Posted by *blue1512*
> 
> Another HBM overclock
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


Means nothing without the benchmarks showing that it's actually in effect and changing the performance of the card.


----------



## rusirius

Quote:


> Originally Posted by *blue1512*
> 
> Another HBM overclock
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


When Unwinder says he doesn't have a card to make overvolting happen, believe him, not some GPU-Z screenshot.


----------



## blue1512

Quote:


> Originally Posted by *rusirius*
> 
> When unwinder says he doesent have a card to make overvolting happen believe him not some gpuz screenshot.


Is there anything in my post related to Unwinder? FYI, he is behind Afterburner, but he is not the only one who can master it. And it's not a GPU-Z screenshot, it's a GPU-Z *validation link.*
Quote:


> Originally Posted by *Ganf*
> 
> Means nothing without the benchmarks showing that it's actually in effect and changing the performance of the card.


I did post the Hardware.info test, right? They got 1145/600MHz (9% on core, *20% on mem*) stable and a *20% higher Fire Strike score.*


----------



## rusirius

Quote:


> Originally Posted by *blue1512*
> 
> Is there anything in my post related to unwinder? FYI he is behind afterburner but he is not the only one who can master it. And it's not a GPUz screenshot. It's a GPUz *validation link.*
> I thought I post the test of Hardware.info, right? They have 1145/600MHz (9% on core, *20% on mem*) stable and *20% increase score in Firestrike*


I believe him; he writes the programs. Sorry, his credibility is well known.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Is there anything in my post related to unwinder? FYI he is behind afterburner but he is not the only one who can master it.
> I thought I post the test of Hardware.info, right? They have 1145/600MHz (9% on core, *20% on mem*) stable and *20% increase score in Firestrike*


But we have no idea how much that increase in HBM is contributing to that score, plus the method of enabling the slider is to reboot your PC a random number of times in the hope of triggering the bug.

Even then, it could just be a trivial graphical glitch; there's no indication that the memory overclock is actually being applied.

You need a user with a stock core clock to trigger the glitch, run the bench, and then run it a second time without the bug, to see definitively whether the memory clock actually has an impact and to what degree.


----------



## rusirius

Quote:


> Originally Posted by *Kane2207*
> 
> But we have no idea how much that increase in HBM is contributing to that score, plus the method of enabling the slider is to reboot your PC a random number of times in the hope of inducing the bug.
> 
> Even then, it could just be a trivial graphical glitch. There's no indication that the memory overclock is actually being applied at this time.
> 
> You need a user with a stock core clock to trigger the glitch, run the bench and then run the bench a second time without the bug to see definitively whether the memory clock actually has an impact and to what degree.


It is a bug; you got it right.


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> But we have no idea how much that increase in HBM is contributing to that score, plus the method of enabling the slider is to reboot your PC a random number of times in the hope of inducing the bug.
> 
> Even then, it could just be a trivial graphical glitch. There's no indication that the memory overclock is actually being applied at this time.
> 
> You need a user with a stock core clock to trigger the glitch, run the bench and then run the bench a second time without the bug to see definitively whether the memory clock actually has an impact and to what degree.


Quote:


> Originally Posted by *rusirius*
> 
> it is a bug you got it right.


I can't believe that I'm posting this on OCN.
AMD locked HBM and there is a way to OC it anyway; is that not big enough news for overclockers? Utterly disappointed.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> I can't believe that I'm posting in OCN.
> AMD locked HBM and there is a way to OC, is it not big enough for overlockers? Utterly disappointed.


No, you don't seem to understand that a random GPU-Z screenshot isn't conclusive proof of an HBM overclock.


----------



## rusirius

Quote:


> Originally Posted by *blue1512*
> 
> I can't believe that I'm posting in OCN.
> AMD locked HBM and there is a way to OC, is it not big enough for overlockers? Utterly disappointed.


Unless Unwinder unlocks it, don't believe it. He is saying he does not have a Fiji card.


----------



## Ganf

Quote:


> Originally Posted by *blue1512*
> 
> Is there anything in my post related to unwinder? FYI he is behind afterburner but he is not the only one who can master it. And it's not a GPUz screenshot. It's a GPUz *validation link.*
> I thought I post the test of Hardware.info, right? They have 1145/600MHz (9% on core, *20% on mem*) stable and *20% increase score in Firestrike*


You posted one benchmark with no stock benchmark to compare against on the same machine. To determine whether overclocking the HBM had any effect at all, and if you are going to use 3DMark, which is the worst possible benchmark you could use to show an improvement in VRAM speed, you need to show the graphics score at stock settings, a graphics score with only the memory overclocked, and a graphics score with both the HBM and core OC'd, to determine if the improvement scales with the GPU.

You're missing over half the information needed to make any credible claim.


----------



## blue1512

Quote:


> Originally Posted by *rusirius*
> 
> unless unwinder unlocks it dont believe it. HE is saying he does not have a fiji card.


Why did you bring Unwinder into this, by the way? We are talking about HBM.
Quote:


> Originally Posted by *Kane2207*
> 
> No, you don't seem to understand that a random GPUZ screen isn't conclusive proof of a HBM overclock.


You didn't even bother to check the validation link, right? It's conclusive proof, and the clock must be somewhat stable to pass validation.
http://www.techpowerup.com/gpuz/details.php?id=9pdzh


----------



## blue1512

Quote:


> Originally Posted by *Ganf*
> 
> You posted one benchmark with no stock benchmark to compare against on the same machine. To determine whether overclocking the HBM had any effect at all, if you are going to use 3dmark with is the worst possible benchmark your could use to show an improvement in VRAM speed, you need to show the graphics score at stock settings, a graphics score with the memory overclocked only, and a graphics score with both the HBM and core OC'd to determine if the improvement scales with the GPU.
> 
> You're missing over half of the relevant information to make any credible claim.


I said I did post the test; you just didn't bother to dig through this thread:
http://uk.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overclocking-results
Quote:


> We ended up being able to change the GPU speeds by 95MHz to 1145 MHz and the memory by 100MHz. The latter seems a bit on the low side, but it is actually 20% on top of the standard frequencies this card uses.
> 
> In the end we managed to get a 3DMark Fire Strike score of 16963 points on overclocked settings, a nice increase on the standard 14098 points we achieved.


14098 to 16963 is a 20% jump, which can't be achieved with 95MHz on the core alone.
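The 20% figure checks out arithmetically (a one-liner in Python), though as others note, the overall Fire Strike score mixes the physics and combined tests with the graphics test, so the jump can't be attributed to any single clock:

```python
before, after = 14098, 16963  # hardware.info's stock vs. overclocked Fire Strike scores
jump = (after / before - 1) * 100
print(f"{jump:.1f}%")  # ~20.3%
```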


----------



## rusirius

Quote:


> Originally Posted by *blue1512*
> 
> Why did u bring unwinder here by the way? We are talking about HBM.
> You didn't even bother check the validation link, right? It's a conclusive proof. And the clock must be stable somewhat to pass the validation.
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


Unwinder says he does not have a card. END OF SCREENSHOTS.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Why did u bring unwinder here by the way? We are talking about HBM.
> You didn't even bother check the validation link, right? It's a conclusive proof. And the clock must be stable somewhat to pass the validation.
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


The link is irrelevant to a degree. You need two bench runs, both with a stock core: one with the HBM overclocked and one without.

If there is then a variation in score outside the margin of error, I'll believe it to be true.

A random GPU-Z validation doesn't prove the overclock is actually being applied.


----------



## rusirius

ALL VALID GPU OVERCLOCKING RESULTS USE HIS WORK. END OF DISCUSSION.


----------



## Ganf

Quote:


> Originally Posted by *blue1512*
> 
> I said I did post the test, just you didn't bother to dig this thread
> http://uk.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overclocking-results
> 14098 to 16963 is a 20% jump, which cant be achieved with 95MHz on the core alone


You're comparing the final score, not the graphics score. The final score is meaningless for AMD cards in 3DMark because of DX11 CPU overhead: the cards get destroyed in the combined test, which makes up a large part of the final score and can easily be manipulated by bouncing clocks around until you find a favorable combination between CPU and GPU.

I'm not implying the scores were manipulated, just pointing out that there is more to this than meets the eye.

http://www.vmodtech.com/main/wp-content/uploads/2015/06/23/amd/fsde.jpg

That's a similar machine tested at stock GPU settings, minus 100MHz on the CPU. If you know anything about Fire Strike, AMD cards, and overclocking, you know that VRAM speed has nothing to do with the combined test, and that 10% on the card plus 2% on the CPU doesn't make for a 20% increase in the combined score. Ever.

Fire Strike has not been patched to work properly with HBM; they haven't even added the card to their list of supported hardware.


----------



## rt123

Quote:


> Originally Posted by *rusirius*
> 
> unwinder say he does not have a card.END OF SCREENSHOTS.


Quote:


> Originally Posted by *rusirius*
> 
> ALL GPU OVERCLOCKING VAILID RESULTS USE HIS WORK. END OF DISCUSSION.


I don't completely believe what the guy is saying, but...

Your "Unwinder is the be-all and end-all" rhetoric is not correct. There are other overclocking utilities; Sapphire TriXX and ASUS GPU Tweak come to mind.


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> The link is irrelevant to a degree. You need to have two benches run, both with stock core, one with HBM overclocked and one without.
> 
> If there is then a variation in score outside of the margin of error, I'll then believe it to be true.
> 
> A random GPUZ doesn't prove the overclock is actually being applied.


----------



## rusirius

Quote:


> Originally Posted by *Ganf*
> 
> You're comparing final score, not graphics score. Final score is meaningless on AMD cards in 3dmark because of DX11 CPU overhead. The cards get destroyed in the combined test which makes up a majority of the score result, and can be easily manipulated by bouncing clocks around until you find a favorable combination between CPU and GPU.
> 
> Not implying the scores were manipulated, just pointing out that there is more to this than meets the eye.
> 
> http://www.vmodtech.com/main/wp-content/uploads/2015/06/23/amd/fsde.jpg
> 
> Similar machine tested at stock GPU settings, minus 100mhz on the CPU core. If you know anything about Firestrike, AMD cards and overclocking you know that the VRAM speed has nothing to do with the combined test, and that 10% on the card and 2% on the CPU doesn't make for a 20% increase in the combined score. Ever.
> 
> Firestrike has not been patched to work properly with HBM, they haven't even added the card to their list of supported hardware.


Are you trying to define a baseline with hypotheticals that have not been proven yet? Unwinder said he does not have the card. REEL IT IN.


----------



## rusirius

Quote:


> Originally Posted by *rusirius*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ganf*
> 
> You're comparing final score, not graphics score. Final score is meaningless on AMD cards in 3dmark because of DX11 CPU overhead. The cards get destroyed in the combined test which makes up a majority of the score result, and can be easily manipulated by bouncing clocks around until you find a favorable combination between CPU and GPU.
> 
> Not implying the scores were manipulated, just pointing out that there is more to this than meets the eye.
> 
> http://www.vmodtech.com/main/wp-content/uploads/2015/06/23/amd/fsde.jpg
> 
> Similar machine tested at stock GPU settings, minus 100mhz on the CPU core. If you know anything about Firestrike, AMD cards and overclocking you know that the VRAM speed has nothing to do with the combined test, and that 10% on the card and 2% on the CPU doesn't make for a 20% increase in the combined score. Ever.
> 
> Firestrike has not been patched to work properly with HBM, they haven't even added the card to their list of supported hardware.
Click to expand...


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> Firestrike has not been patched to work properly with HBM, they haven't even added the card to their list of supported hardware.


Why would they need to patch Firestrike for HBM? The driver should handle any HBM-related adjustments, assuming there are any. If each game/benchmark has to be patched for HBM to work properly, Fury owners are in a world of hurt.


----------



## Ganf

How many broken quotes can we have in a row?









I don't care what unwinder has or has not done, what he does or does not have in his possession. There are plenty of people out there who can edit the BIOS for these cards, Unwinder and his overclocking utility do not need to be involved in any way shape or form. Calm your pants.
Quote:


> Originally Posted by *Forceman*
> 
> Why would they need to patch Firestrike for HBM? The driver should handle any HBM-related adjustments, assuming there are any. If each game/benchmark has to be patched for HBM to work properly, Fury owners are in a world of hurt.


Every benchmark has to have new architectures patched in. You can't numerically score a card's performance without knowing how it is doing what it does. Hardware architectures that 3dmark doesn't recognize are essentially a black box until they're patched in. It will try to guess, because guessing is better than an error message, but that doesn't mean the score is accurate.

Why do you think benchmarks of unknown GPUs get flagged as invalid? There's no way to guarantee that they're being scored properly.


----------



## blue1512

Hey Ganf. Did you see that scorching 19k graphics score of the OC'd card in my link?


----------



## Ganf

Quote:


> Originally Posted by *blue1512*
> 
> Hey Ganf. Did you see that scorching 19k graphics score of the OC'd card in my link?


Scorching? Sure.

980 Tis score 10% higher than that at stock, so a Fury X OC'd by 10% should merely match a stock 980 Ti. The score is borked.
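A quick sketch of the linear-scaling assumption behind this kind of estimate. The numbers are illustrative, not measured results, and it assumes (optimistically) that the graphics score scales linearly with core clock; real scaling is usually sub-linear, so this gives an upper bound.

```python
# Back-of-the-envelope check for OC score scaling. Assumes, optimistically,
# that the graphics score scales linearly with core clock; real scaling is
# usually sub-linear, so this is an upper bound.

def expected_oc_score(stock_score: float, stock_clock_mhz: float, oc_clock_mhz: float) -> float:
    """Upper-bound estimate of an overclocked graphics score."""
    return stock_score * (oc_clock_mhz / stock_clock_mhz)

# Hypothetical Fury X numbers for illustration:
est = expected_oc_score(16000, 1050, 1155)  # +10% core clock
print(round(est))  # 17600 - so a ~19k result would exceed linear scaling
```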


----------



## rdr09

Quote:


> Originally Posted by *rt123*
> 
> I don't completely believe what the guy is saying, but..
> 
> Your "Unwinder is the be-all and end-all" rhetoric is not correct. There are other overclocking utilities; Sapphire Trixx & Asus GPU Tweak come to mind.


you are making too much sense. you got ignored.

i don't even use afterburner to oc my gpus. lol


----------



## AmericanLoco

Quote:


> Originally Posted by *Ganf*
> 
> How many broken quotes can we have in a row?
> 
> 
> 
> 
> 
> 
> 
> 
> Every benchmark has to have new architectures patched in. You can't numerically score a card's performance without knowing how it is doing what it does. Hardware architectures that 3dmark doesn't recognize are essentially a black box until they're patched in. It will try to guess, because guessing is better than an error message, but that doesn't mean the score is accurate.
> 
> Why do you think benchmarks of unknown GPUs get flagged as invalid? There's no way to guarantee that they're being scored properly.


What? That's complete nonsense. The whole point of high-level APIs like OpenGL and DirectX (up to version 11, at least) was to mask the underlying hardware from the application. The driver and the API handle everything, and the application itself is pretty much entirely unaware of what's going on under the hood.

There will be no "HBM" patch for 3DMark.


----------



## rt123

Quote:


> Originally Posted by *rdr09*
> 
> you are making too much sense. you got ignored.
> 
> i don't even use afterburner to oc my gpus. lol


I know. Lol.









Unwinder is god....


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Ganf*
> 
> Scorching? Sure.
> 
> *980 Tis score 10% higher than that at stock*, so a Fury X OC'd by 10% should merely match a stock 980 Ti. The score is borked.


Lol, wut? I've never seen a stock 980Ti get anywhere near 19k GPU score in FS. 19k GPU score requires at least 1450MHz from a 980Ti. Typically they score 15k or so at stock...


----------



## Ganf

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, wut? I've never seen a stock 980Ti get anywhere near 19k GPU score in FS. 19k GPU score requires at least 1450MHz from a 980Ti. Typically they score 15k or so at stock...


I always forget that 3dmark has a habit of screwing up reporting clocks. Thanks for reminding me to check outside of their webpage.

On that note: that would put my air-cooled 290X a mere 12% behind a Fury X. Still think it's scaling properly?

http://www.3dmark.com/3dm/7548301?
Quote:


> Originally Posted by *AmericanLoco*
> 
> What? That's complete nonsense. The whole point of high-level APIs like OpenGL and DirectX (up to version 11, at least) was to mask the underlying hardware from the application. The driver and the API handle everything, and the application itself is pretty much entirely unaware of what's going on under the hood.
> 
> There will be no "HBM" patch for 3DMark.


3DMark "cleans up" results on a card-by-card basis to prevent fudged numbers. Need we go into how many scandals there have been over the years, with both AMD and Nvidia doping the results on synthetic benchmarks, and how that is one of the reasons people prefer game tests?

Edit: That's it for me for the night. I screwed something up trying to overclock my PCI-e bus in between posts. I'll be fixing that now if you don't mind.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, wut? I've never seen a stock 980Ti get anywhere near 19k GPU score in FS. 19k GPU score requires at least 1450MHz from a 980Ti. Typically they score 15k or so at stock...


Guru3D's did 17000 in their review.



And 20K at 1450.



So if the Fury gets that kind of gain just from overclocking the memory, that is interesting.


----------



## hamzta09

Quote:


> Originally Posted by *Forceman*
> 
> Guru3D's did 17000 in their review.
> 
> 
> 
> And 20K at 1450.
> 
> 
> 
> So if the Fury gets that kind of gain just from overclocking the memory, that is interesting.


Aren't those images of a 980 Ti?

What did the stock Fury get?


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> Aren't those images of a 980 Ti?
> 
> What did the stock Fury get?


Yes, they are stock and overclocked 980 Ti scores, for comparison. That HBM-overclocked Fury was 19.3K graphics; stock was around 16K.
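Those two figures work out to roughly a 20% jump; a one-liner makes the gain explicit (scores as quoted above, rounded):

```python
# Percentage gain between the two graphics scores quoted above
# (~16K stock vs. 19.3K with overclocked HBM).

def pct_gain(before: float, after: float) -> float:
    return (after - before) / before * 100

print(f"{pct_gain(16000, 19300):.1f}%")  # 20.6%
```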


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> Aren't those images of a 980 Ti?
> 
> What did the stock Fury get?


Stock Fury is 1k behind that, and KitGuru's scored similarly. The common theme between the two is that, as far as I can tell, the cards didn't throttle.

http://www.kitguru.net/components/graphic-cards/zardon/nvidia-gtx980-ti-review/8/

People think I'm kidding when I say that these "review" sites screw up Nvidia benchmarks too.


----------



## hamzta09

Quote:


> Originally Posted by *Forceman*
> 
> Yes, they are stock and overclocked 980 Ti scores, for comparison. That HBM-overclocked Fury was 19.3K graphics; stock was around 16K.


Are there any OC'd Fury X + game benches?

I prefer seeing real-world results to 3DMark.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hamzta09*
> 
> Are there any OC'd Fury X + game benches?
> 
> I prefer seeing real-world results to 3DMark.


3DMark is best for showing the full benefit of overclocking, without the CPU playing too much of a part.


----------



## Offender_Mullet

The Linus Tech Tips 'review' was pretty awful, content-wise. Should have just waited for their replacement to come.

I found something worse though.......much worse............


----------



## ZealotKi11er

Quote:


> Originally Posted by *Offender_Mullet*
> 
> The Linus Tech Tips 'review' was pretty awful, content-wise. Should have just waited for their replacement to come.
> 
> I found something worse though.......much worse............


Got to get those Fury X clicks. I was also expecting a review and got nothing.


----------



## Final8ty

Quote:


> Originally Posted by *AMDMatt;28241851*
> Some more benches for ya.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Alien Isolation*
> 
> 5960x @4.9Ghz
> 4xFury X @1125/500Mhz
> 15.15
> 2160P
> 
> 1x Fury X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2x Fury X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3x Fury X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4x Fury X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Dirt Rally*
> 
> 5960x @4.9Ghz
> 1x Fury X @1125/500Mhz
> 15.15
> 2160P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5960x @4.9Ghz
> 2x Fury X @1125/500Mhz
> 15.15
> 2160P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5960x @4.9Ghz
> 3x Fury X @1125/500Mhz
> 15.15
> 2160P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5960x @4.9Ghz
> 4x Fury X @1125/500Mhz
> 15.15
> 2160P
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> What happened to the minimum frame rate in Alien Isolation?


Quote:


> Originally Posted by *AMDMatt;28242652*
> It's just a benchmark anomaly so nothing to worry about at all. If you look at other results in the bench thread you'll see it's fairly common. You'll never see minimum fps that low whilst playing the actual game.


http://forums.overclockers.co.uk/showpost.php?p=28242652&postcount=1601

More to come.


----------



## Final8ty

*Sniper Elite 3*
Quote:


> Originally Posted by *AMDMatt;28241246*
> 5960x @4.9Ghz
> 4x FuryX @1125/500Mhz
> 15.15
> 1080P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 10:13:50
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 302.8
> Minimum FPS: 30.4
> Maximum FPS: 1240.3
> 
> Number Of Frames: 22899
> Average Frame: 3.302ms
> Minimum Frame: 0.806ms
> Maximum Frame: 32.886ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 1920
> Resolution Height: 1080
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


Quote:


> Originally Posted by *AMDMatt;28241269*
> 5960x @4.9Ghz
> 3x FuryX @1125/500Mhz
> 15.15
> 1080P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 10:29:39
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 233.9
> Minimum FPS: 33.9
> Maximum FPS: 1132.9
> 
> Number Of Frames: 17705
> Average Frame: 4.275ms
> Minimum Frame: 0.883ms
> Maximum Frame: 29.528ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 1920
> Resolution Height: 1080
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


Quote:


> Originally Posted by *AMDMatt;28241314*
> 5960x @4.9Ghz
> 2x FuryX @1125/500Mhz
> 15.15
> 1080P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 10:42:46
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 156.2
> Minimum FPS: 23.2
> Maximum FPS: 1199.7
> 
> Number Of Frames: 11834
> Average Frame: 6.401ms
> Minimum Frame: 0.834ms
> Maximum Frame: 43.075ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 1920
> Resolution Height: 1080
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


Quote:


> Originally Posted by *AMDMatt;28241445*
> Thanks Kaap now corrected.
> 
> 5960x @4.9Ghz
> 1x FuryX @1125/500Mhz
> 15.15
> DX
> 1080P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 10:52:56
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 78.4
> Minimum FPS: 28.6
> Maximum FPS: 263.3
> 
> Number Of Frames: 5940
> Average Frame: 12.754ms
> Minimum Frame: 3.798ms
> Maximum Frame: 34.978ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 1920
> Resolution Height: 1080
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF
> 
> 
> 
> 5960x @4.9Ghz
> 1x FuryX @1125/500Mhz
> 15.15
> Mantle
> 1080P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (Mantle)
> ================================================================
> Created: 2015-06-28 at 11:11:35
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 84.4
> Minimum FPS: 23.0
> Maximum FPS: 920.1
> 
> Number Of Frames: 6392
> Average Frame: 11.855ms
> Minimum Frame: 1.087ms
> Maximum Frame: 43.416ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name:
> Feature Level: Mantle
> Dedicated VRAM: 4096MB
> 
> Resolution Width: 1920
> Resolution Height: 1080
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


Quote:


> Originally Posted by *AMDMatt;28241549*
> 5960x @4.9Ghz
> 1x FuryX @1125/500Mhz
> 15.15
> Mantle
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (Mantle)
> ================================================================
> Created: 2015-06-28 at 11:32:51
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 51.9
> Minimum FPS: 16.6
> Maximum FPS: 880.8
> 
> Number Of Frames: 3933
> Average Frame: 19.269ms
> Minimum Frame: 1.135ms
> Maximum Frame: 60.289ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name:
> Feature Level: Mantle
> Dedicated VRAM: 4096MB
> 
> Resolution Width: 2560
> Resolution Height: 1440
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF
> 
> 
> 
> 5960x @4.9Ghz
> 1x FuryX @1125/500Mhz
> 15.15
> DX11
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 11:35:01
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 48.3
> Minimum FPS: 14.2
> Maximum FPS: 59.3
> 
> Number Of Frames: 3666
> Average Frame: 20.684ms
> Minimum Frame: 16.859ms
> Maximum Frame: 70.647ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 2560
> Resolution Height: 1440
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF
> 
> 
> 
> 5960x @4.9Ghz
> 1x FuryX @1125/500Mhz
> 15.15
> DX11
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 11:41:53
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 96.4
> Minimum FPS: 16.6
> Maximum FPS: 1154.3
> 
> Number Of Frames: 7304
> Average Frame: 10.375ms
> Minimum Frame: 0.866ms
> Maximum Frame: 60.357ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 2560
> Resolution Height: 1440
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


Quote:


> Originally Posted by *AMDMatt;28241609*
> 5960x @4.9Ghz
> 3x FuryX @1125/500Mhz
> 15.15
> DX11
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 11:56:41
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 144.2
> Minimum FPS: 16.5
> Maximum FPS: 1125.1
> 
> Number Of Frames: 10923
> Average Frame: 6.933ms
> Minimum Frame: 0.889ms
> Maximum Frame: 60.602ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 2560
> Resolution Height: 1440
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF
> 
> 
> 
> 5960x @4.9Ghz
> 4x FuryX @1125/500Mhz
> 15.15
> DX11
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Sniper Elite 3 Benchmark Report (DirectX)
> ================================================================
> Created: 2015-06-28 at 12:02:14
> Version 1.15a Build Version: 2014.12.09.001
> ================================================================
> 
> Average FPS: 190.6
> Minimum FPS: 19.8
> Maximum FPS: 1154.3
> 
> Number Of Frames: 14418
> Average Frame: 5.246ms
> Minimum Frame: 0.866ms
> Maximum Frame: 50.429ms
> 
> Machine Name: HOME
> Monitors: 1
> Operating System: Windows 8 Professional (build 9200), 64-bit
> System RAM: 16253MB
> CPU: Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Number of Cores: 8
> 
> GPU Name: AMD Radeon (TM) R9 Series
> Feature Level: DX11.0
> Dedicated VRAM: 3072MB
> 
> Resolution Width: 2560
> Resolution Height: 1440
> Texture Detail: ULTRA
> Shadows Detail: ULTRA
> Draw Distance: ULTRA
> Anti-aliasing: HIGH
> Supersampling: 4.0x
> Anisotropic Level: 16
> Obscurance Fields: ON
> Tessellation: ON
> Ambient Occlusion: ON
> Motion Blur: ON
> Vertical Sync: OFF
> Reduce Mouse Lag: OFF
> Stereo 3D: OFF


http://forums.overclockers.co.uk/showthread.php?t=18627619&page=10


----------



## Final8ty

*Dirt 3*
Quote:


> Originally Posted by *AMDMatt;28242272*
> 5960x @ 4.9Ghz
> 4x Fury X @1050/500Mhz
> 15.15
> 
> 2160P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1440P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?t=18651443&page=5


----------



## Final8ty

*Dirt Showdown*
Quote:


> Originally Posted by *AMDMatt;28242197*
> 5960X @4.9Ghz
> 4x FuryX @1050/500Mhz
> 15.15
> 2160P
> 
> 1x Fury X @1115/500Mhz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2x Fury X @1050/500Mhz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4x Fury X @1050/500Mhz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?t=18589127&page=12


----------



## rt123

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 3DMark is best for showing the full benefit of overclocking, without the CPU playing too much of a part.


It is flawed if you compare AMD vs Nvidia using 3DMark though.
If measuring the performance increase vs stock on the same brand card, then it might be okay.
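One way to make that intra-brand comparison concrete is to compare relative scaling factors rather than absolute cross-vendor scores. The figures below are the rough ones quoted earlier in this thread (16K to 19.3K for the Fury X, 17K to 20K for the 980 Ti), used purely for illustration:

```python
# Compare overclock scaling within each card rather than absolute
# cross-vendor scores. Figures are the rough ones quoted in this thread.

def relative_gain(stock_score: float, oc_score: float) -> float:
    return oc_score / stock_score

fury_x = relative_gain(16000, 19300)
gtx980ti = relative_gain(17000, 20000)
print(f"Fury X: {fury_x:.2f}x, 980 Ti: {gtx980ti:.2f}x")  # Fury X: 1.21x, 980 Ti: 1.18x
```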


----------



## BoredErica

Did anything interesting actually get said in the past 300 posts?


----------



## manifest3r

Quote:


> Originally Posted by *Darkwizzie*
> 
> Did anything interesting actually get said in the past 300 posts?


Probably not, just the productive "AMD is so gud" "No nVidia is bettur" posts...


----------



## error-id10t

"It can be OC'd" - wasn't that the news? I was away over the weekend and only looked at the last few pages, but that seems to be what happened?


----------



## Standards

Quote:


> Originally Posted by *Darkwizzie*
> 
> Did anything interesting actually get said in the past 300 posts?


Post purchase rationalizations and brand loyalty.


----------



## Kane2207

Quote:


> Originally Posted by *Darkwizzie*
> 
> Did anything interesting actually get said in the past 300 posts?


The level of damage control has been quite interesting.


----------



## toncij

Quote:


> Originally Posted by *Final8ty*
> 
> http://forums.overclockers.co.uk/showpost.php?p=28242652&postcount=1601
> 
> More to come.


No valid scores... hmm. Why would anyone not validate a score, and use an older version of 3DMark? Also, the test date is missing the hour and minutes.

This is what it should look like: 

And this is what it actually looks like:
http://forums.overclockers.co.uk/showpost.php?p=28244479&postcount=1683

There is also a graph missing on the right... and the "combined score" does not match the total.


----------



## harney

Quote:


> Originally Posted by *toncij*
> 
> No valid scores... hmm. Why would anyone not validate a score, and use an older version of 3DMark? Also, the test date is missing the hour and minutes.
> 
> This is what it should look like:
> 
> And this is what it actually looks like:
> http://forums.overclockers.co.uk/showpost.php?p=28244479&postcount=1683
> 
> There is also a graph missing on the right... and the "combined score" does not match the total.


The chap on OCUK has a valid point: wasn't the Fury supposed to be 640GB/s and then changed? So it could be an issue with some of the HBM, hence the lock... mmmm, interesting.


----------



## toncij

Quote:


> Originally Posted by *harney*
> 
> The chap on OCUK has a valid point: wasn't the Fury supposed to be 640GB/s and then changed? So it could be an issue with some of the HBM, hence the lock... mmmm, interesting.


Those were rumors... I wouldn't take them for granted. People make things up, and when they do, they go for "logic" - the logic was:
http://wccftech.com/amd-fiji-r9-390x-specs-leak/

They expected 1.25GHz, etc. - 128GB*4 for the 4 stacks...

Reality is a bit different.


----------



## XenoRad

I've traditionally been an nVidia customer, but the last two cards I got were from AMD - partly because I was curious about performance on the Red Team, and partly because they were a better deal at the time I was shopping. Currently I have a 290X that I'm very pleased with, though performance in certain games I play, like GTA 5, could be better.

With this launch I think both AMD and nVidia have compelling products at the high end. On the one hand, nVidia has a solid performer in the 980 Ti with 6 GB of VRAM; on the other, the Fury X is an interesting design that holds its own up to and especially at 4K (regardless of having "only" 4 GB of VRAM). It's a pity that the Fury doesn't have an HDMI 2.0 port or any DVI port. I can't say I'll need either for the foreseeable future, but it does look like an omission that would have been easy to avoid.

Otherwise, AMD could have done more with the Fury, but probably not in this time frame. I suspect the Fury X2 (if that will be its name) will be more in line with our expectations, with more RAM and more connectivity.

If I were to buy a new GPU in the immediate future I'd have a hard time deciding between the 980 TI and the Fury X:

- I like the fact that the Fury X is water cooled and a smaller card but I'm not sure where I'd mount the cooler and radiator in my case;
- I like that the 980 TI has 6 GB but in practice this currently doesn't give it any advantage over the Fury's 4 GB even at 4K;
- Both the 980 TI and Fury X trade blows in several games that I play and I'm not sure how the performance will change with yet to be released games and driver updates;
- I'll most likely also change my monitor to a 1440p one and it will come with either FreeSync or G-Sync so this will be something extra to consider when choosing the GPU brand;
- The Fury X doesn't support the full DX 12.1 specification but whether that will be relevant in the next 2-3 years is still unknown;
- I've read about some concerns regarding the 980 TI's temperature around the VRMs and RAM while overclocked and under full load;
- The 980 TI is slightly more expensive where I live.

All in all I think it's a tough choice if you don't have a brand preference.


----------



## Partogi

@Xeno

But 980 ti has far better OC potential. Do you know that?


----------



## Anateus

Was there any specific reason for not adding the HDMI 2.0?


----------



## Standards

Quote:


> Originally Posted by *Partogi*
> 
> @Xeno
> 
> But 980 ti has far better OC potential. Do you know that?


That's kind of a misnomer. The 980 Ti OCs well for sure, but no one has voltage control over the Fury X yet. I doubt it OCs incredibly well, but it's still speculation at this point.


----------



## Themisseble

Quote:


> Originally Posted by *Standards*
> 
> That's kind of a misnomer. The 980 Ti OCs well for sure, but no one has voltage control over the Fury X yet. I doubt it OCs incredibly well, but it's still speculation at this point.


Here is 1080P:
GTX 980 Ti 1400MHz/7300MHz vs. R9 Fury X 1200MHz/500MHz (1000)
Not bad at all for the Fury X.

The R9 Fury X has a much better DX12 API overhead score. The Fury X also has much more compute horsepower, and the card is designed for VR.
Show us some VR and DX12 games, then we can decide. I don't want to judge, but if we look at the past, AMD is much better than NVIDIA...

Basically, the GCN core was incredible and still is. A 4-year-old core made for low-level APIs is still fighting the new Maxwell core. Where is AMD's new core? GCN 2.0? Maybe they're preparing for 14nm?...


----------



## harney

Quote:


> Originally Posted by *Themisseble*
> 
> Here is 1080P:
> GTX 980 Ti 1400MHz/7300MHz vs. R9 Fury X 1200MHz/500MHz (1000)
> Not bad at all for the Fury X.


Where?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Themisseble*
> 
> Here is 1080P:
> GTX 980 Ti 1400MHz/7300MHz vs. R9 Fury X 1200MHz/500MHz (1000)
> Not bad at all for the Fury X.


eee?


----------



## Ganf

Quote:


> Originally Posted by *Partogi*
> 
> @Xeno
> 
> But 980 ti has far better OC potential. Do you know that?




Someone seems to have forgotten literally every other difference between overclocking on AMD and Nvidia. Enjoy the actual performance numbers when the voltage gets unlocked.


----------



## Themisseble

Quote:


> Originally Posted by *harney*
> 
> Where?


https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url

Overclocking
GCN also OCs well; hitting 1300-1400MHz on a 7850 (stock was 860MHz) was easy. But the problem is that you really only OC for benchmarks; you won't use it for normal gaming, especially with a $650+ card.


----------



## Ganf

Quote:


> Originally Posted by *Themisseble*
> 
> https://translate.google.com/translate?sl=es&tl=en&js=y&prev=_t&hl=es&ie=UTF-8&u=http%3A%2F%2Fwww.hispazone.com%2FReview%2F1077%2FAMD-Radeon-R9-Fury-X-Series.html&edit-text=&act=url


They say that, but then they benchmarked the overclocking at 1156. I don't get that site.


----------



## blue1512

Is that "full 12.1 feature support" a real thing for nVidia? I just saw this


So, MS hasn't officially accepted nVidia's proposal on the 12_1 features, right?


----------



## toncij

Quote:


> Originally Posted by *Themisseble*
> 
> Here is 1080P:
> GTX 980 Ti 1400MHz/7300MHz vs. R9 Fury X 1200MHz/500MHz (1000)
> Not bad at all for the Fury X.
> 
> The R9 Fury X has a much better DX12 API overhead score. The Fury X also has much more compute horsepower, and the card is designed for VR.
> Show us some VR and DX12 games, then we can decide. I don't want to judge, but if we look at the past, AMD is much better than NVIDIA...
> 
> Basically, the GCN core was incredible and still is. A 4-year-old core made for low-level APIs is still fighting the new Maxwell core. Where is AMD's new core? GCN 2.0? Maybe they're preparing for 14nm?...


FuryX at 1200? Do you have that validated anywhere at all?

And - once again - NO - the overhead score has absolutely nothing to do with a direct performance comparison.

It just means that, where the overhead was a limiting factor, AMD will stop sucking, not get magically faster everywhere.


----------



## Olivon

Quote:


> Originally Posted by *toncij*
> 
> This looks nothing short of a malicious or really incompetent testing result. You can't make the card that hot by mistake....


Damien Triolet is one of the most serious GPU reviewers in the world (one of the best GPU reviewers in the business - *link*):
He didn't use Furmark, but 45 minutes at full load in 3DMark 11 scene 1.
The test was made in a closed Cooler Master RC-690 II Advanced case.
And once again, the backplate serves only an aesthetic purpose. Other reviewers maybe need to bring as much seriousness to their work as Damien does to his reviews.
His reviews are really high level, and few GPU reviewers can claim to reach such a standard.


----------



## friend'scatdied

I just don't think they were able to hit the proper clock targets with this card (see: Bulldozer). Stock clocks needed to be 1200-1250MHz to be competitive.

Why did AMD force 0x AF in the "leaked" Fury X vs. 980 Ti 4K charts? How did that confer a benefit to the Fury X, and does it hold up today? From what I recall, the performance hit from 16x AF should be minimal nowadays...


----------



## toncij

Quote:


> Originally Posted by *Olivon*
> 
> Damien Triolet is one of the most serious GPU reviewers in the world (one of the best in the business - *link*):
> He didn't use Furmark, but 45 minutes of full load on 3DMark 11 scene 1.
> The test was done in a closed Cooler Master RC-690 II Advanced case.
> And once again, the backplate serves only an aesthetic purpose. Other reviewers could perhaps bring as much seriousness to their work as Damien does.
> His reviews are really high level, and few GPU reviewers can claim to reach that standard.


There isn't much science in measuring heat, to be honest. How come others don't measure such insane temperatures?


----------



## Serandur

Quote:


> Originally Posted by *Ganf*
> 
> 
> 
> Someone seems to have forgotten literally every other difference between overclocking on AMD and Nvidia. Enjoy the actual performance numbers when the voltage gets unlocked.


What, exactly, is Fiji's stock load voltage? Maxwell isn't a great overclocker because of voltage control; my 980 Ti does over 1400 MHz (though haven't yet pushed to see how far) at its stock ~1.17v. At least nearly 20% over stock boost speeds at stock voltage.


----------



## Ganf

Quote:


> Originally Posted by *Serandur*
> 
> What, exactly, is Fiji's stock load voltage? Maxwell isn't a great overclocker because of voltage control; my 980 Ti does over 1400 MHz (though haven't yet pushed to see how far) at its stock ~1.17v. At least nearly 20% over stock boost speeds at stock voltage.


Stock voltage is pointless. Voltage overhead, and how much overvolting increases performance, are the determining factors. You've got people who've been using their 290x's 24/7 at 1.35v, myself included. My card is on air. How does that correlate to Nvidia?

Anyone else got any questions about the difference between overclocking AMD and Nvidia?


----------



## friend'scatdied

Quote:


> Originally Posted by *Serandur*
> 
> What, exactly, is Fiji's stock load voltage? Maxwell isn't a great overclocker because of voltage control; my 980 Ti does over 1400 MHz (though haven't yet pushed to see how far) at its stock ~1.17v. At least nearly 20% over stock boost speeds at stock voltage.


Indeed; likewise.

If voltage was truly a barrier AMD had the opportunity to push up the voltage bins to get the Fury X to even marginally higher clock speeds out-of-box.

Again, reminds me of bulldozer. Missed clock targets possibly consequent of overoptimism about the silicon or the power envelope.


----------



## friend'scatdied

I don't think low-volume non-enterprise video cards drive significant movement in share prices.


----------



## Serandur

Quote:


> Originally Posted by *Ganf*
> 
> Stock voltage is pointless. Voltage overhead, and how much overvolting increases performance, are the determining factors. You've got people who've been using their 290x's 24/7 at 1.35v, myself included. My card is on air. *How does that correlate to Nvidia?*
> 
> Anyone else got any questions about the difference between overclocking AMD and Nvidia?


1.35v is impossible on the stock (maybe even a modded) BIOS for Maxwell, power-hungry as heck on either brand, and gives inconsistent final clock speeds on 290Xs according to a brief Google search.

Fiji may react differently, but Hawaii with those volts consumes quite a bit more power than even GM200 and still doesn't overclock that impressively versus it (from the respective stock capabilities of both for reference obviously).

If DX11 driver CPU overhead is really the reason for Fiji's lower 1080p/1440p performance as well, overclocking won't do much to help.


----------



## rt123

Quote:


> Originally Posted by *friend'scatdied*
> 
> Indeed; likewise.
> 
> If voltage was truly a barrier AMD had the opportunity to push up the voltage bins to get the Fury X to even marginally higher clock speeds out-of-box.
> 
> Again, reminds me of bulldozer. Missed clock targets possibly consequent of overoptimism about the silicon or the power envelope.


Except if Bulldozer was able to get within 10% of the competing Intel CPU, it wouldn't have been half as big as the failure it was.

Stop trying to label the FuryX as a Bulldozer like fail, it isn't.


----------



## friend'scatdied

Quote:


> Originally Posted by *rt123*
> 
> Except if Bulldozer was able to get within 10% of the competing Intel CPU, it wouldn't have been half as big as the failure it was.
> 
> Stop trying to label the FuryX as a Bulldozer like fail, it isn't.


Where did I imply that Fury was a "Bulldozer like fail"?

We KNOW that Bulldozer was supposed to launch with higher clock speeds. We do not know this for Fury X, but I'm merely guessing that 1050MHz OOB wasn't AMD's plan.


----------



## Ganf

Quote:


> Originally Posted by *Serandur*
> 
> Impossible on the stock (maybe even modded) BIOS for Maxwell, power hungry as heck on either brand, has inconsistent results on final clock speeds for 290Xs according to a brief Google search.
> 
> Fiji may react differently, but Hawaii with those volts consumes quite a bit more power than even GM200 and still doesn't overclock that impressively versus it (from the respective stock capabilities of both for reference obviously).
> 
> If DX11 driver CPU overhead is really the reason for Fiji's lower 1080p/1440p performance as well, overclocking won't do much to help.


You get more parity in AMD overclocking though. Nvidia will often see performance increases 5-12% below what the overclock increase was, AMD can get close to a 1 to 1 parity pretty easily. Don't ask me why, I'm not an expert in the differences between GCN and Maxwell architectures, I just know that AMD owners who overclock 15% tend to get the full 15% unless they've got a bottleneck somewhere else. Nvidia owners who overclock 20-25% will likewise see a 15% increase in performance.
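The scaling figures claimed above can be sketched with some quick arithmetic. This is illustrative only, using the post's rough numbers (a hypothesis, not measured data): GCN is said to return roughly the full clock gain, while Maxwell returns somewhat less than the clock gain.

```python
# Toy model of overclock-to-performance scaling using the post's claimed
# figures (hypothetical, not benchmarks): GCN ~1:1, Maxwell noticeably less.

def effective_gain(oc_percent: float, scaling: float) -> float:
    """Approximate performance gain (%) for a clock increase and scaling factor."""
    return oc_percent * scaling

amd_gain = effective_gain(15, 1.0)     # 15% OC at ~1:1 scaling -> ~15% faster
nvidia_gain = effective_gain(25, 0.6)  # 25% OC at ~0.6 scaling -> ~15% faster
```

On those assumed numbers, a 25% Maxwell overclock and a 15% GCN overclock land in the same place, which is the "parity" being described.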

That difference doesn't mean that lucky winners of the silicon lottery will "beat" Maxwell, it just means that Fiji has the ability to "match" it.

I don't get the big deal about power consumption, personally. Then again I smile whenever I think about the fact that my 290x pulls 450w sooo.... That's me.









I won't comment on DX11 performance until this time next month when it gets updated with the "DX12" features in 11.3. We have no idea who is going to get what out of that little surprise. Nvidia may come out of it like a champ with huge boosts to performance.


----------



## SpeedyVT

Quote:


> Originally Posted by *Ganf*
> 
> You get more parity in AMD overclocking though. Nvidia will often see performance increases 5-12% below what the overclock increase was, AMD can get close to a 1 to 1 parity pretty easily. Don't ask me why, I'm not an expert in the differences between GCN and Maxwell architectures, I just know that AMD owners who overclock 15% tend to get the full 15% unless they've got a bottleneck somewhere else. Nvidia owners who overclock 20-25% will likewise see a 15% increase in performance.
> 
> That difference doesn't mean that lucky winners of the silicon lottery will "beat" Maxwell, it just means that Fiji has the ability to "match" it.
> 
> I don't get the big deal about power consumption, personally. Then again I smile whenever I think about the fact that my 290x pulls 450w sooo.... That's me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I won't comment on DX11 performance until this time next month when it gets updated with the "DX12" features in 11.3. We have no idea who is going to get what out of that little surprise. Nvidia may come out of it like a champ with huge boosts to performance.


Overclocking typically scales with the MHz gained. Perhaps Nvidia's arch has reached its pinnacle clock rate. Past that pinnacle you lose efficiency, which reduces overall performance per clock.


----------



## rt123

Quote:


> Originally Posted by *friend'scatdied*
> 
> Where did I imply that Fury was a "Bulldozer like fail"?
> 
> We KNOW that Bulldozer was supposed to launch with higher clock speeds. We do not know this for Fury X, but I'm merely guessing that 1050MHz OOB wasn't AMD's plan.


When you said it "reminds you of Bulldozer", that implies you think it's like Bulldozer. Maybe it's different in your mind, but the first thing a lot of people think when they hear about AMD's Bulldozer is that it was a devastating failure. Alright, maybe you didn't mean it that way.

Also, the problem with Bulldozer was that it didn't have good IPC, which AMD intended to make up for with clock speed. The Fury X doesn't have bad performance at stock clocks; it's just outclocked by the competition. If you put the 980 Ti or even the Titan X at the same clock speed as the Fury X, the Fury X would probably win, or the difference would be very small.


----------



## Ganf

Quote:


> Originally Posted by *iLeakStuff*
> 
> You dont understand. Nvidia have always had high stock value, what you see is a lot of ups and downs, natural fluctuations. That doesnt show anything.
> 
> The AMD graph however clearly shows how the market reacted to each step of the launch of the card.
> Open your eyes and stop defending AMD. Maybe you are then able to see it.


Really? Because I'm pretty sure they also tanked 10% after the Titan X launched, and that was for entirely separate reasons. Nvidia doesn't have much else going on, so market speculators are playing silly buggers with the price. Their stock price is largely determined by their enterprise products and things like Tegra being integrated into the auto industry, etc.

So let me ask you: what are stock prices indicative of? If they're supposed to reflect the success of their consumer GPUs, the Titan X was a flop from day 1.


----------



## blue1512

Quote:


> Originally Posted by *iLeakStuff*
> 
> Open your eyes and stop defending AMD. Maybe you are then able to see it.


This guy is so amusing. How old are you? Do you even know how the stock exchange works? Traders don't care whether the review is good or bad. Once the card was released there was nothing left to anticipate: a great time to dump. Chain reaction; they all sold until the price reached an equilibrium. Even if the Fury X blew the Titan X away, AMD stock would still go down, because that's how the market works.


----------



## Themisseble

AMD, please give us the R9 Nano: 175W TDP, Fiji with 3.5K GCN cores.


----------



## Tivan

Quote:


> Originally Posted by *Serandur*
> 
> If DX11 driver CPU overhead is really the reason for Fiji's lower 1080p/1440p performance as well, overclocking won't do much to help.


Going by the 1080p Firestrike gains found from overclocking the VRAM, but the lack thereof in 4K Firestrike, I'd speculate that VRAM latency plays a major role in frame rates/times at the high frame rates typical of 1080p.

After all, 500MHz means one cycle every 2ns, which introduces more latency (well, latency variation; but peak latency is higher) than is saved by putting the VRAM closer to the die.

Whether AMD can do something about it in the driver, or only overclocking the VRAM can help, we don't know yet.
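The cycle-time arithmetic above is just frequency-to-period conversion; a quick sanity check:

```python
# Back-of-the-envelope check: a 500 MHz memory clock means one clock
# period every 2 ns (period = 1 / frequency, converted to nanoseconds).

def cycle_time_ns(freq_mhz: float) -> float:
    """Clock period in nanoseconds for a frequency given in MHz."""
    return 1000.0 / freq_mhz  # 1 s / (f * 1e6 Hz), expressed in ns

hbm_period = cycle_time_ns(500)  # 2.0 ns per cycle for Fury X's 500 MHz HBM
```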


----------



## friend'scatdied

Quote:


> Originally Posted by *rt123*
> 
> When you said it "reminds you of Bulldozer", that implies you think it's like Bulldozer. Maybe it's different in your mind, but the first thing a lot of people think when they hear about AMD's Bulldozer is that it was a devastating failure. Alright, maybe you didn't mean it that way.
> 
> Also, the problem with Bulldozer was that it didn't have good IPC, which AMD intended to make up for with clock speed. *The Fury X doesn't have bad performance at stock clocks; it's just outclocked by the competition.* If you put the 980 Ti or even the Titan X at the same clock speed as the Fury X, the Fury X would probably win, or the difference would be very small.


There are actually other aspects of this launch that remind me of Bulldozer (e.g. the hype train), but I don't think this is quite the failure the FX-8150 was by any means.

The bold is why I suspect AMD aimed for a higher clock speed target. Whether it's the silicon or power envelope or budget, Fury X doesn't have very impressive out-of-box clock speeds and that causes it to fall a wee bit short of the stiff and relevant competitors. Again, this is simply conjecture but AMD hasn't had very good luck with hitting targets on their fabs.

A 1200-1300MHz part would have meant that AMD could claim at the very least performance parity at 4K.


----------



## GorillaSceptre

Does the Fury X have a native H.265/264 encoder/decoder?


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Does the Fury X have a native H.265/264 encoder/decoder?


Yeah pretty sure it does.


----------



## PontiacGTX

By the way, has anyone tested which DirectX 12 feature level the Fury X supports? Since it's a slightly improved architecture, there might be something new.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah pretty sure it does.


I can't seem to find anything official on it..


----------



## Orivaa

It has the VCE, like the 200 and 300 series does, which supports H.264. Whether or not it supports HEVC, I do not know.


----------



## infranoia

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Does the Fury X have a native H.265/264 encoder/decoder?


http://techreport.com/review/28499/amd-radeon-fury-x-architecture-revealed/2
Quote:


> The UVD block responsible for video decoding has gotten an important upgrade: it can now accelerate the decoding of H.265/HEVC-encoded video in hardware, which is crucial for 4K video in particular. This hardware can also decode Google's VP8 format, but AMD says it's still investigating the software side of that equation. Full acceleration of the latest video formats gives Fiji a leg up on its competition.
> 
> That advantage comes with a downside, though. The display outputs in this GPU are not HDMI 2.0-capable, so Fury cards will not be able to drive 4K TVs at 60Hz over HDMI. AMD has a bit of a handicap here that its competition doesn't share.


I had originally read that as an implied cause-->effect statement for the use of HDMI 1.4a, but really it says no such thing.


----------



## rv8000

Quote:


> Originally Posted by *friend'scatdied*
> 
> There are actually other aspects of this launch that remind me of Bulldozer (e.g. the hype train), *but I don't think this is quite the failure* the FX-8150 was by any means.
> 
> The bold is why I suspect AMD aimed for a higher clock speed target. Whether it's the silicon or power envelope or budget, Fury X doesn't have very impressive out-of-box clock speeds and that causes it to fall a wee bit short of the stiff and relevant competitors. Again, this is simply conjecture but AMD hasn't had very good luck with hitting targets on their fabs.
> 
> A 1200-1300MHz part would have meant that *AMD could claim at the very least performance parity at 4K.*


You're still implying it was a failure.

What? At its released clocks the Fury X is most competitive with the 980 Ti/Titan X @ 4K, and 4K also tends to be where it leads in the few games where it gets ahead of them.


----------



## friend'scatdied

Quote:


> Originally Posted by *rv8000*
> 
> You're still implying it was a failure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What? At its released clocks Fury X is most competitive with the 980ti/Titan X @ 4k, and also tends to be where it leads in the few games it gets ahead of them.


I didn't really suggest that the Fury X falls along the "failure" side of the spectrum (as much as I could say "the GTX 980 Ti isn't quite the failure the FX 5800 Ultra was").

It's "most competitive" against the competition at 4K, but it doesn't consistently match or exceed them.

More aggressive clocks would have allowed it to be decisively superior at 4K. The Fury X could have used this kind of win.


----------



## Kane2207

Quote:


> Originally Posted by *rv8000*
> 
> You're still implying it was a failure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What? At its released clocks Fury X is most competitive with the 980ti/Titan X @ 4k, and also tends to be where it leads in the few games it gets ahead of them.


I don't think it's a failure, it's a decent performer with some interesting features. I think a lot of people around here are just a little underwhelmed following the hype before the release based on the leaked marketing slides.

More the fool us really, marketers are gonna market, neither side can claim a moral high ground there


----------



## Kane2207

Quote:


> Originally Posted by *Ganf*
> 
> No custom PCB's is what killed it for me. I would've bought on the basis of improving drivers putting it on par with the 980 ti over time but No Lightning means no sale.


I know I'll get some flak, but the lack of DVI is my main concern. I'm only running a cheapo Korean monitor but it's served me well. I don't want to have to replace it just because I've purchased a new GPU.

Saying that though, some cards are coming with DP>DVI adapters in the box but I haven't seen any comment yet on how well they perform.

I think I'll wait and let the dust settle, see if there's any price drops from either camp and maybe even hold out for the next GPUs on a new node.

There have been some interesting developments over in the Fury X owners club thread: actual benchmarks have been completed using the HBM overclocking bug and the results look interesting, so I'm not prepared to rule this card out quite yet.


----------



## toncij

Quote:


> Originally Posted by *Serandur*
> 
> What, exactly, is Fiji's stock load voltage? Maxwell isn't a great overclocker because of voltage control; my 980 Ti does over 1400 MHz (though haven't yet pushed to see how far) at its stock ~1.17v. At least nearly 20% over stock boost speeds at stock voltage.


What? People overclock from the stock 1000 to 1300 or 1600 depending on whether it's air or water, and 1800 for LN2. What do you mean, "Maxwell isn't a great overclocker"??


----------



## Seel

Quote:


> Originally Posted by *Kane2207*
> 
> I know I'll get some flack, but the lack of DVI is my main concern. I'm only running a cheapo Korean monitor but it's served me well. I don't want to have to replace it just because I've purchased a new GPU.
> 
> Saying that though, some cards are coming with DP>DVI adapters in the box but I haven't seen any comment yet on how well they perform.
> 
> I think I'll wait and let the dust settle, see if there's any price drops from either camp and maybe even hold out for the next GPUs on a new node.
> 
> There has been some interesting developments over in the Fury X owners club thread, actual benchmarks have been completed using the HBM overclocking bug and the results look interesting, so I'm not prepared to rule this card out quite yet


This is me as well. If it had DVI I would've already bought one. But I really like my 100hz Qnix too much for that.
Regular (passive) DP to DVI adapters only allow for 1080p. Active adapters allow for 1440p, but I doubt you'd be able to overclock the monitor with one, plus they cost around $100.
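The adapter limits come down to pixel-clock bandwidth: single-link DVI (what a passive DP adapter provides) tops out at a 165 MHz pixel clock per the DVI spec. The ~20% blanking overhead used below is an assumed ballpark, not a figure for any particular monitor:

```python
# Rough pixel-clock estimate for a given mode; the 1.2 blanking factor is an
# assumption (~20% overhead, in the ballpark of typical CVT timings).

def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                           blanking: float = 1.2) -> float:
    """Approximate required pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

SINGLE_LINK_DVI_MHZ = 165.0  # single-link DVI pixel clock ceiling

clk_1080p = approx_pixel_clock_mhz(1920, 1080, 60)  # ~149 MHz, fits single-link
clk_1440p = approx_pixel_clock_mhz(2560, 1440, 60)  # ~265 MHz, needs dual-link/active
```

On these assumptions, 1080p60 squeaks under the single-link ceiling while 1440p60 blows past it, which is why 1440p needs a dual-link or active adapter.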


----------



## Ganf

Quote:


> Originally Posted by *Kane2207*
> 
> I know I'll get some flack, but the lack of DVI is my main concern. I'm only running a cheapo Korean monitor but it's served me well. I don't want to have to replace it just because I've purchased a new GPU.
> 
> Saying that though, some cards are coming with DP>DVI adapters in the box but I haven't seen any comment yet on how well they perform.
> 
> I think I'll wait and let the dust settle, see if there's any price drops from either camp and maybe even hold out for the next GPUs on a new node.
> 
> There has been some interesting developments over in the Fury X owners club thread, actual benchmarks have been completed using the HBM overclocking bug and the results look interesting, so I'm not prepared to rule this card out quite yet


I've been watching that too. If voltage gets unlocked and the card does well as a result I can still be swayed. Whatever I buy is going into my custom loop anyways so I'll have to find a use for 2 120mm CLC's if it comes to that.

I wonder just how well that CLC can be modded to fit on other cards; are there any detailed pictures of its contact area?

Edit: Found one.

http://www.legitreviews.com/amd-radeon-r9-fury-x-complete-teardown_166851



Doesn't look like AMD's standard mounting spacing does it?


----------



## CrazyElf

Quote:


> Originally Posted by *Kane2207*
> 
> I know I'll get some flack, but the lack of DVI is my main concern. I'm only running a cheapo Korean monitor but it's served me well. I don't want to have to replace it just because I've purchased a new GPU.
> 
> Saying that though, some cards are coming with DP>DVI adapters in the box but I haven't seen any comment yet on how well they perform.
> 
> I think I'll wait and let the dust settle, see if there's any price drops from either camp and maybe even hold out for the next GPUs on a new node.
> 
> There has been some interesting developments over in the Fury X owners club thread, actual benchmarks have been completed using the HBM overclocking bug and the results look interesting, so I'm not prepared to rule this card out quite yet


I'm in the same boat as you. I really wanted to support AMD this round, but it's looking like this card is a serious disappointment.

There have been active DP to DVI dual link connectors for years.

http://www.newegg.com/Product/Product.aspx?Item=N82E16812607011

Apart from their cost, they never really work 100% smoothly. They certainly are not as good as a native solution in DP.

Between the two:

AMD

Still some room for driver gains (maybe single digits), though GCN is mature; GCN 1.2 may have a little extra headroom. Either way, AMD probably has more room left than Maxwell, which is largely unchanged since the 750 Ti from early 2014. Heck, at 4K it might even reach parity or, in the long run, beat the Titan X where it has enough VRAM.
Crossfire does scale better than SLI (roughly 80% versus 70%).
If there are applications that use very high bandwidth but little VRAM, then AMD will win.

Nvidia:

More power efficient, so more OC headroom. My gut feeling is that the Fury is going to be perhaps 1300 MHz tops with unlocked voltage with luck on a silicon lottery. A good (by silicon lottery standards) 980Ti will do 1500 MHz+, and it's possible the Lightning/Classified/HOF might add to that. Even if Nvidia is 10-15% slower clock for clock (that's because Nvidia cards after a certain point don't scale linearly, while AMD cards generally do), the Nvidia card is faster. I suspect that 1700 MHz+ might be possible with a card like the Lightning, even on air.
Extra VRAM might be useful, especially at 4k down the line.
Custom PCBs will be coming and are already out in some areas.

Nvidia is the better option, even if the drivers don't have much room for improvement (due to Maxwell being mature). My big worry is if Nvidia pulls off the same "won't optimize" business practices on Maxwell once Pascal comes out.
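The clock-for-clock trade-off above can be sketched numerically, using the post's own hypothetical figures (not benchmark results): even at a 10-15% per-clock deficit, a higher achievable clock can come out ahead.

```python
# Toy relative-performance model for the clock-for-clock argument above.
# All numbers are the post's speculation (silicon-lottery guesses), not data.

def relative_perf(clock_mhz: float, per_clock: float = 1.0) -> float:
    """Relative performance as clock times per-clock efficiency."""
    return clock_mhz * per_clock

fury_oc = relative_perf(1300, 1.0)    # optimistic unlocked-voltage Fury X
ti_oc = relative_perf(1500, 0.875)    # 980 Ti OC at 12.5% lower per-clock
# ti_oc (1312.5) edges out fury_oc (1300.0) despite the per-clock deficit
```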

Edit:
I think that after the initial sales, there will be pressure on AMD to lower the price. At under $600 and perhaps $550 USD, I think the merits of each card need to be considered more closely.

You could argue that neither card would age well and that if there are 16nm GPUs late next year, it will be a big improvement.

Quote:


> Originally Posted by *Ganf*
> 
> I've been watching that too. If voltage gets unlocked and the card does well as a result I can still be swayed. Whatever I buy is going into my custom loop anyways so I'll have to find a use for 2 120mm CLC's if it comes to that.
> 
> I wonder just how well that CLC can be modded to fit on other cards, are there any detailed pictures of it's contact area?
> 
> Edit: Found one.
> 
> http://www.legitreviews.com/amd-radeon-r9-fury-x-complete-teardown_166851
> 
> 
> 
> Doesn't look like AMD's standard mounting spacing does it?


It looks non-standard; probably the large die and the HBM modules forced that. It could make these AIOs hard to adapt to other cards.

A better solution might be to wait and see whether the Fury Pro does almost as well (which it should, if the card is ROP-bottlenecked) and enjoy the cost savings. It could be as little as $400 USD (290-level pricing, given the expected Fury X price cuts down to 290X levels), partly because you're not paying for a CLC.


----------



## Serandur

Quote:


> Originally Posted by *toncij*
> 
> What? People overclock from basic 1000 to 1300 or 1600 depending if it is air or water and on 1800 for LN2. How do you mean "Maxwell isn't a great overclocker"??


I meant it is a great overclocker, but not because of voltage control (as in, if Fiji needs more voltage to go past 10%, it already doesn't seem to overclock nearly as well as Maxwell, which gets far even on stock voltage). I agree with you 100% that the architecture is a fantastic overclocker, and it does it with surprisingly little voltage and extra heat production too. That's what I meant by the rest of the sentence, "because of voltage control".

"Maxwell" isn't just an architecture name, it's a description of its capabilities.


----------



## rv8000

Quote:


> Originally Posted by *friend'scatdied*
> 
> I didn't really suggest that the Fury X falls along the "failure" side of the spectrum (as much as I could say "the GTX 980 Ti isn't quite the failure the FX 5800 Ultra was").
> 
> It's "most competitive" against the competition at 4K, but it doesn't consistently match or exceed them.
> 
> More aggressive clocks would have allowed it to be decisively superior at 4K. The Fury X could have used this kind of win.


I don't know what reviews you are reading, but 80% of *4K gaming benchmarks* put the Fury X at 980 Ti level; in the rest it either beats it or loses by a margin of 2-3 fps. If this isn't "performance parity", I don't know what you are on.


----------



## mcg75

Thread has been derailed long enough guys, please get it back on track.

Anything that has nothing to do with Fury-X reviews is off topic. That includes stock prices and comparisons of old cards which is being done to rile up members from both camps.

Thread will reopen shortly.

Anybody who continues to derail will be removed from the thread.

Thanks.


----------



## mcg75

Thread re-opened.


----------



## iSlayer

Quote:


> Originally Posted by *rv8000*
> 
> I don't know what reviews you are reading, but 80% of *4K gaming benchmarks* put the Fury X at 980 Ti level; in the rest it either beats it or loses by a margin of 2-3 fps. If this isn't "performance parity", I don't know what you are on.


While the Fury X does catch up to the 980 Ti at 4K, that isn't much consolation when it still performs worse. The 4GB of VRAM isn't too enticing, and with aftermarket 980 Tis inbound, the gap between the Fury X and 980 Ti will only grow.


----------



## rv8000

Quote:


> Originally Posted by *iSlayer*
> 
> While the Fury X does catch up to the 980 Ti at 4k that isn't much consolation when it still performs worse. The 4GBs of VRAM isn't too enticing and with aftermarket 980 Tis inbound, the gap between the Fury X and 980 Ti will only grow.


I was simply pointing out that his statement was wrong. AIB 980 Tis are already available/reviewed, albeit in limited supply at the moment. The 980 Ti is faster the majority of the time, but saying the Fury X doesn't match it (sometimes better, sometimes worse) @ 4K is flat out wrong.


----------



## friend'scatdied

Quote:


> Originally Posted by *iSlayer*
> 
> While the Fury X does catch up to the 980 Ti at 4k that isn't much consolation when it still performs worse.


Exactly: it still performs slightly worse _overall_ even at 4K. 3% slower on average is still 3% slower (see: Titan X still technically being faster than the 980 Ti).

Even a measly 50MHz increase to the base core speed for 1100MHz out-of-the-box might have helped when AMD was touting this card as a 4K performer.


----------



## Sashimi

Here's what I see so far

Below 4k: 980 Ti > Fury X
4k: 980 Ti = Fury X
4k+: 980 Ti = Fury X (speculating the Fury X would be the more powerful renderer, but would then be dragged back down to 980 Ti level by VRAM limits.)

The closed-loop water cooling also has its own pros and cons compared to the 980 Ti's traditional air cooling, so I fail to see how it adds value to the card despite costing more to make.

Pros:
Keeps the card cool and quiet.

Cons:
Need space to mount.
Redundant for people with custom loops of their own and can cost them extra cash to buy third party block.

All things considered, I can't see how it's a good purchase at current pricing.


----------



## 2010rig

Quote:


> Originally Posted by *friend'scatdied*
> 
> Exactly: it still performs slightly worse _overall_ even at 4K. 3% slower on average is still 3% slower (see: Titan X still technically being faster than the 980 Ti).
> 
> Even a measly 50MHz increase to the base core speed for 1100MHz out-of-the-box might have helped when AMD was touting this card as a 4K performer.


With 1100 MHz stock, it'd become an overclocker's nightmare








Quote:


> On average, we'd comfortably say 1160MHz was an "average" clock speed.


----------



## Casey Ryback

Quote:


> Originally Posted by *friend'scatdied*
> 
> Exactly: it still performs slightly worse _overall_ even at 4K. 3% slower on average is still 3% slower (see: Titan X still technically being faster than the 980 Ti).
> 
> Even a measly 50MHz increase to the base core speed for 1100MHz out-of-the-box might have helped when AMD was touting this card as a 4K performer.


Oh guys get a grip, people are actually noticing a 3% difference in fps now?

As the person you quoted said, their performance is so close at 4K, and I think it's also pretty good at 1440p.

If you want a 1080p card, they have the 390/390X.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> All things considered, I can't see how it's a good purchase at current pricing.


It's not the greatest purchase at this time, nor is the 980 Ti with the AIO.

It's a niche product and will sell out anyway, as will the EVGA Hybrid.

This is why a majority of potential Fury buyers are waiting for the Fury cards using air coolers.

Pros
Lower cost with no AIO.
Competitive hi res performance
AIB cooler (excluding nano afaik)
Low power draw/small form factor (nano)

Cons

Lower 1080p performance


----------



## hamzta09

Quote:


> Originally Posted by *Casey Ryback*
> 
> Oh guys get a grip, people are actually noticing a 3% difference in fps now?
> 
> As the person you quoted said, their performance is so close at 4K, and also pretty good at 1440p.
> 
> You want a 1080P card then they have the 390/390X.


The Fury X is still worse, especially if you count texture packs/modding.

And another "1080p = 60 fps" guy... yawn


----------



## mtcn77




----------



## friend'scatdied

Quote:


> Originally Posted by *Casey Ryback*
> 
> Oh guys get a grip, people are actually noticing a 3% difference in fps now?
> 
> As the person you quoted said, their performance is so close at 4K, and also pretty good at 1440p.


Well, the key distinction between the 980 Ti vs. Titan X 3% and the Fury X vs. 980 Ti 3% is that the latter comparison is more "trading blows" and not consistent. The actual leads frequently reach 10%+ in those comparisons, which is certainly a perceptible difference depending on the games one plays.

If AMD were able to launch with more aggressive clocks (they weren't), the Fury X might not have lost by those crucial figures in several titles (e.g. GTA V, Project Cars, BF4). Wouldn't be surprised if there were internal memos with expectations of 1200-1300MHz stock for Fiji.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> It's not the greatest purchase at this time, neither is the 980ti with the AIO.
> 
> It's a niche product and will sell out anyway, as will the evga hybrid.
> 
> This is why a majority of potential fury buyers are waiting for the fury cards using air coolers.
> 
> Pros
> Lower cost with no AIO.
> Competitive hi res performance
> AIB cooler (excluding nano afaik)
> Low power draw/small form factor (nano)
> 
> Cons
> 
> Lower 1080p performance


The air-cooled Fury will be clocked lower, I believe, though that might be fine depending on price. We'll have to see when it's officially released.

I do want to point out once again, as some already have, that 1080p @ 120Hz or 144Hz still requires a high-end card, so flagships cannot dismiss "low" resolutions as irrelevant.


----------



## iinversion

Quote:


> Originally Posted by *Casey Ryback*
> 
> It's not the greatest purchase at this time, neither is the 980ti with the AIO.
> 
> It's a niche product and will sell out anyway, as will the evga hybrid.
> 
> This is why a majority of potential fury buyers are waiting for the fury cards using air coolers.
> 
> Pros
> Lower cost with no AIO.
> Competitive hi res performance
> AIB cooler (excluding nano afaik)
> Low power draw/small form factor (nano)
> 
> Cons
> 
> Lower 1080p performance


There will be no aftermarket Fury X designs. This has been stated many times.

Yes, there will be plenty of regular Fury aftermarket designs, but they will naturally perform worse anyway. Based on its shader count relative to the Fury X and the 290X, it will likely sit midway between the two (slightly closer to the Fury X) before overclocking is considered.

You can't compare the regular Fury and the Fury X with its AIO to the 980 Ti and the 980 Ti with an AIO. The Fury and Fury X have different performance, whereas the 980 Ti with and without an AIO do not.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> I do wish to point out once again as some already have, that 1080p @ 120hz or 144hz will require a high end card nonetheless, therefore flagships cannot disregard "low" resolution as irrelevant.


I never said it was irrelevant.

It just happens to be how these products perform, can't do anything about it lol.

Buy a 980/980ti?


----------



## Sashimi

Aired Fury might have a market if they priced it correctly, but that will be another story.

Fury X was priced to trade blows with 980 Ti but it is all things considered out-classed despite the late entrance.


----------



## Casey Ryback

Quote:


> Originally Posted by *iinversion*
> 
> There will be no aftermarket Fury X designs. This has been stated many times.
> 
> Yes, there will be plenty of regular Fury aftermarket designs but they will naturally have less performance anyway? Based on the shader count compared to the Fury X and the 290X, it will likely be a mid way bridge(but slightly closer to the Fury X) between the two before OC'ing is concerned.
> 
> You can't compare the regular Fury and the Fury X with it's AIO to the 980 Ti and the 980 Ti with it's AIO. The Fury and Fury X have different performance whereas the 980 TI w/ and w/o AIO do not.


I never said there would be aftermarket fury designs...............

Why are you









Most users won't care about 10% lower performance if they save $150+

Only people on the internet obsessed with benchmarks will actually care about 5fps.

They haven't taken the crown, and won't. Most people don't care, let's move on.


----------



## MisterFred

I'm still kind of amazed someone now supports PLP (and, I suppose, other mixed-monitor eyefinity). Heck, if yields are small enough, maybe that's the market that's rushing to buy Furies. All those people with 4960x1600 setups.









Wonder if Nvidia will ever add that feature.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> I never said it was irrelevant.
> 
> It just happens to be how these products perform, can't do anything about it lol.
> 
> Buy a 980/980ti?


I was referring to your previous comment:
Quote:


> Originally Posted by *Casey Ryback*
> 
> You want a 1080P card then they have the 390/390X.


Sorry if i didn't make it clear.


----------



## iinversion

Quote:


> Originally Posted by *Casey Ryback*
> 
> I never said there would be aftermarket fury designs...............
> 
> Why are you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Most users won't care about 10% lower performance if they save $150+
> 
> Only people on the internet obsessed with benchmarks will actually care about 5fps.
> 
> They haven't taken the crown, and won't. Most people don't care, let's move on.


Based on the shader counts it will be more than 10%, closer to 20%.

If the Fury X is 45% faster than the 290X, then the regular Fury would be about 27% faster than a 290X, assuming performance scales linearly with shader count before overclocking is considered.
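
That estimate can be reproduced with a quick linear interpolation by shader count (290X = 2816, Fury = 3584, Fury X = 4096 per the public specs). A minimal sketch, assuming performance is linear in shader count between the 290X and Fury X endpoints (a rough assumption that ignores clocks and bandwidth):

```python
# Estimate a card's speedup over a baseline by interpolating on shader count.
# Assumes performance is linear in shader count between the two endpoints.
def interpolate_speedup(shaders, base_shaders, top_shaders, top_speedup):
    return top_speedup * (shaders - base_shaders) / (top_shaders - base_shaders)

# Fury (3584 shaders) between 290X (2816, baseline) and Fury X (4096, +45%):
fury_vs_290x = interpolate_speedup(3584, 2816, 4096, 0.45)
print(f"Estimated Fury speedup over the 290X: {fury_vs_290x:.0%}")  # 27%
```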


----------



## Casey Ryback

Quote:


> Originally Posted by *iinversion*
> 
> Based on the shader counts it will be more than 10%, closer to 20%.
> 
> If the Fury X is 45% faster than the 290X, then the regular Fury would be 27% faster than a 290X, assuming scaling remains the same before OC'ing is concerned.


Well if the price is right it'll be a good card for 1440p I imagine.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> Most users won't care about 10% lower performance if they save $150+
> 
> Only people on the internet obsessed with benchmarks will actually care about 5fps.


But we are these people, and we care. This forum is full of people like us because it was made for people like us to discuss these things. If we all just moved on, there would be no OCN.


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> Well if the price is right it'll be a good card for 1440p I imagine.


Yup pricing will be the key.

Unlike previous gen, power efficiency in this gen is close enough between nVidia and AMD, at least near flagship level.


----------



## friend'scatdied

Quote:


> Originally Posted by *mtcn77*


I'd like to see a thorough review of Fury X crossfire performance.

It's possible that Fury X2 would be a very significant card, keeping AMD as the holder of the technical "fastest single card" title even above a theoretical GTX 990/Titan X2.


----------



## Sashimi

No doubt it will be interesting. How well they will crossfire, OC with better software, driver improvements and how much impact will DX12 have on its performance vs the opposition is still unknown. Maybe it will evolve into a winner in the coming months.


----------



## BoredErica

Crossfire does not work with Skyrim. Crossfire isn't an option.


----------



## Offler

Quote:


> Originally Posted by *mtcn77*


According to the image above and similar reviews...

Fury X
Core config: 4096:256:64

980 Ti
Core config: 2816:176:96

Titan X
Core config: 3072:192:96

The maximum boost clocks for the Nvidia cards aren't listed on Wikipedia, so I would like to see what frequency those GPUs were actually running at.

The bottleneck on the AMD chip is clear to me, but I am quite amazed that 64 ROPs at 1050 MHz give only 5% less performance than the 980 Ti's 96 ROPs at 1000 MHz.
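
That surprise can be made concrete with a back-of-envelope fillrate calculation, using the ROP counts and clocks above. A minimal sketch (peak pixel fillrate = ROPs × core clock; real games rarely track this 1:1, and that gap is exactly what makes the result surprising):

```python
# Peak theoretical pixel fillrate in GPixels/s = ROPs * core clock in GHz.
def peak_fillrate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0

fury_x = peak_fillrate_gpix(64, 1050)     # 67.2 GPix/s
gtx_980ti = peak_fillrate_gpix(96, 1000)  # 96.0 GPix/s
deficit = 1 - fury_x / gtx_980ti
print(f"Fury X fillrate deficit vs. 980 Ti: {deficit:.0%}")  # ~30% on paper, only ~5% in games
```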


----------



## infranoia

Quote:


> Originally Posted by *Darkwizzie*
> 
> Crossfire does not work with Skyrim. Crossfire isn't an option.


That there's a logical fallacy. "A does not equal C, therefore A does not equal anything."

Skyrim is chewed up and spit back out by any modern single GPU. "Crossfire is irrelevant for Skyrim" would be more accurate.

Not to mention that Skyrim isn't the only game in the known universe.

/spoken as one who has started their 1500 hours on Skyrim with 5850's in Crossfire before moving to a 290x.


----------



## BoredErica

Quote:


> Originally Posted by *infranoia*
> 
> That there's a logical fallacy. "A does not equal C, therefore A does not equal anything."
> 
> Skyrim is chewed up and spit back out by any modern single GPU. "Crossfire is irrelevant for Skyrim" would be more accurate.
> 
> Not to mention that Skyrim isn't the only game in the known universe.
> 
> /spoken as one who has started their 1500 hours on Skyrim with 5850's in Crossfire before moving to a 290x.


LOL

Dude.

Crossfire does not work with Skyrim. Crossfire isn't an option... FOR ME.

Do I really have to spell out the obvious? You're trying too hard.


----------



## 1337LutZ

Quote:


> Originally Posted by *Darkwizzie*
> 
> LOL
> 
> Dude.
> 
> Crossfire does not work with Skyrim. Crossfire isn't an option... FOR ME.
> 
> Do I really have to spell out the obvious? You're trying too hard.


Tbh I read your post the same way.


----------



## Tivan

Quote:


> Originally Posted by *Darkwizzie*
> 
> LOL
> 
> Dude.
> 
> Crossfire does not work with Skyrim. Crossfire isn't an option... FOR ME.
> 
> Do I really have to spell out the obvious? You're trying too hard.


I think there's a misunderstanding here.

Infranoia is trying to imply that Skyrim doesn't need crossfire, from his experience. Though you clearly have a different experience, and I can imagine Skyrim can get _very_ GPU hungry with multiple mods.
But yeah I think you're right in this. = D

Nothing against Infranoia though. Just different usage cases!


----------



## Olivon

*CrossFire Fury X Review - Hardware.fr*


Quote:


> Note that with its current drivers, AMD has had the questionable idea of reining in HairWorks by capping the maximum tessellation level allowed in The Witcher 3. We disabled this optimization for this test, although AMD probably has arguments to make regarding CD Projekt's, and especially Nvidia's, technical choices around this graphical effect. We will return to this subject in a news piece in the coming days.
> 
> Still on The Witcher 3, we were able to confirm that the temporal AA was the source of the problems faced by multi-GPU Radeons.


Quote:


> First, as we have already seen several times with dual-GPU setups at 4K, the Radeons generally feel slightly smoother than the GeForces. We assume that the SLI link is reaching its limits there, or that Nvidia's frame-pacing algorithms are not fully functional at very high resolutions.
> 
> However, we did not encounter any real fluidity problems in any game on the GeForce GTX 980 Ti SLI, unlike the Radeon R9 Fury X CFX, which struggled in 3 games.


Quote:


> If none of this scares you off, the performance is indeed there: the Radeon R9 Fury X CFX shows excellent scaling and surpasses the GeForce GTX 980 Ti SLI ... at least in general. Unfortunately, some games are problematic.
> 
> Compared to Nvidia, AMD still takes far too long to optimize its drivers for certain titles. This is admittedly often the case in games that incorporate Nvidia effects, which we suspect are designed partly to put sticks in the competition's wheels. But we are convinced that AMD can do better and faster here.
> 
> Then, with Evolve and especially Dying Light, we observed realistic, playable situations where it seems obvious that 4 GB of memory per GPU is insufficient. The only workaround is to reduce the texture detail level, and it remains to be seen whether AMD can improve the Fury X's behavior with future drivers.


*Translated*


----------



## BoredErica

Quote:



> Originally Posted by *Tivan*
> 
> I think there's a misunderstanding here.
> 
> Infranoia is trying to imply that Skyrim doesn't need crossfire, from his experience. Though you clearly have a different experience, and I can imagine Skyrim can get very GPU hungry with multiple mods.
> But yeah I think you're right in this. = D


There are many ways to play Skyrim without requiring much GPU horsepower. The only implied message of my post that makes any sense at all is that I'm making a personal statement about my own preferences and playstyle and setup.

One of his points is that Skyrim isn't the only game in the world - a point that doesn't even need to be brought up.


----------



## Tivan

Quote:


> Originally Posted by *Darkwizzie*
> 
> There are many ways to play Skyrim without requiring much GPU horsepower. The only implied message of my post that makes any sense at all is that I'm making a personal statement about my own preferences and playstyle and setup.
> 
> One of his points is that Skyrim isn't the only game in the world - a point that doesn't even need to be brought up.


Yeah well, I think infranoia wanted to start a different conversation, and so he did. = D
(Just saying there's no reason to discuss, if you're not disagreeing with each other.)


----------



## Final8ty

Quote:


> Originally Posted by *rv8000*
> 
> I find that hard to believe when guru3D thermal imaging results show 50c at the same point, sure the ambients are different but a 50c delta???? Sounds like they had a malfunctioning pump/fan or something of the sorts. Bet that resulted in a buttload of throttling in their tests. Makes me wonder why Guru3D benchmarks are with +/-5% of the 980ti most of the time and other places aren't.
> 
> I've also noticed there are certain games where the Fury is right with the 980ti and titan, and leagues of ahead of the 290x, and other times where its 30% behind the 980ti and the gap between the 980/290x is so small. I wonder if some of these results are more directly related to the poor overhead in drivers for certain games, makes me a bit more hopeful for newer drivers and w10 in some cases.
> 
> Regardless looking forward to having my card come in where I can test it to my hearts content/


Going by that image: 100°C VRMs, with the coolant, pipes, pump and the entire cooling system at 90-100°C while the GPU sits at 60°C? Impossible. The coolant cannot be hotter than the heat source. Even if the biggest heat source is the VRMs, there is no way they could heat an entire loop with ~500W of dissipation capacity to 90-100°C.

On top of that, you cannot transfer heat into something that is already hotter than the source, so coolant at 90-100°C would be actively heating the GPU; it would not be running anywhere near 60°C, it would be over 130°C.
And they show the PCIe connectors at 90-100°C, while Guru3D had them at 49°C.

Sorry, but their testing is bull.


----------



## Sashimi

Quote:


> Originally Posted by *Tivan*
> 
> Yeah well, I think infranoia wanted to start a different conversation, and so he did. = D
> (Just saying there's no reason to discuss, if you're not disagreeing with each other.)


Lol I read dark's initial post as a half joking comment exaggerating his love for Skyrim.

But yes Skyrim can get extremely GPU heavy especially with ENB and 4k textures and what not it can swallow 4GB VRAM like jelly. SLI is definitely recommended if you go down that route.


----------



## thekasafist

I have the Acer Predator G-Sync, the 1440p IPS panel one. I'll admit it's probably the nicest piece of technology I own for my gaming needs. I couldn't be happier with it, and I'm surprised to find that my GTX 670 SLI runs games at 1440p pretty well, especially on this monitor. I think you won't be disappointed. It's pretty darn pricey, but I still think it was well worth it and I wouldn't think twice about it.


----------



## curlyp

Quote:


> Originally Posted by *thekasafist*
> 
> I have the Acer Predator Gsync the 1440p IPS panel one. I will admit it's probably the nicest piece of technology I have thus far for my gaming needs. I could not be happier with it and I am surprised to find that my GTX 670 SLI runs games at 1440p pretty well especially on this monitor! I think you won't be disappointed. Other than it is pretty darn pricey I still think it was well worth it and I wouldn't double think it at all.


What type of FPS are you experiencing?


----------



## DFroN

Quote:


> Originally Posted by *Olivon*
> 
> *CrossFire Fury X Review - Hardware.fr*
> 
> Quote:
> 
> 
> 
> If none of this scares you off, the performance is indeed there: the Radeon R9 Fury X CFX shows excellent scaling and surpasses the GeForce GTX 980 Ti SLI ... at least in general. Unfortunately, some games are problematic.
> 
> Compared to Nvidia, AMD still takes far too long to optimize its drivers for certain titles. *This is admittedly often the case in games that incorporate Nvidia effects, which we suspect are designed partly to put sticks in the competition's wheels.* But we are convinced that AMD can do better and faster here.
Click to expand...

That settles it.


----------



## fewness

Is this a truly representative scenario for Titan X and 980 Ti SLI users? Do the cards often throttle down to 1000MHz in SLI due to temps?


----------



## infranoia

Quote:


> Originally Posted by *Tivan*
> 
> I think there's a misunderstanding here.
> 
> Infranoia is trying to imply that Skyrim doesn't need crossfire, from his experience. Though you clearly have a different experience, and I can imagine Skyrim can get _very_ GPU hungry with multiple mods.
> But yeah I think you're right in this. = D
> 
> Nothing against Infranoia though. Just different usage cases!
> 
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> There are many ways to play Skyrim without requiring much GPU horsepower. The only implied message of my post that makes any sense at all is that I'm making a personal statement about my own preferences and playstyle and setup.
> 
> One of his points is that Skyrim isn't the only game in the world - a point that doesn't even need to be brought up.
Click to expand...

Not really trying to rock the boat, but c'mon-- you have to admit that post was a bit of a nuke from orbit. Skyrim worked in 5850 Crossfire for me-- the scaling was not ideal, granted, but it was greater than 1x. I had DOF issues with certain ENBs, but it did _work_.

I get your point, it has a raft of issues compared to Crossfire in other games, and I found myself waiting too long for the frame pacing fix for DX9, which sucked.

So yeah. We're not disagreeing; I just called you out on a bit of hyperbole is all. And OCN needs less hyperbole. Crossfire works in Skyrim (at least with 2 GPUs) with a bit of effort.

http://www.nexusmods.com/skyrim/mods/19102/?
http://enbseries.enbdev.com/forum/viewtopic.php?f=23&t=509
http://www.overclock.net/t/1164844/skyrim-crossfire-issues-complete-list-of-known-fixs-here-12-1a-and-11-12-cap2-links-inside/600_100

[EDIT] But yeah, all the effort above basically had me looking at single-GPU equivalents to the 5850 pair, just for this game. Didn't want the hassle. But I actually didn't mind the whole Crossfire experience with other titles, for the most part it worked, and worked well. I don't hear too many complaints about the 295x2, frame pacing is improved considerably and I'm seriously looking forward to dual-Fiji.

And Skyrim on a single Fiji will haul ass, even on a Fury X2. DX9 is ancient and the sooner it's put out to pasture the better.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fewness*
> 
> Is this a truly representative scenario for TitanX and 980Ti SLI users? The cards often throttle due to temps to 1000MHz in SLI?


Base clock is 1000MHz so it's still fine. Just means in SLI you need better cooling.


----------



## friend'scatdied

Interesting comparison might be Fury X Crossfire vs. 980 Ti CLC SLi, with the latter brought to factory 980 Ti specs.

Though the Maxwell boost would still be a small wildcard depending on the silicon.


----------



## 4everAnoob

Any more details available yet for the Fury Nano?


----------



## ZealotKi11er

Quote:


> Originally Posted by *4everAnoob*
> 
> Any more details available yet for the Fury Nano?


Does it really matter? It's the same architecture as Fury @ 175W. It's a GTX 970-980 class card.


----------



## friend'scatdied

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does it really matter? Its same architecture as fury @ 175W. It's a GTX970-980 class card.


It does, though. Its performance and price _could_ pressure both of those cards.


----------



## ZealotKi11er

Quote:


> Originally Posted by *friend'scatdied*
> 
> It does, though. Its performance and price _could_ pressure both of those cards.


Depends. It might be better @ 4K but 1080p it's hard to beat Nvidia.


----------



## iSlayer

Quote:


> Originally Posted by *rv8000*
> 
> Was simply pointing out his statement was wrong. AIB 980ti's are already available/reviewd albeit limited atm. The 980ti is faster the majority of the time, but saying a Fury X does not have the same performance and sometimes better/worse @ 4k is flat out wrong.


It's still worse? It's in the same tier and it's not bad, just worse.
Quote:


> Originally Posted by *friend'scatdied*
> 
> Exactly: it still performs slightly worse _overall_ even at 4K. 3% slower on average is still 3% slower (see: Titan X still technically being faster than the 980 Ti).
> 
> Even a measly 50MHz increase to the base core speed for 1100MHz out-of-the-box might have helped when AMD was touting this card as a 4K performer.


^
Quote:


> Originally Posted by *Sashimi*
> 
> Here's what I see so far
> 
> Below 4k: 980 Ti > Fury X
> 4k: 980 Ti = Fury X
> 4k+: 980 Ti = Fury X (Speculating Fury X will be a more powerful renderer however will then be dragged down by VRAM limits back down to 980 Ti level.)
> 
> The closed loop watercooling system also has its own pros and cons compare to the 980 Ti's traditional air cooling, so I fail to see how it adds value to the card despite it costing more to make.
> 
> Pros:
> Keeps the card cool and quiet.
> 
> Cons:
> Need space to mount.
> Redundant for people with custom loops of their own and can cost them extra cash to buy third party block.
> 
> All things considered, I can't see how it's a good purchase at current pricing.


Eh the performance is fine and while the 980 Ti is the clear victor, for those who buy along brand lines, both have relatively close cards to choose from.
Quote:


> Originally Posted by *Casey Ryback*
> 
> Oh guys get a grip, people are actually noticing a 3% difference in fps now?
> 
> As the person you quoted said, their performance is so close at 4K, and I also think pretty good at 1440p.
> 
> You want a 1080P card then they have the 390/390X.


Well, when I pay $650 for a GPU I expect it to be better than its equally priced competitor. Why would I buy the worse product?

@2010rig I'm sure you'll find some insightful logic in this post.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does it really matter? Its same architecture as fury @ 175W. It's a GTX970-980 class card.


Probably aimed at matching the price point of that 980 Metal, or whatever was floating around, I presume.


----------



## Themisseble

Quote:


> Originally Posted by *Thoth420*
> 
> Probably aimed at matching price point of that 980 Metal or whatever was floating around I presume.


It's Fiji - it should be faster than the R9 390X, which at 1440p is roughly equal to the GTX 980. So the R9 Nano may be faster and more efficient than a GTX 980.

If you look at the Fury X at 220W in gaming against the GTX 980 Ti at 233W...

OC'd GTX 980 Ti vs. OC'd Fury X:
http://www.eurogamer.net/articles/digitalfoundry-2015-radeon-r9-fury-x-review

The Fury X does a pretty good job, but the drivers suck.

Please compare 1080p vs. 2160p - R9 390X vs. R9 Fury X - and GTX 980 Ti vs. GTX 980.

- The R9 390X is mostly about 10% slower, but in FC4 it beats the Fury X.
- The Fury X is slower in FC4 at 1080p than at 1440p.

Very odd benchmarks.


----------



## hamzta09

Quote:


> Originally Posted by *Themisseble*
> 
> If you look at the Fury X at 220W in gaming against the GTX 980 Ti at 233W...


What

Code:

GTX 980 Ti       375W
R9 Fury X        407W
GTX 980 Ti OC    421W
R9 Fury OC       427W


----------



## BoredErica

Quote:


> Originally Posted by *infranoia*
> 
> Not really trying to rock the boat, but c'mon-- you have to admit that post was a bit of a nuke from orbit. Skyrim worked in 5850 Crossfire for me-- the scaling was not ideal, granted, but it was greater than 1x. I had DOF issues with certain ENBs, but it did work.
> 
> I get your point, it has a raft of issues compared to Crossfire in other games, and I found myself waiting too long for the frame pacing fix for DX9, which sucked.
> 
> So yeah. We're not disagreeing; I just called you out on a bit of hyperbole is all. And OCN needs less hyperbole. Crossfire works in Skyrim (at least with 2 GPUs) with a bit of effort.
> 
> http://www.nexusmods.com/skyrim/mods/19102/?
> http://enbseries.enbdev.com/forum/viewtopic.php?f=23&t=509
> http://www.overclock.net/t/1164844/skyrim-crossfire-issues-complete-list-of-known-fixs-here-12-1a-and-11-12-cap2-links-inside/600_100
> 
> [EDIT] But yeah, all the effort above basically had me looking at single-GPU equivalents to the 5850 pair, just for this game. Didn't want the hassle. But I actually didn't mind the whole Crossfire experience with other titles, for the most part it worked, and worked well. I don't hear too many complaints about the 295x2, frame pacing is improved considerably and I'm seriously looking forward to dual-Fiji.
> 
> And Skyrim on a single Fiji will haul ass, even on a Fury X2. DX9 is ancient and the sooner it's put out to pasture the better.


Crossfire is not an option, that doesn't mean any single AMD card is not an option. However, I think 980ti with 6gb of vram is what's going to sway me. I wonder if FO4 will support 144 fps properly and/or have good sli/crossfire support.

I recall Logan saying he couldn't get crossfire to work and he's big on modding the game too.

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Skyrim

I rely on PCper for my GPU info and Techpowerup for a lot of graphs. In that link it looks like the 295x2 is slower than the Fury X on 1440p and while it's faster on 4k, the frame consistency is totally insane.

If it was hyperbole it was not intended to be hyperbole. Maybe fixes will work but even if it does I still don't want to go down that route. One 980ti will cut it and I wouldn't have to deal with whatever issues I might have had with dual GPU solution.


----------



## Thoth420

Quote:


> Originally Posted by *Themisseble*
> 
> It's Fiji - it should be faster than the R9 390X, which at 1440p is roughly equal to the GTX 980. So the R9 Nano may be faster and more efficient than a GTX 980.
> 
> If you look at the Fury X at 220W in gaming against the GTX 980 Ti at 233W...
> 
> OC'd GTX 980 Ti vs. OC'd Fury X:
> http://www.eurogamer.net/articles/digitalfoundry-2015-radeon-r9-fury-x-review
> 
> The Fury X does a pretty good job, but the drivers suck.
> 
> Please compare 1080p vs. 2160p - R9 390X vs. R9 Fury X - and GTX 980 Ti vs. GTX 980.
> 
> - The R9 390X is mostly about 10% slower, but in FC4 it beats the Fury X.
> - The Fury X is slower in FC4 at 1080p than at 1440p.
> 
> Very odd benchmarks.


I'm talking about a rumored card that doesn't currently exist, not a 980 or a 980 Ti. When the Titan X and 980 Ti rumors/"leaks" came out, there was talk of a 980 Metal.


----------



## BoredErica

Quote:


> Originally Posted by *Thoth420*
> 
> I'm talking about a card rumored that doesn't currently exist not a 980 or a 980Ti. When Titan X and 980Ti rumors "leaks" came out there was talk of a 980 Metal.


Knucklehead suggested that it was merely a bad translation - the Ti supposedly written as 'Titanium', then poorly translated into 'metal'.


----------



## Blameless

Quote:


> Originally Posted by *Darkwizzie*
> 
> Knucklehead suggested that it was merely a bad translation - the Ti supposedly written as 'Titanium', then poorly translated into 'metal'.


This is most likely.


----------



## Kane2207

Quote:


> Originally Posted by *Darkwizzie*
> 
> Knucklehead suggested that it was merely a bad translation - the Ti supposedly written as 'Titanium', then poorly translated into 'metal'.


It was a poor translation, it's actually a reference to their next GPU since the pricing will be 'mental' when Nvidia realise they're uncontested at the top end of the market


----------



## Sashimi

Quote:


> Originally Posted by *Kane2207*
> 
> It was a poor translation, it's actually a reference to their next GPU since the pricing will be 'mental' when Nvidia realise they're uncontested at the top end of the market


Sorry couldn't resist being a smart-ass, but "mental" to "metal" isn't a mistranslation, it's a straight misread.









I like the way you think though.


----------



## Kane2207

Quote:


> Originally Posted by *Sashimi*
> 
> Sorry couldn't resist being a smart-ass, but "mental" to "metal" isn't a mistranslation, it's a straight misread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I like the way you think though.


I bet you get invited to a lot of parties


----------



## Sashimi

Quote:


> Originally Posted by *Kane2207*
> 
> I bet you get invited to a lot of parties


I wish. I'm known to cause more awkward silences in conversations than anyone else so all I can do is hang around these forums nowadays.
















Edit: Sorry I should stop before this thread gets temporarily closed again.


----------



## rdr09

Quote:


> Originally Posted by *Darkwizzie*
> 
> Crossfire is not an option, that doesn't mean any single AMD card is not an option. However, I think 980ti with 6gb of vram is what's going to sway me. I wonder if FO4 will support 144 fps properly and/or have good sli/crossfire support.
> 
> I recall Logan saying he couldn't get crossfire to work and he's big on modding the game too.
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested/Skyrim
> 
> I rely on PCper for my GPU info and Techpowerup for a lot of graphs. In that link it looks like the 295x2 is slower than the Fury X on 1440p and while it's faster on 4k, the frame consistency is totally insane.
> 
> If it was hyperbole it was not intended to be hyperbole. Maybe fixes will work but even if it does I still don't want to go down that route. One 980ti will cut it and I wouldn't have to deal with whatever issues I might have had with dual GPU solution.


I play Skyrim in 4K with 2 290s just fine. Not sure if it's using both GPUs; never bothered to check 'cause it's a beauty.


----------



## lol.69

Eteknix review is out in crossfire:
http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/

The results are a bit different from other reviews... near 100% scaling.
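
For anyone wondering what "near 100% scaling" means numerically: scaling is the frame-rate gain from the second GPU relative to a perfect doubling. A minimal sketch with hypothetical FPS figures (not taken from the review):

```python
# Dual-GPU scaling: percent of a perfect 2x frame-rate doubling achieved.
def scaling_percent(single_fps, dual_fps):
    return (dual_fps / single_fps - 1) * 100.0

# Hypothetical figures for illustration only:
print(f"{scaling_percent(40.0, 78.0):.0f}% scaling")  # 95% scaling
```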


----------



## Ganf

Quote:


> Originally Posted by *lol.69*
> 
> Eteknix review is out in crossfire:
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/
> 
> The results are a bit different from other reviews... near 100% scaling.


Now if they've fixed the tearing on Crossfire Eyefinity setups, they'll have a win.

AMD did say something at GDC about huge improvements to Crossfire, and boasted they would have near-perfect scaling. I have a feeling they made it a priority once Nvidia started pushing NVLink, which put enterprise techs' panties in a twist because NVLink is proprietary, and the proposed performance increase from Maxwell to Pascal was nothing like what they had been led to believe before the GTC conference this spring.

Now if AMD can crack the shell on everyone wanting to code in CUDA, they'll have a fighting chance in enterprise solutions again.


----------



## Noobism

Quote:


> Originally Posted by *lol.69*
> 
> Eteknix review is out in crossfire:
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/
> 
> The results are a bit different from other reviews... near 100% scaling.


Very nice to see, I haven't seen other reviews with SLI-Xfire. But how do the furies generally do?


----------



## Xuper

Quote:


> Originally Posted by *lol.69*
> 
> Eteknix review is out in crossfire:
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/
> 
> The results are a bit different from other reviews... near 100% scaling


Look at this. Is that even possible?


----------



## Ganf

Quote:


> Originally Posted by *Noobism*
> 
> Very nice to see, I haven't seen other reviews with SLI-Xfire. But how do the furies generally do?


http://www.hardware.fr/focus/111/crossfire-radeon-r9-fury-x-fiji-vs-gm200-round-2.html

Hardware.fr had similar results. Fury X wins in almost every game against SLI 980 Tis.

The gap can still be made up with aftermarket 980 Ti cards, but then again we're also waiting for unlocked voltage on the Furies.

Fun times.


----------



## RagingCain

Quote:


> Originally Posted by *Xuper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lol.69*
> 
> Eteknix review is out in crossfire:
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/
> 
> The results are a bit different from other reviews... near 100% scaling
> 
> 
> 
> Look at this. Is that even possible?
Click to expand...

Yeah, it has higher TFLOPs than the GTX 980 Ti.

8.6 vs. 5.6 TFLOPs.

Higher compute performance doesn't seem to translate into higher game performance for AMD, though.

As a programmer, that seems to me like a software optimization issue more than anything.
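Those single-precision figures fall straight out of shader count × clock × 2 FLOPs per cycle (one fused multiply-add); a quick sanity check against the public specs of both cards:

```python
def sp_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPs: 2 FLOPs (one FMA) per shader per cycle."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(round(sp_tflops(4096, 1050), 1))  # Fury X (4096 shaders @ 1050 MHz): 8.6
print(round(sp_tflops(2816, 1000), 1))  # 980 Ti (2816 shaders @ 1000 MHz): 5.6
```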


----------



## Offler

Quote:


> Originally Posted by *Xuper*
> 
> Look at this. Is that even possible?


Fiji has 4096 GCN cores; Crossfire means 8192, which really amounts to a small compute cluster. I would therefore expect strong compute performance... If that amount of hardware can saturate all the requests coming from the application with no backlog remaining, it could explain the abnormal results.

Edit: to RagingCain:

Yes and no. Fury X would definitely benefit from more ROPs and more VRAM, especially in games.

That indirectly confirms to me that the biggest issues with Fury X are the ROP count and the VRAM size. The results above can be real simply because they are not limited by render output and/or by the absence of software optimizations.


----------



## friend'scatdied

If they can fix the 99th-percentile frame times, they have a multi-GPU winner.

Provided your case has room for two separate 120mm radiators and a ridiculous amount of tubing.









And provided you don't need more than 4GB of video memory.
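For context on the 99th-percentile frame-time metric mentioned above: it's the time under which 99% of frames complete, so a handful of multi-GPU stutter frames can wreck it even when the average FPS looks great. A rough sketch of the calculation with made-up sample data:

```python
def percentile_99(frame_times_ms):
    """99th-percentile frame time: 99% of frames finish at least this fast."""
    s = sorted(frame_times_ms)
    idx = min(len(s) - 1, round(0.99 * (len(s) - 1)))
    return s[idx]

# Made-up capture: 97 smooth ~60 FPS frames plus three 50 ms hitches.
times = [16.7] * 97 + [50.0] * 3
print(sum(times) / len(times))  # average ~17.7 ms - looks fine
print(percentile_99(times))     # 50.0 ms - the stutter dominates
```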


----------



## RagingCain

Quote:


> Originally Posted by *Offler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Xuper*
> 
> look at this , Is that even Possible ?
> 
> 
> 
> 
> 
> 
> Fiji has 4096 GCN cores; Crossfire means 8192, which really amounts to a small compute cluster. I would therefore expect strong compute performance... If that amount of hardware can saturate all the requests coming from the application with no backlog remaining, it could explain the abnormal results.
> 
> Edit: to RagingCain:
> 
> Yes and no. Fury X would definitely benefit from more ROPs and more VRAM, especially in games.
> 
> That indirectly confirms to me that the biggest issues with Fury X are the ROP count and the VRAM size. The results above can be real simply because they are not limited by render output and/or by the absence of software optimizations.
Click to expand...

Sure, so would most cards, but it's a powerful compute engine that's under utilized in comparison to the competition. That's where drivers come in.


----------



## joeh4384

I think crossfire is superior to SLI when it works, however AMD is pretty awful about supporting new releases. We went nearly 4 months without crossfire profiles this year.


----------



## rdr09

Quote:


> Originally Posted by *Ganf*
> 
> Now if they fixed the tearing on crossfire eyefinity setups, they'll have a win.
> 
> AMD did say something about huge improvements to crossfire and boasted they would have near perfect scaling at GDC. I have a feeling they made it a priority since Nvidia started pushing NVlink, which is going to put Nvidia's panties in a twist because NVlink is proprietary and enterprise techs were pretty pissed about that and the proposed performance increase from Maxwell to pascal was nothing as they had been led to believe before the GTC conference this spring.
> 
> Now if AMD can crack the shell on everyone wanting to code in CUDA they'll have a fighting chance in enterprise solutions again.


Which games exhibit tearing? And at what resolution and refresh rate?

https://www.youtube.com/watch?v=baMuk4YmOCw&feature=youtu.be


----------



## JackCY

Way too small a ROP count for a card of this class.

It would also be nice to see reviews that test different parts of the GPU, like tessellation, and performance in applications, not just games.
The performance/power ratio seems as bad as always with AMD :/


----------



## Ganf

Quote:


> Originally Posted by *rdr09*
> 
> which games exhibit tearing? and what rez and Hz?
> 
> https://www.youtube.com/watch?v=baMuk4YmOCw&feature=youtu.be


Can't watch videos right now.

Nor can I give a list. Crossfire and Eyefinity aren't on my radar; I just remember reading a few threads about it being completely pooched a few months back, and since Crossfire hasn't been receiving any love, I assumed that was still the case.


----------



## Noobism

Quote:


> Originally Posted by *Ganf*
> 
> http://www.hardware.fr/focus/111/crossfire-radeon-r9-fury-x-fiji-vs-gm200-round-2.html
> 
> Hardware.fr had similar results. Fury X wins in almost every game against sli 980ti's
> 
> Gap can still be made up on the 980ti's parts' with aftermarket cards, but then again we're also waiting for unlocked voltage on the Furies.
> 
> Fun times.


Yeah, that's good news indeed. I'm waiting to see the results with unlocked voltages.


----------



## Thoth420

Quote:


> Originally Posted by *Darkwizzie*
> 
> Knucklehead suggested that it was merely a bad translation - the Ti supposedly written as 'Titanium', then poorly translated into 'metal'.


Gotcha


----------



## looniam

Just an "update" about voltage control in AB - Unwinder still doesn't have a card, but there is... testing(?).

http://forums.guru3d.com/showthread.php?t=400333


----------



## Wezzor

Shouldn't AMD release a new WHQL driver soon?


----------



## RagingCain

Quote:


> Originally Posted by *Wezzor*
> 
> Shouldn't AMD release a new WHQL driver soon?


Coming Soon™


----------



## Xuper

Quote:


> Originally Posted by *RagingCain*
> Yeah, it has higher TFLOPs than the GTX 980 Ti.
> 
> 8.6 vs. 5.6 TFLOPs.
> 
> Higher compute performance doesn't seem to translate into higher game performance for AMD, though.
> 
> As a programmer, that seems to me like a software optimization issue more than anything.


Look at those numbers:
XFX Fury X 4GB OC = 6435
XFX Fury X 4GB = 3166
How big an overclock would that take? Say 10% - would 10% make that huge a jump? I think "XFX Fury X 4GB OC" is a typo; it might be CF.


----------



## hamzta09

Quote:


> Originally Posted by *Offler*
> 
> Fiji has 4000 GCN cores, Xfire means 8000 which is really amount of "small compute cluster".


That image doesn't show Crossfire...?


----------



## ZealotKi11er

Quote:


> Originally Posted by *joeh4384*
> 
> I think crossfire is superior to SLI when it works, however AMD is pretty awful about supporting new releases. We went nearly 4 months without crossfire profiles this year.


Time is not a problem; I can wait 4-6 months. The problem is we don't get many driver updates once a game has been out for two months. FC4 CFX is still broken. Witcher 3 CFX is still not fully working. Will they ever fix them?


----------



## infranoia

Quote:


> Originally Posted by *Xuper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> Yeah, it has higher TFLOPs than the GTX 980 Ti.
> 
> 8.6 vs. 5.6 TFLOPs.
> 
> Higher compute performance doesn't seem to translate into higher game performance for AMD, though.
> 
> As a programmer, that seems to me like a software optimization issue more than anything.
> 
> 
> 
> Look at those numbers:
> XFX Fury X 4GB OC = 6435
> XFX Fury X 4GB = 3166
> How big an overclock would that take? Say 10% - would 10% make that huge a jump? I think "XFX Fury X 4GB OC" is a typo; it might be CF.
Click to expand...

Quote:


> Originally Posted by *hamzta09*
> 
> That image doesn't show Crossfire...?


Literally _every_ other graph in the review shows "XFX Fury X 4GB CF", the review is a Crossfire Fury X review, there is no mention of OC in their testing criteria, and the caption to that graph states: "The compute performance proves that there is almost perfect scaling from the Crossfire set-up."

Clearly a typo. They had "*O*pen*C*L" on the brain when they typed it out.


----------



## Rei86

Quote:


> Originally Posted by *joeh4384*
> 
> I think crossfire is superior to SLI when it works, however AMD is pretty awful about supporting new releases. We went nearly 4 months without crossfire profiles this year.


Even in beta drivers they don't have a Crossfire profile for something like The Witcher 3?

I always end up purchasing two GPUs at a time, and having one sit idle almost all the time would put me in a straight pissed-off mood for having spent that extra money on a paperweight. That was one of my concerns when I was deciding between these and the 980 Ti/Titan X. Good thing I decided to just spend it on other crap.


----------



## Themisseble

Quote:


> Originally Posted by *lol.69*
> 
> Eteknix review is out in crossfire:
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/
> 
> The results are a bit different from other reviews... near 100% scaling


R9 390X beats Titan X in the Hitman 4K benchmark lolololol
R9 390X beats GTX 980 at 1080p (5 wins vs. 1 win) - that's why AMD doesn't need a new GCN core








Fury X beats Titan X (4 wins vs. 2 wins at 4K)


----------



## RagingCain

Quote:


> Originally Posted by *infranoia*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Xuper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> Yeah, it has higher TFLOPs than the GTX 980 Ti.
> 
> 8.6 vs. 5.6 TFLOPs.
> 
> Higher compute performance doesn't seem to translate into higher game performance for AMD, though.
> 
> As a programmer, that seems to me like a software optimization issue more than anything.
> 
> 
> 
> Look at those numbers:
> XFX Fury X 4GB OC = 6435
> XFX Fury X 4GB = 3166
> How big an overclock would that take? Say 10% - would 10% make that huge a jump? I think "XFX Fury X 4GB OC" is a typo; it might be CF.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *hamzta09*
> 
> That image doesn't show Crossfire...?
> 
> Click to expand...
> 
> Literally _every_ other graph in the review shows "XFX Fury X 4GB CF", the review is a Crossfire Fury X review, and there is no mention of OC in their testing criteria, and the caption to that graph states "The compute performance proves that there is almost perfect scaling from the Crossfire set-up."
> 
> Clearly a typo. They had "*O*pen*C*L" on the brain when they typed it out.
Click to expand...

That would be my guess - CF rather than OC - since otherwise the scaling would be above 100%, which is not possible.
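The arithmetic behind that conclusion is straightforward: the score labeled "OC" is more than double the stock score, a gain no core overclock on a voltage-locked card could plausibly deliver, whereas a second card in CF can:

```python
stock, labeled_oc = 3166, 6435  # the two Eteknix compute scores in question
gain = (labeled_oc / stock - 1.0) * 100.0
print(round(gain))  # 103 -> a >100% gain, consistent with CF, not a ~10% overclock
```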


----------



## Ganf

Quote:


> Originally Posted by *Themisseble*
> 
> As you can see, sometimes the Fury X is faster at 1440p than at 1080p, and the non-OC Fury beats the OC Fury = driver problem or failed benchmark. So this benchmark is no good.


Wait, hold on, let me actually read those numbers....

.......

So those aren't CF/SLI benchmarks - which is what that Eteknix review covers - so they don't show the difference in scaling, and their Fury card clocks like crap even considering the voltage is locked.

I mean that is a terribad OC...

Well that was an informative waste of time.


----------



## toncij

Quote:


> Originally Posted by *RagingCain*
> 
> Yeah, it has higher TFLOPs than the GTX 980 Ti.
> 
> 8.6 vs. 5.6 TFLOPs.
> 
> Higher compute performance doesn't seem to translate into higher game performance for AMD, though.
> 
> As a programmer, that seems to me like a software optimization issue more than anything.


As a programmer who does rendering too: you're spot-on, but I'd choose a different word than "issue". It's a design "issue" - a design trait, let's say. AMD and Nvidia group cores differently, with different unit counts, task-scheduling quirks, etc. For some workloads AMD has always had the upper hand, while Nvidia has for others. A driver also plays its part, but in general, under certain conditions, AMD may win out. May... since in current-gen games Nvidia still wins. AMD has started showing an advantage in some games that can put their architecture to work...


----------



## infranoia

Quote:


> Originally Posted by *Themisseble*
> 
> As you can see, sometimes the Fury X is faster at 1440p than at 1080p, and the non-OC Fury beats the OC Fury = driver problem or failed benchmark. So this benchmark is no good.


LOL

Assassin's Creed, stock *45.8*, OC *45.4*
FarCry 4, stock *78.9*, OC *78.9*

Yeah... no.


----------



## Kane2207

Quote:


> Originally Posted by *Themisseble*
> 
> R9 390X beats TITAN X in hitman 4K benchmatrk lolololol
> R9 390X beats GTX 980 at 1080P (5wins vs 1 win) - Thats why AMD does not need new GCN core
> 
> 
> 
> 
> 
> 
> 
> 
> Fury X beats TITAN X (4 wins vs 2 wins at 4K)


Hitman has always favoured AMD, and every review I've seen has the Fury X closing the gap to the 980 Ti/Titan X at 4K but ultimately losing out in the long run. As for the gap narrowing, that could just as likely be Nvidia's performance dropping off on its 384-bit bus as the Fury X suddenly coming into its stride. Fury and GCN appear to scale quite linearly, so I'm more inclined to believe Nvidia's performance takes a hit at 4K than that the Fury X finds magical performance at higher resolutions.

At this point I don't know why someone would try to spin the Fury X as beating either of Nvidia's top two cards; everything points to the contrary. There'll always be some outlier case, but the average taken across multiple games, across multiple review sites, indicates Maxwell outperforms AMD's top offering.

Hopefully AMD finds some special sauce in a driver update and review sites will revisit, but I wonder how much optimisation is left in GCN; the arch is now four years old and driver optimisations are finite.


----------



## Themisseble

Quote:


> Originally Posted by *Kane2207*
> 
> Hitman has always favoured AMD, and every review I've seen has the Fury X closing the gap to the 980 Ti/Titan X but ultimately losing out in the long run. As for the gap narrowing, that could just as likely be Nvidia's performance dropping off on its 384-bit bus as the Fury X suddenly coming into its stride. Fury and GCN appear to scale quite linearly, so I'm more inclined to believe Nvidia's performance takes a hit at 4K than that the Fury X finds magical performance at higher resolutions.
> 
> At this point I don't know why someone would try to spin the Fury X as beating either of Nvidia's top two cards; everything points to the contrary. There'll always be some outlier case, but the average taken across multiple games, across multiple review sites, indicates Maxwell outperforms AMD's top offering.
> 
> Hopefully AMD finds some special sauce in a driver update and review sites will revisit, but I wonder how much optimisation is left in GCN; the arch is now four years old and driver optimisations are finite.


You are wrong about that.
The Fury X is going to win in the long run, just like the R9 290X destroyed the GTX 780 Ti... and now the R9 290X (new drivers + better cooling = R9 390X) is destroying the GTX 980?.. After two years AMD will still be improving their drivers for the GCN core and it will destroy the GTX 980 Ti... maybe (both cards have 5.6 TFLOPs).

Look, most people don't get AMD, especially Nvidia fanboys. AMD is doing its best for customers. GCN has been here for 4 years and is still battling the new Maxwell core. AMD's mid-range cards are better priced than Nvidia's. The 7970 GHz is now battling the GTX 780 in new games... why would I want a new GCN core? Why? To spend more of my money? I think I will let AMD optimize it.

http://www.techspot.com/review/917-far-cry-4-benchmarks/page3.html

Look at FC4 - great game. The 7970 GHz is battling the Titan.

I care mostly about mid-range GPUs - I could go with a GTX 660 or an R9 270X, and now the R9 270X is about 50% faster in FC4. I don't use MSAA; I prefer screen scaling or a higher resolution with high-to-medium settings. How well does the GTX 660 do vs. the 7870 at higher resolutions? Those who went GTX 660 SLI over 7870 CF were fools. 7750 CF (2x 55 W) would beat a GTX 660.


----------



## Kane2207

Quote:


> Originally Posted by *Themisseble*
> 
> You are wrong about that.
> The Fury X is going to win in the long run, just like the R9 290X destroyed the GTX 780 Ti... and now the R9 290X (new drivers + better cooling = R9 390X) is destroying the GTX 980?.. After two years AMD will still be improving their drivers for the GCN core and it will destroy the GTX 980 Ti... maybe (both cards have 5.6 TFLOPs).
> 
> Look, most people don't get AMD, especially Nvidia fanboys. AMD is doing its best for customers. GCN has been here for 4 years and is still battling the new Maxwell core. AMD's mid-range cards are better priced than Nvidia's. The 7970 GHz is now battling the GTX 780 in new games... why would I want a new GCN core? Why? To spend more of my money? I think I will let AMD optimize it.


OK, OK, driver optimisations are an infinite resource, AMD has a bottomless pit of them on their magical architecture









Did you even read what I typed before you jumped down my throat to defend AMD?

And while you're busy predicting how the graphics market is going to pan out based on what may or may not appear in driver updates weeks, months or even years down the line, could you please spare 5 minutes to give me the numbers in tonight's EuroMillions lottery?


----------



## Slink3Slyde

That old AMD delusion. Neither AMD nor Nvidia gives a damn about you except for your money. Hate to break it to you.

Also there is no Easter bunny.


----------



## Kane2207

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Also there is no Easter bunny.




You monster!


----------



## Themisseble

Quote:


> Originally Posted by *Slink3Slyde*
> 
> That old AMD delusion. Neither AMD nor Nvidia gives a damn about you except for your money. Hate to break it to you.
> 
> Also there is no Easter bunny.


LoL,
I just had to say it. Look at it. 7970 battling TITAN - that must be hard for all TITAN users.


----------



## Ganf

Quote:


> Originally Posted by *Kane2207*
> 
> OK, OK, driver optimisations are an infinite resource, AMD has a bottomless pit of them on their magical architecture
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you even read what I typed before you jumped down my throat to defend AMD?
> 
> And while you're busy predicting the future on how the graphics market is going to pan out on what may or may not appear in driver updates weeks, months or even years down the line, could you please spare 5 minutes to give me the numbers in tonights Euro Millions lottery?


For AMD? Yes, they're just about a bottomless resource.

AMD found consistent performance updates for 5 years on VLIW without any significant changes to its design like GCN has received, and they still kept trying to squeeze more out of the architecture for 6 months after it topped out before giving up and informing owners that the cards would be moved to legacy support.

5 years is a hell of a long time to support your hardware in this industry, let alone continue to improve it consistently.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Themisseble*
> 
> LoL,
> I just had to say it. Look at it. 7970 battling TITAN - that must be hard for all TITAN users.


I've looked at those numbers many, many times; I had a whole thread on Kepler vs. Maxwell in newer games. 'Tis true, Kepler is old and knackered, especially when you look only at reference clocks and don't take overclocking into account. AMD have improved their drivers too. Good for them.

But perhaps if AMD had been smart enough to try to play the game Nvidia's and Intel's way, or at least to see it coming, they wouldn't be so screwed right now, and neither would we.

Who knows.

No tooth fairy either.


----------



## Kane2207

Quote:


> Originally Posted by *Ganf*
> 
> For AMD? Yes, they're just about a bottomless resource.
> 
> AMD found consistent performance updates for 5 years on VLIW without any significant changes to its design like GCN has received, and they still kept trying to squeeze more out of the architecture for 6 months after it topped out before giving up and informing owners that the cards would be moved to legacy support.
> 
> 5 years is a hell of a long time to support your hardware in this industry, let alone continue to improve it consistently.


Hey, I did say hopefully they'll find some, and there's HBM optimisations to consider too. I never said they _can't_ optimise it further, just pointed out that it's 4 years old and optimisations are a finite resource









I think some people are trying to spin it that we'll see the kind of epic optimisations we saw with the 7970, which was the introduction of a new arch. I'm not sure we'll see anything that spectacular again until AMD move to whatever they have up their sleeves past GCN.


----------



## Ganf

Quote:


> Originally Posted by *Kane2207*
> 
> Hey, I did say hopefully they'll find some, and there's HBM optimisations to consider too. I never said they _can't_ optimise it further, just pointed out that it's 4 years old and optimisations are a finite resource
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think some people are trying to spin it that we'll see the kind of epic optimisations we saw with the 7970, which was the introduction of a new arch. I'm not sure we'll see anything that spectacular again until AMD move to whatever they have up their sleeves past GCN.


Well, HBM is a new architecture, and they seem to have already pulled one rabbit out of the hat with Crossfire scaling. I'll sit back and wait until the 980ti Lightning drops. Maybe another rabbit will pop out, maybe not. Either way my 290x Lightning is a good overclocker, I couldn't ask for a better $300 placeholder.

It could get me by until Pascal if I wasn't feeling weird about buying a $550 processor and a $300 GPU....


----------



## GorillaSceptre

This driver argument is a bit silly imo.

I'll preface this by saying in hindsight, the 290x was probably the best buy for a GPU in recent memory (assuming you picked one up for a reasonable price).

Even though I disagree about the 290x "destroying" the 780 Ti, I would say the 290x is the better card. But by the time the 290x caught up to the 780 Ti (and they're still within reach of each other in performance), does anyone even care?

It took what, over a year for the 290x to match/beat the 780 Ti? At this stage, I think the owners of those cards have already gotten what they wanted out of them, and now everyone's looking at the 980 Ti/FX. So in the end, I don't think the 290x "won" anything. It's now comparable to a card that had the same performance for over a year.

Not to mention that the same scenario might not even play out this time with the Fury and the Ti. Last time AMD had more VRAM; now the opposite is true. I don't think picking a Fury over a Ti based on future drivers is a wise choice.


----------



## Tivan

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Even though I disagree about the 290x "destroying" the 780 Ti, I would say the 290x is the better card. But by the time the 290x caught up to the 780 Ti (and they're still within reach of each other in performance), does anyone even care?


A big yes from this second-hand market buyer!

Maybe this time there won't be a Fury X coin-mining craze and they'll keep their value better than the 290s/79xxs did.

Which would in turn help AMD not get screwed as hard in their sales channels. (Edit: on that note, I rather appreciate the skillfulness with which they presented the 300 series, even vs. second-hand offerings. That was actually a really well-conceived idea, and well executed with the drivers and clocks and all. And bonus VRAM.)

If they keep their prices and inventory movement anywhere near where they want them, they might be able to get more sales out of driver improvements. Maybe release a 3xxX here and there at the right timing.

Considering they don't have the funds to make new chips at Nvidia's rate, they really have to capitalize on driver improvements in that manner anyway. (I seem to remember AMD saying they intend to do something along those lines, actually - like a focus on software or something. But I'm not sure anymore!)


----------



## GorillaSceptre

Quote:


> Originally Posted by *Tivan*
> 
> A big yes from this second-hand market buyer!
> 
> Maybe this time there won't be a Fury X coin-mining craze and they'll keep their value better than the 290s/79xxs did.
> 
> Which would in turn help AMD not get screwed as hard in their sales channels. (Edit: on that note, I rather appreciate the skillfulness with which they presented the 300 series, even vs. second-hand offerings. That was actually a really well-conceived idea, and well executed with the drivers and clocks and all. And bonus VRAM.)
> 
> If they keep their prices and inventory movement anywhere near where they want them, they might be able to get more sales out of driver improvements. Maybe release a 3xxX here and there at the right timing.
> 
> Considering they don't have the funds to make new chips at Nvidia's rate, they really have to capitalize on driver improvements in that manner anyway. (I seem to remember AMD saying they intend to do something along those lines, actually - like a focus on software or something. But I'm not sure anymore!)


Yeah, buying second hand is a different story.

If you're buying second hand then you have the luxury of hindsight depending on when you buy. You will already know how the cards stack up against each other down the road.

I'm talking about the people buying now, and at this moment the 980 Ti is the better performer. I'm personally waiting for the Fury X to get voltage control, and I'll make my decision then.

Waiting a year to see *if* drivers make the difference is pointless to me, at that point Pascal will nearly be out.


----------



## infranoia

Linear Crossfire scaling is going to play right into a very strong dual-Fiji card. I keep harping on about it when it sounds like most mooks around here have sworn off multi-GPU, but that's only due to Nvidia's 80% market share and sucky SLI scaling.

The 295x2 has been a crazy good performer for far longer than it deserved, and I'm not sure anyone should write off the Fiji X2 until it rears its head. Even at 295x2 launch prices it would be a good investment: the final huge re-entry burn for 28nm, to last us through the first couple of crappy generations of a new process.


----------



## gamervivek

Quote:


> Originally Posted by *Kane2207*
> 
> Hitman has always favoured AMD, and every review I've seen has the Fury X closing the gap to the 980 Ti/Titan X at 4K but ultimately losing out in the long run. As for the gap narrowing, that could just as likely be Nvidia's performance dropping off on its 384-bit bus as the Fury X suddenly coming into its stride. Fury and GCN appear to scale quite linearly, so I'm more inclined to believe Nvidia's performance takes a hit at 4K than that the Fury X finds magical performance at higher resolutions.
> 
> At this point I don't know why someone would try to spin the Fury X as beating either of Nvidia's top two cards; everything points to the contrary. There'll always be some outlier case, but the average taken across multiple games, across multiple review sites, indicates Maxwell outperforms AMD's top offering.
> 
> Hopefully AMD finds some special sauce in a driver update and review sites will revisit, but I wonder how much optimisation is left in GCN; the arch is now four years old and driver optimisations are finite.


It beats the 980 Ti at 4K in Tom's, Sweclockers and Hardwareluxx, and ties it in other reviews. Doing that with fewer ROPs than Nvidia overturns the dynamic of AMD's VLIW era, when they would do worse at higher resolutions with fewer ROPs but a similar advantage in shaders - and now with a lower clock speed to boot. Pretty impressive.

So AMD are doing better with fewer ROPs, and HBM wasn't merely there for power consumption - unless you believe they are doing rather well with delta compression as well.

Of course AMD would do even better with more games like Hitman and more reviewers like TechSpot.

http://www.techspot.com/articles-info/977/bench/Hitman.png


----------



## BoredErica

If Fury X beats 980ti at 4k, then 4k DSR/etc down to 1080p would mean Fury X still beats 980ti, right?


----------



## Blackops_2

Quote:


> Originally Posted by *Darkwizzie*
> 
> If Fury X beats 980ti at 4k, then 4k DSR/etc down to 1080p would mean Fury X still beats 980ti, right?


Yes, but what happens when you factor in OC headroom? Fury will inevitably lose as it stands right now. VSR would work, though. I've been using DSR on my 780s lately and am really enjoying it, but I took a huge performance hit in most games - understandably so.

What I keep finding more impressive, and what will aid Fury in the long run, is optimization - I know this could be said for everything. But looking at how far the 290x has come, I'm thoroughly impressed. When it launched it was clearly faster than the 780 at comparable clocks, but not by much, and the 780 could clock higher as well, so it was a non-issue to me. I chose 780s. Almost a year later it's trading blows with the 780 Ti. I know to some degree that's aggravating because of how long it took, but Fury could end up seeing substantial gains down the road, especially considering we're dealing with HBM this time around.


----------



## gamervivek

Quote:


> Originally Posted by *Darkwizzie*
> 
> If Fury X beats 980ti at 4k, then 4k DSR/etc down to 1080p would mean Fury X still beats 980ti, right?


Yes, I think one of the reasons AMD gave out that slide showing negligible performance hit with 4k VSR vs. 4k was to drive forth this point, however reviewers haven't paid much attention to it.


----------



## Serandur

About the Kepler vs GCN performance degradation thing and how it relates to Maxwell vs GCN 1.2... it doesn't really.

There's no real doubt; Kepler's not keeping up relative to its former GCN competitors like it used to. However, we don't really know the reasons why or if they're applicable to Maxwell.

1. Nvidia could in fact be crippling or neglecting Kepler with their latest games and though it's a shady thing to do, the incentive to do so may not be as strong with Maxwell versus Pascal because even the mid-range GP104 will smoke GM200 without disparate driver support. Due to 28nm limitations, that was not the case with GM204 vs GK110.

The behavior is disgusting if true, but still doesn't mean Maxwell will be dropped like Kepler next year even if it is. Also, it seems quite likely Pascal may just be a refined Maxwell on 16nm with HBM2 and FP64 capabilities re-added, not a whole new microarchitecture. Certain optimizations may therefore carry over to Maxwell easily.

2. Kepler may just have run out of steam due to several architectural weakspots and the influence of the newer consoles (utilizing GCN). However, Maxwell is a new architecture that addresses Kepler's weakpoints versus GCN such as compute, VRAM, and ROP limitations.

*Maxwell is not Kepler.* Maxwell is a revamped, post-GCN core uarch that cuts out the fluff (FP64) and was designed for maximum gaming efficiency at a time when Nvidia had a better understanding of what's important for the new console gen. By maximum, I mean GM200 is pushing the limits of what TSMC can currently manufacture with a well-balanced uarch that has no consideration for Tesla workloads, only consumer applications.

3. AMD may have improved GCN drivers significantly. Those gains are finite and actually reflect poorly on AMD for not doing that earlier if true.

It could be a combination of the three to unknown levels each, but it's not a sure thing Maxwell will just fall behind at all, imo.


----------



## maltamonk

Quote:


> Originally Posted by *Serandur*
> 
> 3. AMD may have improved GCN drivers significantly. Those gains are finite and *actually reflect poorly on AMD for not doing that earlier if true.
> *
> .


What? That's definitely a new one.


----------



## Serandur

Quote:


> Originally Posted by *maltamonk*
> 
> What? That's definitely a new one.


How is it a positive thing _if_ they intentionally held back performance/support to drip-feed later or were incapable of properly supporting their products until several years after release?


----------



## looniam

Quote:


> Originally Posted by *maltamonk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Serandur*
> 
> 3. AMD may have improved GCN drivers significantly. Those gains are finite and *actually reflect poorly on AMD for not doing that earlier if true.
> *
> .
> 
> 
> 
> What? That's definitely a new one.

here:

2012 AMD and NVIDIA Driver Performance Summary Review

the first half of 2012 (when GCN was released) was an utter fail for AMD drivers.


----------



## maltamonk

Quote:


> Originally Posted by *looniam*
> 
> here:
> 
> 2012 AMD and NVIDIA Driver Performance Summary Review
> 
> the first half of 2012 (when GCN was released) was an utter fail for AMD drivers.


So are you suggesting they did that on purpose?


----------



## looniam

Quote:


> Originally Posted by *maltamonk*
> 
> So are you suggesting they did that on purpose?


not suggesting anything - just that you seemed surprised, so i was getting you up to speed


----------



## rdr09

Quote:


> Originally Posted by *Serandur*
> 
> How is it a positive thing _if_ they intentionally held back performance/support to drip-feed later or were incapable of properly supporting their products until *several years* after release?


ok.


----------



## ZealotKi11er

Quote:


> Originally Posted by *looniam*
> 
> not suggesting anything - just that you seemed surprised, so i was getting you up to speed


The drivers back then had other problems not related to performance. Once 12.11 kicked in, GCN matched Kepler at much lower clocks: a 925 MHz HD 7970 vs. an 1100+ MHz GTX 680.


----------



## maltamonk

Quote:


> Originally Posted by *looniam*
> 
> not suggesting anything - just that you seemed surprised, so i was getting you up to speed


I'm up to speed. I just don't see how continuously working to improve their drivers, and succeeding, makes them look bad.


----------



## Serandur

Quote:


> Originally Posted by *rdr09*
> 
> ok.


The first GCN products were released between the tail end of 2011 (Tahiti/7970) and 2012. If the huge gains versus Kepler came from driver improvements, the 7970 didn't get those improvements (putting it at GK110 level) until the end of 2014; right around three years after release.


----------



## rdr09

Quote:


> Originally Posted by *Serandur*
> 
> The first GCN products released between the tail-end of 2011 (for Tahiti/7970) and 2012. If the huge gains versus Kepler were gained from driver improvements, the 7970 didn't get those improvements (putting it at GK110 level) until the end of 2014; right around 3 years after release.


That took just months. Look at the graphics scores . . .

http://www.3dmark.com/compare/3dm11/5059839/3dm11/4519473

that's 11 to 12 driver.
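For anyone eyeballing those 3DMark comparisons, the driver-to-driver gain is just the ratio of the graphics scores. A quick sketch; the scores below are made-up placeholders, not the numbers from the linked runs:

```python
# Percent improvement between two driver revisions, given their
# 3DMark graphics scores (placeholder values, not the linked runs).
def driver_gain(old_score, new_score):
    """Percent change of new_score relative to old_score."""
    return (new_score - old_score) / old_score * 100.0

print(driver_gain(8000, 9200))  # 15.0 (% faster on the newer driver)
```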


----------



## p4inkill3r

Techspot's review:

http://www.techspot.com/review/1024-and-radeon-r9-fury-x/


----------



## Noobism

Quote:


> Originally Posted by *maltamonk*
> 
> I'm up to speed. I just don't see how them continuously working to improve them and doing so, makes them look bad.


It's just something people like to throw in, which translates into a con. Driver improvement should be expected with any product; it's not some conspiracy where they purposely held performance back.


----------



## ZealotKi11er

Quote:


> Originally Posted by *p4inkill3r*
> 
> Techspot's review:
> 
> http://www.techspot.com/review/1024-and-radeon-r9-fury-x/


Just looking at the Crysis 3 graph: both the 980 Ti and the Fury X get 28 fps. Yet this is what they write about it, unless I'm blind.



Quote:


> Jumping to 4K reduced the average frame rate to just 28fps. The Fury X wasn't able to match the performance of the GTX 980 Ti


----------



## Serandur

Quote:


> Originally Posted by *rdr09*
> 
> That was just months. Look at the graphics scores . . .
> 
> http://www.3dmark.com/compare/3dm11/5059839/3dm11/4519473
> 
> that's 11 to 12 driver.


I'm talking specifically about the 7970/280X matching the 780/780Ti/Titan in games. That didn't start getting reported until late 2014 with Far Cry 4. GCN got driver improvements throughout its life of course, but matching GK110 is really recent.

This is only going off the assumption of one possible explanation for that scenario; it's still very questionable whether mature GCN drivers really were a driving factor. We don't know if GCN just pulled ahead, Kepler fell behind, or both, nor the exact reasons why, of course. Just speculation.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Serandur*
> 
> I'm talking specifically about the 7970/280X matching the 780/780Ti/Titan in games. That didn't start getting reported until late 2014 with Far Cry 4. GCN got driver improvements throughout its life of course, but matching GK110 is really recent.
> 
> This is only going off of the assumption of one possible reason for that scenario; still very questionable if mature GCN drivers really were a driving factor. We don't know if GCN just pulled ahead, Kepler fell behind, or both nor the exact reasons why of course. Just speculation.


I think it's the new generation of games. It's not that GCN got a ton faster, just that Kepler stalled in performance improvements.


----------



## KyadCK

Quote:


> Originally Posted by *Serandur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> ok.
> 
> 
> 
> The first GCN products released between the tail-end of 2011 (for Tahiti/7970) and 2012. If the huge gains versus Kepler were gained from driver improvements, the 7970 didn't get those improvements (putting it at GK110 level) until the end of 2014; right around 3 years after release.

Because AMD totally isn't still using GCN and thus still improving GCN, right? They moved on to VSRT (Video Shader Round Two) with the 290X and now we're on IUYA (Imaging Units Yet Again) with the Fury X.

But I dunno what you're talking about with GK110 and the 7970. We're talking mainly about how a 290X mostly wins vs a 780ti now, when it used to lose by like 10%.

If a 7970 is in 780 (and thus GK110) territory now, then man, that's news to me. The driver that really put it on par with the 680 was 12.11, combined with the GHz Edition cards, which arguably gave AMD back the title.
Quote:


> Originally Posted by *maltamonk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> not eluding anything - just that you seemed surprised. so i was getting you up to speed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm up to speed. I just don't see how them continuously working to improve them and doing so, makes them look bad.

Because clearly, AMD would not deliver this performance from the get-go, allowing them to easily topple the best cards for two previous generations (680 and 780ti) and charge a boatload more money. They must trick us into thinking they are inferior in raw FPS numbers at launch so that we can be amazed by how well the cards age two to three years down the road when they are already EOL, or would be if not for rebrands.

Equally as clearly, AMD would never give the Fury X its perfect drivers on day one, allowing it to trounce the 980 Ti. We must wait 18-24 months before we get the perfect drivers they had ready to go right now in order to see the full benefit of the card.

Or AMD is just improving things over time and since they share the same arch, even the old cards get a fair boost. You know, whichever. I give it a 50/50 chance.

Ya, I get a bit sarcastic with obviously ludicrous conspiracy theories.


----------



## rdr09

Quote:


> Originally Posted by *Serandur*
> 
> I'm talking specifically about the 7970/280X matching the 780/780Ti/Titan in games. That didn't start getting reported until late 2014 with Far Cry 4. GCN got driver improvements throughout its life of course, but matching GK110 is really recent.
> 
> This is only going off of the assumption of one possible reason for that scenario; still very questionable if mature GCN drivers really were a driving factor. We don't know if GCN just pulled ahead, Kepler fell behind, or both nor the exact reasons why of course. Just speculation.


I have the latest driver on my 7950 and my benchmark scores have been the same since the 12.x drivers. We are at 15 now. Pretty much topped out.


----------



## Serandur

Quote:


> Originally Posted by *KyadCK*
> 
> Because AMD totally isn't still using GCN and thus still improving GCN, right? They moved on to VSRT (Video Shader Round Two) with the 290X and now we're on IUYA (Imaging Units Yet Again) with the Fury X.
> 
> But I dunno what you're talking about with GK110 and the 7970. We're talking mainly about how a 290X mostly wins vs a 780ti now, when it used to lose by like 10%.
> 
> If a 7970 is in 780 (and thus GK110) territory now, then man, thats news to me. The driver that really put it on par with the 680 was 12.11, combined with the Ghz Edition cards, which arguably gave AMD back the title.
> Because clearly, AMD would not deliver this performance from the get-go, allowing them to easily topple the best cards for two previous generations (680 and 780ti) and charge a boatload more money. They must trick us into thinking they are inferior in raw FPS numbers at launch so that we can be amazed by how well the cards age two to three years down the road when they are already EOL, or would be if not for rebrands.
> 
> Equally as clearly, AMD would never give the Fury X it's perfect drivers on day one, allowing it to trounce the 980ti. We must wait for 18-24 months before we get the perfect drivers they had ready to go right now in order to see the full benefit of the card.
> 
> Or AMD is just improving things over time and since they share the same arch, even the old cards get a fair boost. You know, whichever. I give it a 50/50 chance.
> 
> Ya, I get a bit sarcastic with obviously ludicrous conspiracy theories.


Tahiti and Hawaii both improving with GCN-boosting drivers is a given. They're both GCN and they're both part of this scenario. Where Hawaii's pulled ahead recently versus Kepler, so has Tahiti. But if GCN is actually pulling ahead with drivers, those improvements could have and arguably should have been made a long time back if we're also under the assumption Nvidia max out their stuff way earlier on.

Driver development's not magic; you don't just keep improving hardware performance indefinitely through drivers unless there were still some inefficiencies left. If GCN drivers matured long ago and Kepler's just falling behind, then AMD aren't so much improving drivers as simply doing what they should be and supporting their products, while Nvidia are neglecting theirs; that, or it's the other explanation (architectural weaknesses).

"If" is the key word.


----------



## KyadCK

Quote:


> Originally Posted by *Serandur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> Because AMD totally isn't still using GCN and thus still improving GCN, right? They moved on to VSRT (Video Shader Round Two) with the 290X and now we're on IUYA (Imaging Units Yet Again) with the Fury X.
> 
> But I dunno what you're talking about with GK110 and the 7970. We're talking mainly about how a 290X mostly wins vs a 780ti now, when it used to lose by like 10%.
> 
> If a 7970 is in 780 (and thus GK110) territory now, then man, thats news to me. The driver that really put it on par with the 680 was 12.11, combined with the Ghz Edition cards, which arguably gave AMD back the title.
> Because clearly, AMD would not deliver this performance from the get-go, allowing them to easily topple the best cards for two previous generations (680 and 780ti) and charge a boatload more money. They must trick us into thinking they are inferior in raw FPS numbers at launch so that we can be amazed by how well the cards age two to three years down the road when they are already EOL, or would be if not for rebrands.
> 
> Equally as clearly, AMD would never give the Fury X it's perfect drivers on day one, allowing it to trounce the 980ti. We must wait for 18-24 months before we get the perfect drivers they had ready to go right now in order to see the full benefit of the card.
> 
> Or AMD is just improving things over time and since they share the same arch, even the old cards get a fair boost. You know, whichever. I give it a 50/50 chance.
> 
> Ya, I get a bit sarcastic with obviously ludicrous conspiracy theories.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tahiti and Hawaii both improving with GCN-boosting drivers is a given. They're both GCN. But if GCN is actually pulling ahead with drivers, those improvements could have and arguably should have been made a long time back if we're also under the assumption Nvidia max out their stuff way earlier on.
> 
> Driver development's not magic, you don't just keep indefinitely improving hardware performance with driver development unless there were still some inefficiencies left. If GCN drivers matured long ago and Kepler's just falling behind, then AMD aren't improving drivers so much as simply doing what they should be and supporting products whereas Nvidia are simply neglecting them or the other reason (architectural weaknesses).
> 
> "If" is the key word.

Why not? Intel has been doing exactly that on the CPU side of things since Pentium 3, and CPU Arch design is Programming: Hardmode Activated. Who (of those who have programming experience) hasn't looked back at their programs or scripts and thought "Why in the world did I do it this way? This new way is so much better and quicker!"

There are very few instances where a given design stays in play for as long in the computer world as GCN has/will. The drivers are huge. It is not surprising to me at all that they can still find things that can be done better.


----------



## looniam

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The drivers back than had other problems not related to performance. Once 12.11 kicked in GCN matched Kepler with much lower clocks 925MHz HD 7970 vs 11XX GTX680.


seriously? comparing clock speeds of two different arches? and NO, the non-GHz stock 7970 did not match an OC'd 680. well, unless you wanna cherry-pick Tomb Raider when it was first released.









Quote:


> Originally Posted by *maltamonk*
> 
> I'm up to speed. I just don't see how them continuously working to improve them and doing so, makes them look bad.


i've said it before - it's impressive how forward-thinking AMD was with GCN, to be able to keep optimizing performance for so long. it just took them a minute to get it going is all. but i am not going to jump on a "purposely nerfing" bandwagon, if that's what you're asking.


----------



## maltamonk

Quote:


> Originally Posted by *looniam*
> 
> seriously? comparing clock speeds of two different arches? and NO the non- Ghz stock 7970 did not match an OC 680. well, unless you wanna cherry pick tomb raider when it was first released.
> 
> 
> 
> 
> 
> 
> 
> 
> iv'e said it before - it's impressive how forward thinking AMD was with GCN and to be able to optimize performance for so long. it just took them a minute to get it going is all. but i am not going to jump on a "purposely nerfing" bandwagon if that's what you're asking.


When you countered me, I was replying to someone who was doing just that. Just curious as to why you'd do that when we both have the same understanding.


----------



## Serandur

Quote:


> Originally Posted by *KyadCK*
> 
> Why not? Intel has been doing exactly that on the CPU side of things since Pentium 3, and CPU Arch design is Programming: Hardmode Activated. Who (of those who have programming experience) hasn't looked back at their programs or scripts and thought "Why in the world did I do it this way? This new way is so much better and quicker!"
> 
> There are very few instances where a given design stays in play for as long in the computer world as GCN has/will. The drivers are huge. It is not surprising to me at all that they can still find things that can be done better.


I don't recall getting upgraded CPU drivers to improve performance. Driver software improvements are possible, but still finite and limited in just how far they can push final performance without hardware updates.

Personally, I don't think it's coincidental that GCN started pulling ahead right after Maxwell launched and think that much of the relative performance difference stems from something on Nvidia's end.


----------



## looniam

Quote:


> Originally Posted by *maltamonk*
> 
> When you countered me I was replying to someone who was doing just that. Just curious to why you'd do that when we both have the same understanding.


ah. i wanted to make sure you were armed with all the facts(?) before you went off to war.

i like to see a fair fight.


----------



## Serandur

Quote:


> Originally Posted by *maltamonk*
> 
> When you countered me I was replying to someone who was doing just that. Just curious to why you'd do that when we both have the same understanding.


I said no such thing, but I did say that _if_ driver improvements are still yielding these kinds of gains after so long, there are only a few reasons as to why AMD didn't have them at this level a long while back and they are not particularly good ones. Given AMD's lacking GPU funding, especially, it would just reinforce the "AMD bad at drivers" stereotype.

I only suggested, as one potential explanation for driver maturity being behind the observed gains, that AMD may have held back on driver development in a way they shouldn't have, for one of a couple of possible reasons. Their current release schedule (not even a WHQL driver for Fiji's launch after all this time) and DX11 overhead only support that unfortunate point of either intentionally or unintentionally slow development.

I didn't assert it was intentional and instead lean more towards the unintentional consequence of lacking funding thing. Mentioning a possibility is not the same thing as asserting its validity but neither of those outcomes are positive things for AMD or their customers.


----------



## Apokalipse

Quote:


> Originally Posted by *Serandur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maltamonk*
> 
> When you countered me I was replying to someone who was doing just that. Just curious to why you'd do that when we both have the same understanding.
> 
> 
> 
> I said no such thing, but I did say that _if_ driver improvements are still yielding these kinds of gains after so long, there are only a few reasons as to why AMD didn't have them at this level a long while back and they are not particularly good ones. Given AMD's lacking GPU funding, especially, it would just reinforce the "AMD bad at drivers" stereotype.

The one thing that remains true of all software is that there is always room for improvement.


----------



## maltamonk

Quote:


> Originally Posted by *Serandur*
> 
> I said no such thing, but I did say that _if_ driver improvements are still yielding these kinds of gains after so long, there are only a few reasons as to why AMD didn't have them at this level a long while back and they are not particularly good ones. Given AMD's lacking GPU funding, especially, it would just reinforce the "AMD bad at drivers" stereotype.
> 
> I only stated as one potential reason for the possibility of driver maturity being responsible for a noticed occurrence that AMD may have held back on driver development in a way they shouldn't have for one of a couple possible reasons. Their current release schedule (not even a WHQL driver for Fiji's release after all this time) and DX11 overhead only supports that unfortunate point of either intentionally or unintentionally slow development.
> 
> I didn't assert it was intentional and instead lean more towards the unintentional consequence of lacking funding thing. Mentioning a possibility is not the same thing as asserting its validity but neither of those outcomes are positive things for AMD or their customers.


Please don't take any offense at this: is English a second language for you? That might be the reason for the apparent misunderstanding.


----------



## Blackops_2

Quote:


> Originally Posted by *KyadCK*
> 
> But I dunno what you're talking about with GK110 and the 7970. We're talking mainly about how a 290X mostly wins vs a 780ti now, when it used to lose by like 10%.


This. My reference was originally about Hawaii catching up with GK110 (780Ti) which is impressive, though it took some time.

Lol, I just went and checked recent benchmarks of Tahiti vs GK110 just to make sure I hadn't missed something. I was about to be incredibly mad at myself for spending $1000 on 2x 780 Classified + blocks a year or so ago when I could've just bought another 7970.

Though in reality the 40% or so jump wasn't worth the money. I should've stuck with the 7970s, clocked them at 1200, and run with it. I do have high hopes my Classified will hit 1300 though.

Is there any update on voltage unlock and OC? That's what I'm most interested in, yet it seems no one has gotten voltage control yet. At least not officially.


----------



## Serandur

Quote:


> Originally Posted by *maltamonk*
> 
> Please don't take any offense to this: Is English a secondary language for you? That might be the reason for the apparent misunderstanding.


I don't know if I was the person you were quoting and originally referring to, but I'm not sure what you mean?
Quote:


> Originally Posted by *Apokalipse*
> 
> The one thing that remains true of all software is that there is always room for improvement.


Agreed, but then there's little point in guessing how anything's performance will change over time, and certainly little reason for any of us to be adamant about predictions. Just passing the time, I suppose.


----------



## th3illusiveman

Quote:


> Originally Posted by *Blackops_2*
> 
> This. My reference was originally about Hawaii catching up with GK110 (780Ti) which is impressive, though it took some time.
> 
> Lol i just went and checked recent benchmarks of Tahiti vs GK110 just to make sure i hadn't missed something. I was about to be incredibly mad at myself for spending 1000$ on 2x 780 Classified + blocks a year or so ago when i could've just bought another 7970.
> 
> Though in reality the 40% or so jump wasn't worth the money. Should've stuck with 7970s clocked them at 1200 and ran with it. I do have high hopes my classified will hit 1300 though.
> 
> Is there any update on voltage unlock and OC? That is what i'm most interested in yet it seems noone has gotten voltage control yet. At least not officially.


To be fair, when Nvidia was fully supporting Kepler there was a significant gap between the 780 and the 7970, plus Kepler did overclock very well. I wonder how they will treat Maxwell when their next arch launches; that's one thing that makes me worried about buying a green card. Will we see the Fury X step over the 980 Ti the same way the 290X suddenly does over the 780 Ti?


----------



## Ceadderman

For those of you having pump/coil whine issues, you might check the location of the copper tube, paying particular attention to your inboard 8-pin. It is possible that current is jumping across that span, creating interference and the whine issues you're experiencing.









~Ceadder


----------



## Blackops_2

Quote:


> Originally Posted by *th3illusiveman*
> 
> To be fair, when Nvidia was fully supporting Kepler there was a significant gap between the 780 and the 7970, plus Kepler did overclock very well. I wonder how they will treat Maxwell when their next arch launches; that's one thing that makes me worried about buying a green card. Will we see the Fury X step over the 980 Ti the same way the 290X suddenly does over the 780 Ti?


Agreed. The gap is still there, but it's smaller than when I bought my cards. Had I known what I know now, I would've gotten 290X Lightnings if I could've found them for the same price, though the 290X had just launched and they were $699. At the time the 290X was still faster than the 780 but wasn't nearly as consistent at high OCs, whereas a reference 780 on water would hit 1200 pretty easily, which is why I made my decision. I don't necessarily regret it; I love my Classifieds. Had I had the foresight to see that a 40% jump in performance wouldn't really be worth $400, I'd be in a better position. Though there is always something around the corner; it just comes down to whether you feel like scratching that itch or not. If Fury drops in price, part of me would want to pick up a pair, along with the new 1440p 144Hz panel coming from Asus.

It will be interesting to see what gains we see long term with Fiji.

I will say the last couple of drivers from the green team have left a bad taste in my mouth. Yet again, two driver updates later and I'm still thinking about rolling back.


----------



## Ceadderman

Or whether your GPUs have finally reached their expiration date, as mine have. The 300 series has launched just in time for DX12. I just need to figure out which'ns to pounce on: Fury X or X2. Frankly I'm hoping the X2 is the bee's knees, especially once they get this coil/pump whine figured out.









I've contacted their tech support with my thoughts on what it is. So I've done my part for the war effort.









~Ceadder


----------



## Offler

Guys, it's nothing new that AMD graphics drivers need some time to fix issues and gain performance.

When the first GCN cards were released, it took about a year until they fixed the flickering problems (especially in DX9.0c games). In the end (along with a decent price) that was also why I replaced my 7970 with 60% ASIC quality for an R9 290X with 80%+ ASIC, simply because Hawaii was second-gen and, at the time of purchase, the drivers were already "mature".

All I have to take care now is the noisy stock cooler.


----------



## infranoia

Quote:


> Originally Posted by *Offler*
> 
> All I have to take care now is the noisy stock cooler.


Quote:


> Sapphire R9-290x 1180MHz/1615MHz


Sorry for the OT, but you lapped the blower and get higher stable clocks than my CLC? Respect.

/digs out sandpaper


----------



## toncij

Has anyone here with a 5K display tested CrossFireX? I'd love to hear how that works out with new games at 5K, if possible.


----------



## Ninhalem

Anandtech's Review is Up:

http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review


----------



## Serandur

Quote:


> Originally Posted by *Ninhalem*
> 
> Anandtech's Review is Up:
> 
> http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review


Yay, finally. Thanks for posting it.


----------



## keikei

Quote:


> Originally Posted by *Ninhalem*
> 
> Anandtech's Review is Up:
> 
> http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review


A thorough review that sums up the card's overall situation, a major criticism being the price point. The reviewer is also curious to see the non-X Fury, as it's $100 cheaper and air-cooled; a two-week wait for that one. Appreciate the post.


----------



## Blackops_2

Quote:


> Originally Posted by *keikei*
> 
> A thorough review that sums up the card's overall situation, a major criticism being the price point. The reviewer is also curious to see the non-X Fury, as it's $100 cheaper and air-cooled; a two-week wait for that one. Appreciate the post.


Agreed, it was a great review. It wasn't harsh about the performance either, just honest. And yes, I agree the performance is a letdown. As they said, in many ways AMD has done a great job; they're just slightly short on performance, something that wouldn't have mattered if the card had come out six months ago when the Titan X was just released.

I've said it before and I'll say it again: if they would just drop the price, these would sell like hotcakes, which apparently they're already doing (they're sold out most places). A Fury X for $550 and a Fury for $450 would be a heck of a buy. DX12 is also coming at the end of the month; I'd give it 3-4 months and recheck to see how much optimization Fiji has gained.

I also just want some dad gum OC results with voltage! It's killing me. It's a vital data point the internet is missing at the moment.


----------



## fcman

Quote:


> Originally Posted by *Blackops_2*
> 
> Agreed it was a great review. Wasn't so harsh about the performance either just honest. Which yes i agree the performance is a let down, i'm just saying. As they said in many ways AMD has done a great job they're just slightly short in performance. Something that wouldn't have mattered if the card had came out six months ago when Titan X was just released.
> 
> I've said it before and i'll say it again if they would just drop the price they would sell like hotcakes. Which apparently they're already doing (they're sold out most places). Fury X for 550$ and Fury for 450$ would be a heck of a buy. DX12 is coming at the end of the month also. I'd give it 3-4 months and recheck to see how much optimization Fiji has gained.
> 
> I also just want some dad gum OC results with voltage! It's killing me. It's a vital component the internet is missing at the moment.


The problem was the 980 Ti; no one saw it coming, ESPECIALLY at $650. If AMD had released the Fury X even a day before the 980 Ti, the reviews would be glowing... but they didn't. NVIDIA threw down the gauntlet; I have a feeling they knew exactly what was going on at AMD and realized they had no reason not to show their cards.


----------



## maltamonk

Quote:


> Originally Posted by *fcman*
> 
> The problem was the 980 Ti; no one saw it coming, ESPECIALLY at $650. If AMD had released the Fury X even a day before the 980 Ti, the reviews would be glowing... but they didn't. NVIDIA threw down the gauntlet; I have a feeling they knew exactly what was going on at AMD and realized they had no reason not to show their cards.


Didn't we see it coming though? It was almost the exact thing they did with the 780 ti.


----------



## fcman

Quote:


> Originally Posted by *maltamonk*
> 
> Didn't we see it coming though? It was almost the exact thing they did with the 780 ti.


I guess it would be more accurate to say that no one saw it coming that early at that price.


----------



## Rei86

Quote:


> Originally Posted by *maltamonk*
> 
> Didn't we see it coming though? It was almost the exact thing they did with the 780 ti.


It was more like the situation of the OG Titan to the 780, not the Titan Black to the 780 Ti:

GTX Titan 2-19-13
GTX 780 3-23-13

GTX 780Ti 10-7-13
GTX Titan Black 2-18-14


----------



## hamzta09

Not sure if this was posted, but AMD has said the memory clocks are hard-locked in hardware.

http://www.nordichardware.se/Grafik/amd-qminnesfrekvenserna-pa-radeon-fury-x-aer-lasta-i-hardvaraq.html

So any "OC" done in software doesn't actually apply to the memory.


----------



## Olivon

Quote:


> Originally Posted by *fcman*
> 
> The problem was the 980 Ti; no one saw it coming, ESPECIALLY at $650. If AMD had released the Fury X even a day before the 980 Ti, the reviews would be glowing... but they didn't. NVIDIA threw down the gauntlet; I have a feeling they knew exactly what was going on at AMD and realized they had no reason not to show their cards.


There's little doubt that nVidia knows very well what's going on in AMD's house.
AMD/ATI and nVidia are really old opponents. They even made deals together on *price fixing*.
A good newser like *Chris L.*, very reliable, said at the end of March that Fiji was 20% better than the GTX 980.
So I guess that nVidia has even better sources, and possibly even had hardware in hand before the official product launch.
And it's probably the same for AMD; they know each other really well IMO.


----------



## obababoy

Quote:


> Originally Posted by *Olivon*
> 
> It makes almost no doubt that nVidia know very well what's going on in AMD's house.
> AMD/ATI and nVidia are really old opponent. They even made deals together on *price fixing*.
> A good newser like *Chris L.*, very reliable, told that Fiji was 20% better than GTX980 end march.
> So I guess that nVidia got even better sources and even possibly materials in hands before the official product launch.
> And it's probably the same for AMD, they know each other really well IMO.


The movie Duplicity comes to mind...!

Anyways, have any breakthrough drivers come out for the Fury X? ...I'm still holding on to a dream that it will shine a bit brighter.


----------



## obababoy

Quote:


> Originally Posted by *KyadCK*
> 
> Why not? Intel has been doing exactly that on the CPU side of things since Pentium 3, and CPU Arch design is Programming: Hardmode Activated. Who (of those who have programming experience) hasn't looked back at their programs or scripts and thought "Why in the world did I do it this way? This new way is so much better and quicker!"
> 
> There are very few instances where a given design stays in play for as long in the computer world as GCN has/will. The drivers are huge. It is not surprising to me at all that they can still find things that can be done better.


I feel like Nvidia hits that plateau faster because they have the resources to do so; they start out at 95%. AMD cards age well because at launch they are only at 80% (totally made up), and after a year or two, as AMD learns from drivers and new techniques, the previous generations thrive.... Completely hypothetical, but I wonder if I am on to something.


----------



## mav451

Quote:


> Originally Posted by *hamzta09*
> 
> Not sure if posted but AMD has said the Memclocks are hardlocked in HW.
> 
> http://www.nordichardware.se/Grafik/amd-qminnesfrekvenserna-pa-radeon-fury-x-aer-lasta-i-hardvaraq.html
> 
> So any "OC" done in software doesnt apply in terms of mem.


Hang on, so what exactly is going on with those screenshots showing 600 MHz or higher?


----------



## hamzta09

Quote:


> Originally Posted by *mav451*
> 
> Hang on, so what exactly is going on with those screenshots showing 600Mhz or higher?


Dunno.

Are there any Mem-only overclock benches comparing with stock?


----------



## infranoia

Interesting note from Anandtech:
Quote:


> No scenario we've tried that breaks the R9 Fury X leaves it or the GTX 980 Ti running a game at 30fps or better, typically because in order to break the R9 Fury X we have to run with MSAA, which is itself a performance killer.


So they could hit the 4GB buffer limit on the Fury X, but in doing so they crippled the competition as well, and they could only hit the limit at otherwise unplayable framerates.

At least, for today. Still other caveats there to consider, cracking good read.


----------



## looniam

Quote:


> Originally Posted by *infranoia*
> 
> Interesting note from Anandtech:
> Quote:
> 
> 
> 
> No scenario we've tried that breaks the R9 Fury X leaves it or the GTX 980 Ti running a game at 30fps or better, typically because in order to break the R9 Fury X we have to run with MSAA, which is itself a performance killer.
> 
> 
> 
> So they could hit the 4GB buffer limit with Fury X, but in doing so they crippled the competition as well, and they could only hit the limit at otherwise unplayable rates.
> 
> At least, for today. Still other caveats there to consider, cracking good read.

i don't believe i got that far yet but . . .
SoM benches
Quote:


> Unfortunately for AMD, the minimum framerate situation isn't quite as good as the averages. These framerates aren't bad - the R9 Fury X is always over 30fps - but even accounting for the higher variability of minimum framerates, they're trailing the GTX 980 Ti by 13-15% with Ultra quality settings. Interestingly at 4K with Very High quality settings the minimum framerate gap is just 3%, in which case what we are most likely seeing is the impact of running Ultra settings with only 4GB of VRAM. The 4GB cards don't get punished too much for it, but for R9 Fury X and its 4GB of HBM, it is beginning to crack under the pressure of what is admittedly one of our more VRAM-demanding games.


----------



## Thoth420

Saturday I find out if my XFX version from release day has a radiator that sings the song of its people... its people being the tribe of fail. If so... RMA. I have an R9 390 as a backup, but my aesthetics were all built around the Fury X.


----------



## provost

Ok, I will be honest about the reasons why I purchased the Fury X over any Maxwell card.

Other than being jaded by Nvidia's lack of driver optimizations for GK110 cards







, the more I looked at the hardware specs of Maxwell (you know, the meat and potatoes, the nuts and bolts... Lol), the more I felt uncomfortable with having a card inferior to my GK110s strictly from a hardware perspective (fewer of everything: ROPs, TMUs, cores, etc., including compute), and less trust in these "vaporware software-based performance enhancements" that don't correlate to the hardware on the PCB when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware SKU, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy", in my gut....


----------



## criminal

Quote:


> Originally Posted by *provost*
> 
> Ok, I will be honest about the reasons why I purchased the Furyx over any Maxwell card.
> 
> Other than being jaded by Nvidia's lack of driver optimizations for GK110 cards
> 
> 
> 
> 
> 
> 
> 
> , the more I looked at the hardware specs of Maxwell (you know, the meat and potatoes, the nuts and bolts... Lol) , the more I felt uncomfortable with having an inferior card to my GK110s strictly from a hardware perspective (fewer of everything/performance: rops, TMUs, cores , etc. including compute), and less trust in this "vaporware software based performance enhancements" that doesn't correlate to the hardware on the pcb when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware sku, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But, for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy" , in my gut....


You get your Fury X yet?


----------



## Thoth420

Quote:


> Originally Posted by *provost*
> 
> Ok, I will be honest about the reasons why I purchased the Furyx over any Maxwell card.
> 
> Other than being jaded by Nvidia's lack of driver optimizations for GK110 cards
> 
> 
> 
> 
> 
> 
> 
> , the more I looked at the hardware specs of Maxwell (you know, the meat and potatoes, the nuts and bolts... Lol) , the more I felt uncomfortable with having an inferior card to my GK110s strictly from a hardware perspective (fewer of everything/performance: rops, TMUs, cores , etc. including compute), and less trust in this "vaporware software based performance enhancements" that doesn't correlate to the hardware on the pcb when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware sku, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But, for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy" , in my gut....


I also switched because of Nvidia's failure drivers. Two whole bad branches, and at really bad times... big titles ruined. Not worth the premium for more drivers and earlier releases if they are going to be broken pseudo-beta drivers. It is not a conspiracy theory at all, either: someone show me the last beta driver release from Nvidia on their site... pretty sure when I started with my 780, and even when I got my 780 Ti, it was Beta > Beta > WHQL > Beta > Beta > WHQL, rinse and repeat for the most part. Now it's WHQL (game release) > WHQL (game release) > WHQL (game release), and half of them shouldn't have been certified at all...

I don't expect AMD to be magically better, but with Nvidia part of the extra you pay for is the drivers (at least it was for me)... long dead. As just a gamer who can live with a 10 fps deficit in exchange for stability... AMD seems smarter, since it's an upgrade every year either way...


----------



## friend'scatdied

Quote:


> Originally Posted by *provost*
> 
> the more I felt uncomfortable with having an inferior card to my GK110s strictly from a hardware perspective (fewer of everything/performance: rops, TMUs, cores , etc. including compute), and less trust in this "vaporware software based performance enhancements" that doesn't correlate to the hardware on the pcb when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware sku, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But, for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy" , in my gut....


Uhh what? The optimizations in Maxwell are covered in great detail.

Of course depending on your level of understanding of the improvements, "any sufficiently advanced technology is indistinguishable from magic."

I just think it's disrespectful to the engineers to discredit their advancements in technology.


----------



## provost

Quote:


> Originally Posted by *friend'scatdied*
> 
> Uhh what? The optimizations in Maxwell are covered in great detail.
> 
> Of course depending on your level of understanding of the improvements, "any sufficiently advanced technology is indistinguishable from magic."
> 
> I just think it's disrespectful to the engineers to discredit their advancements in technology.


Yeah, and I don't understand or trust those improvements to be enduring; or rather, the performance depends more on variables under Nvidia's control than on my hardware... as I said, part of the reason is my lack of trust in Nvidia. Again, I am sharing my opinion about my purchase decision and that's all. No one has to agree with it.

Don't know what "respect to engineers" has to do with this when I am spending my money. Nvidia's HR can take care of internal respect issues on its own


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> Dunno.
> 
> Are there any Mem-only overclock benches comparing with stock?


Short answer: Yes.

Long answer: Not exactly, because OCing the memory doesn't affect a lot of benchmarks; the memory is already fast. What is obvious, though, is that people OCing the memory run into stability issues if they go too high, and have to find their stable plateau.

If they were just changing a number that doesn't change performance, we'd have a screenshot of HBM being clocked to over 9000, coupled with a gif of Vegeta wearing an Nvidia logo crushing his scouter.

Edit: That is to say, people who have run straight VRAM speed benchmarks have seen their scores go up significantly. Video game benchmarks? Hardly noticeable, and within the margin of error.


----------



## Asmodian

Quote:


> Originally Posted by *provost*
> 
> Yeah, and I don't understand or trust those improvements to be enduring, or better yet the performance is too dependent on variables that are under Nvidia's control than on my hardware... as I said, part of the reason is my lack of trust in Nvidia. Again, I am sharing my opinion about my purchase decision and that's all. no one has to agree with it.
> 
> Don't know what "respect to engineers" has to do with this, when I am spending my money. Nvidia's HR can take care of respect internal issues on its own


You do not trust the improvements in Maxwell over Kepler to be enduring? That sounds like you are making up a reason to justify a Fury X to yourself. The Fury X does not need to be justified; just be happy with a great GPU and don't invent reasons against Maxwell to tell yourself.

Maxwell has more CUDA cores than Kepler, more ROPs (double), more cache, and more transistors. In fact, everything is higher in Maxwell except double-precision floating point (which you do not want, and which the Fury X also dropped) and the number of texture units.

I can see the argument that the Fury X's design is better for longevity (massive compute with a weaker back end), as games will probably grow in shader requirements faster than in resolution or geometry requirements, but the improvements in Maxwell over Kepler will be at least as useful in the future as they are now, probably more so.


----------



## Thoth420

Quote:


> Originally Posted by *provost*
> 
> Yeah, and I don't understand or trust those improvements to be enduring, or better yet the performance is too dependent on variables that are under Nvidia's control than on my hardware... as I said, part of the reason is my lack of trust in Nvidia. Again, I am sharing my opinion about my purchase decision and that's all. no one has to agree with it.
> 
> Don't know what "respect to engineers" has to do with this, when I am spending my money. Nvidia's HR can take care of respect internal issues on its own


This is my sentiments exactly...but I trust almost nothing.


----------



## friend'scatdied

Quote:


> Originally Posted by *provost*
> 
> Don't know what "respect to engineers" has to do with this, when I am spending my money. Nvidia's HR can take care of respect internal issues on its own


I mean in general. It strikes me as about as odd and backwards (from a tech enthusiast perspective) as disrespecting Conroe would be (though not as severe).
Quote:


> Originally Posted by *Asmodian*
> 
> Maxwell has more CUDA cores than Kepler, more ROPs (double), more cache, and more transistors. In fact everything is higher in Maxwell except double precision floating point (which you do not want and the Fury X also dropped) and the number of texture units.


I think (hope) he meant GM204 vs. GK110, where the SMMs, scheduling, and texture compression optimizations (among other things like significantly increased clock potential) contribute greatly.


----------



## Themisseble

http://wccftech.com/amd-fury-x-pump-silent-solves-noise-issue/

AMD fixing noisy pump


----------



## Thoth420

Quote:


> Originally Posted by *Themisseble*
> 
> http://wccftech.com/amd-fury-x-pump-silent-solves-noise-issue/
> 
> AMD fixing noisy pump


Good to know... it'd be nicer if they hadn't let it get into the wild and end up in a stack of boxes in my room, though.


----------



## Ganf

Quote:


> Originally Posted by *Themisseble*
> 
> http://wccftech.com/amd-fury-x-pump-silent-solves-noise-issue/
> 
> AMD fixing noisy pump


Fixed, people are already getting the revised pumps for cards they RMA'd. Retailers/AMD just sold the faulty ones anyways.


----------



## provost

Quote:


> Originally Posted by *friend'scatdied*
> 
> I mean in general. It strikes me as about as odd and backwards (from a tech enthusiast perspective) as disrespecting Conroe would be (though not as severe).
> 
> I think (hope) he meant GM204 vs. GK110, where the SMMs, scheduling, and texture compression optimizations (among other things like significantly increased clock potential)
> contribute greatly.


Maybe backwards, but time will tell, I guess. I will stick with my decision and evaluate it with indifference in a year's time. We all make judgement calls, and this one has fairly low risk for me, given the cost, so why not play it out and see if my gut feel is right or wrong... Lol

Yes, I don't know about Kepler 1 this or that, since I only have one 690 now, which is sitting in a box. But I am comparing Kepler 1.5 (GK110), since I have 5 of those cards, with Kepler 2 (Maxwell).... Lol
Yeah, I know I am making up some of these architectural naming conventions, but everything on 28nm is old tech to me at this point.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Ganf*
> 
> Fixed, people are already getting the revised pumps for cards they RMA'd. Retailers/AMD just sold the faulty ones anyways.


Really? Can you point me to all the people who have received the revised pumps? So far the only one who has turned up is one guy on YouTube, and it seems every single other website is using that one guy as their source (the picture they use to show the new pump is that guy's thumbnail for the video).


----------



## thekasafist

On what title? On Titanfall they can be in the low 30s. On Batman: Arkham City I get 38 for minimums; my averages are usually 50s-70s. It really comes down to how well the game is developed; honestly, lousy coding from rushed games performs poorly regardless of how much hardware you throw at them. On Batman: Arkham Origins, for example, I got minimums in the 70s FPS at 1440p!







It's crazy what proper coding can do! Honestly, having nearly double the FPS for minimums on a later title really threw me off. It just shows how rushed some games are and how lousily they're coded. Needless to say, even with Watch Dogs, for example, I still have very playable FPS, maybe around the same as Titanfall (which doesn't say much for Titanfall, by the way). On COD: Ghosts (I don't have COD: AW) I get very, very good FPS online as well. Honestly, it's all about the coding; if it's lousy, the game runs like crap. Obviously VRAM also plays a huge role as to whether you have enough or not, but in my case my 4GB of VRAM is not a limitation. Just wanted to share and let you know those old 670s work like a champ. I do not recommend, however, using a single video card prior to the 900 series or R9 200 series for 1440p. I hope this information helps. If you have any other questions, please don't hesitate to ask.


----------



## obababoy

Quote:


> Originally Posted by *Thoth420*
> 
> I also switched because of Nvidias failure drivers. Two whole bad branches and at real bad times...big titles ruined. Not worth the premium for more drivers and earlier releases if they are going to be broken psuedo beta drivers. It is not conspiracy theory at all either: someone show me the last beta driver release from nvidia on their site...pretty sure when I started with my 780 and even when I got my 780Ti it was Beta > Beta > WHQL > Beta > Beta > WHQL rinse repeat for the most part. Now: WHQL(game release) > WHQL(game release) >WHQL(game release) and half of them shouldn't have been certified at all...
> 
> I don't expect AMD to be magically better but with Nvidia part of the extra you pay for is the drivers at least it was for me...long dead. As just a gamer who can live with a 10 fps differential over stability...AMD seems smarter since it's upgrade every year either way...


Not trying to bash your post, but you do know WHQL basically means the driver passed Microsoft's Windows Hardware Quality Labs certification? Beta drivers can be just as stable... or just as unstable.


----------



## provost

Quote:


> Originally Posted by *Thoth420*
> 
> I also switched because of Nvidias failure drivers. Two whole bad branches and at real bad times...big titles ruined. Not worth the premium for more drivers and earlier releases if they are going to be broken psuedo beta drivers. It is not conspiracy theory at all either: someone show me the last beta driver release from nvidia on their site...pretty sure when I started with my 780 and even when I got my 780Ti it was Beta > Beta > WHQL > Beta > Beta > WHQL rinse repeat for the most part. Now: WHQL(game release) > WHQL(game release) >WHQL(game release) and half of them shouldn't have been certified at all...
> 
> I don't expect AMD to be magically better but with Nvidia part of the extra you pay for is the drivers at least it was for me...long dead. As just a gamer who can live with a 10 fps differential over stability...AMD seems smarter since it's upgrade every year either way...


And, then there is that...








The latest hotfix seems to have fixed my crashes; I haven't gamed much, but so far no crashes.

Not sure what strings Scone pulled at Nvidia, but they got the hotfix out, which is a good thing. Lol

But I agree with you: the premium for Nvidia cards is, for the most part, for their "superior driver support". That has been lacking, at least for me and a number of others who have been experiencing crashes and poor SLI scaling in new games.

So, yeah, it all adds up to one's purchase decision to stick with the same, or try a different path.


----------



## sugalumps

Quote:


> Originally Posted by *provost*
> 
> And, then there is that...
> 
> 
> 
> 
> 
> 
> 
> 
> The latest hot fix seemed to have fixed my crashes, but I haven't gamed much, but so far no crashes.
> 
> Not sure what strings Scone pulled at Nvidia , but they got the hot fix out which is a good thing. Lol
> 
> But, I agree with you, the premium for Nvidia cards is for the most part for their "superior driver support". But, that has been lacking, at least for me and and a number of others who have been experiencing crashes and poor sli scaling in new games.
> 
> So, yeah, it all adds up to one's purchase decision to stick with the same, or try a different path .


There is no premium this time; the Ti and the Fury X are the same price. Not only is there no premium this time, but for the first time in ages Nvidia actually has more VRAM while having the currently better-performing card.


----------



## Thoth420

Quote:


> Originally Posted by *obababoy*
> 
> Not trying to bash your post but you do know the WHQL basically means that windows update has approved the driver. Beta drivers are just as stable..or not stable


I do, but the trend shifted away from using feedback from end users with beta releases. Since then, I would say about 3 out of 4 "game release drivers" are buggy pieces of junk that never should have gotten out of the beta phase. A beta should generally be stable with a few bugs here and there; it's like an Easter egg hunt for the users. A signed driver should not BSOD your system while you're using Chrome or while idling, no matter who published it. I have dumped tons of money into both the AMD and Nvidia camps over the past years... neither is perfect, but the games I like tend to run better on AMD.


----------



## provost

Quote:


> Originally Posted by *sugalumps*
> 
> There is no premium this time, the ti and the fury x are the same price. Not only is there no premium this time, but for the first time in ages nvidia actualy has more vram while having the currently better performing card.


I don't disagree with price parity, but I guess one can look at the "premium" in terms of what one gets or doesn't get for the same price.
For me, having HBM is something novel and different, and the AIO is a nice touch for the same price. Again, just my opinion.


----------



## sugalumps

Quote:


> Originally Posted by *provost*
> 
> I don't disagree with price parity, but I guess one can look at the "premium" in terms of what one gets or doesn't get for the same price.
> For me, having HBM is something novel and different, and AIO is a nice touch for the same price. Again, just my opinion.


There is a problem with the AIO IMO, or at least I think there is in my situation. A big air cooler (Noctua NH-D15), even in a full tower, may not fit because of how thick the AIO is. If it does fit, which I doubt, then you are blowing hot air from the CPU directly into the AIO. Or you could mount it in front, but then you are bringing hot air into your case, meaning the big air cooler is going to be sucking warm air in. It's a lose-lose with an AIO on the GPU if you're using a big air cooler.


----------



## provost

Quote:


> Originally Posted by *sugalumps*
> 
> There is a problem with the AIO imo, or well I think there is in my situation. Big air cooler(noctua nh-d15) even in a full tower, becuase of how thick the aio is may not fit. If it does fit which I doubt then you are blowing the hot directly in the AIO from the cpu. Or you could mount it in front, then you are bringing hot air into your case meaning the big air cooler is going to be sucking the warm air in. It's a loose loose with an aio on the gpu using a big air cooler.


Hmm... I am planning to do a mini-ITX build with this... gonna have to keep an eye on other mini builds that may get there before me (and anyone can get there before me, given my rig-building skills







) to see what's up with this issue


----------



## provost

Here is a review by a real-life user on another forum:

http://forums.overclockers.co.uk/showthread.php?t=18679729

The reason I found this interesting is that it's from a user whom I always viewed (rightly or wrongly







) as a huge Nvidia fan.... lol


----------



## Blackops_2

Quote:


> Originally Posted by *provost*
> 
> Here is a review by a real life user from another forum
> 
> http://forums.overclockers.co.uk/showthread.php?t=18679729
> 
> the reason why I found this interesting was because it's from a user whom I always viewed (rightly or wrongly.
> 
> 
> 
> 
> 
> 
> 
> ) as a huge Nvidia fan.... lol


http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/2060#post_24121425

He literally just posted it in the owner's thread FWIW. I thought it was a good review.


----------



## provost

Quote:


> Originally Posted by *Blackops_2*
> 
> http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/2060#post_24121425
> 
> He literally just posted it in the owner's thread FWIW. I thought it was a good review.


Oh, OK, I haven't been checking our own owners' thread here, as I have not yet received my card, and very few people in the U.S. seem to have them compared to the UK. Or maybe it's because mine won't be delivered until the 15th-20th, and I assumed that the others are in the same boat as me.


----------



## Serandur

Quote:


> Originally Posted by *provost*
> 
> Ok, I will be honest about the reasons why I purchased the Furyx over any Maxwell card.
> 
> Other than being jaded by Nvidia's lack of driver optimizations for GK110 cards
> 
> 
> 
> 
> 
> 
> 
> , the more I looked at the hardware specs of Maxwell (you know, the meat and potatoes, the nuts and bolts... Lol) , the more I felt uncomfortable with having an inferior card to my GK110s strictly from a hardware perspective (fewer of everything/performance: rops, TMUs, cores , etc. including compute), and less trust in this "vaporware software based performance enhancements" that doesn't correlate to the hardware on the pcb when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware sku, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But, for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy" , in my gut....


Just a little explanation of Maxwell's perceived lacking specs yet high performance, and how it really relates to GCN/Fiji and Kepler (*warning: long; TLDR in bold at the bottom, which is also kinda long







)*.

GM200 does not have fewer ROPs than GK110; it has 96 vs 48 (double), plus GM200 clocks way higher than GK110, so the ROPs, TMUs, shaders, etc. all get a large percentage boost from clock speed alone. For example, pretend Kepler and Maxwell shader cores, ROPs, and TMUs are identical: 3072 of those hypothetical Kepler shaders/TMUs/ROPs at 1000 MHz would be theoretically equal to 2048 Maxwell ones at 1500 MHz. "Compute" performance (in TFLOPS) is proportional to raw shader count times clock speed, though the real deal is defined by far more than that theoretical number.

Disregarding architectural enhancements for now, Maxwell hits 1500-1550 MHz with the same ease GK110 got to 1150-1200 MHz. That's ~30% alone, on top of twice the ROPs, not fewer (96 on GM200 vs 48 on GK110). That's roughly 2.5 times more peak pixel fillrate just looking at it on paper, and easily the highest of any GPU on the market (Fiji has 64 ROPs at a lower clock speed).
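To put rough numbers on that fillrate claim, here's a back-of-the-envelope sketch in Python. The ROP counts are the ones quoted above; the clock speeds are this post's rough typical-OC figures (and Fiji's 1050 MHz core clock), not official boost specs:

```python
# Peak pixel fillrate = ROPs x core clock.
# Clocks are the rough OC figures discussed above, not official specs.
def pixel_fillrate_gpix(rops: int, clock_mhz: float) -> float:
    """Theoretical peak fillrate in Gpixels/s."""
    return rops * clock_mhz / 1000.0

gm200 = pixel_fillrate_gpix(96, 1500)  # GM200 (980 Ti) at a ~1500 MHz OC
gk110 = pixel_fillrate_gpix(48, 1150)  # GK110 (780 Ti) at a ~1150 MHz OC
fiji = pixel_fillrate_gpix(64, 1050)   # Fiji (Fury X) at its stock clock

print(gm200, gk110, fiji)        # 144.0 55.2 67.2
print(round(gm200 / gk110, 2))   # 2.61 -> the "roughly 2.5x" above
```

Nothing here is a real-world benchmark, obviously; it's just the same paper math as the paragraph above, written out.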

At stock, even before GM200's massive current OC advantage versus Fiji:










The smaller number of TMUs isn't an issue, partially for the same reason (clock speeds) and partially for another in the case of GCN. GCN TMUs (including Fiji's) also run at only half their peak rate (256 Gtexels/s at 1000 MHz) with fp16 textures, whereas Maxwell retains its peak rate. That puts GM200 behind Fiji in peak TMU performance with int8 textures (theoretically, and not by much at both cards' typical OC speeds), but significantly ahead with more complex fp16 textures despite having fewer TMUs. Maxwell's actual texel/s rate is even higher than the 780 Ti's.










Shaders also benefit directly from raw clock speed, as does peak single-precision compute performance. GM200's reported peak "compute" figure is measured at the base clock only, whereas GM200 actually boosts much higher. In contrast, GCN's stated base speed, at which "compute" is calculated, is its max.

The 980 Ti at 1500 MHz is nearly a 9 TFLOPS peak GPU in single precision, whereas Fiji at 1150 MHz is nearly 10. The difference in theoretical peak rate at both cards' rough OC capabilities isn't that big, and actual, effective compute performance varies significantly at stock, with Fury winning some things (ray tracing), the 980 Ti winning others (particle simulation), and a lot of close calls/ties in between. GM200 crushes GK110 in actual compute performance regardless.
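Those single-precision figures check out with the same paper math (peak FP32 = 2 FLOPs per shader per clock, since each shader retires one FMA per cycle; the shader counts are public, the OC clocks are this post's rough figures):

```python
# Peak FP32 throughput: each shader retires one FMA (2 FLOPs) per clock.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak single-precision TFLOPS."""
    return 2 * shaders * clock_mhz / 1e6

print(round(fp32_tflops(2816, 1500), 2))  # 980 Ti @ ~1500 MHz OC -> 8.45 ("nearly 9")
print(round(fp32_tflops(4096, 1150), 2))  # Fury X @ ~1150 MHz    -> 9.42 ("nearly 10")
print(round(fp32_tflops(2880, 1150), 2))  # 780 Ti @ ~1150 MHz OC -> 6.62
```

Again, these are theoretical peaks; the effective numbers depend on everything else in this post.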

Ray tracing:










Particle simulation:










Then GM200 has twice the L2 cache (basically on-die memory) of GK110 and 50% more than Fiji, which helps a lot; it's got delta color compression, and it's got improvements to the shader core design, such as reduced arithmetic latencies and improved scheduling, that let Maxwell get closer to its peak performance rates than Kepler could.

*TLDR: The specs people compare between Fiji, GK110, and GM200 are directly tied to clock speeds and are also quite superficial as there are many other specifications and details of specifications your typical spec sheet doesn't show you like cache.

Maxwell's single-precision compute/shader performance (even the theoretical TFLOPS figure), TMU performance, and ROP performance are way higher than people think because of clock speed, discrepencies between reported theoretical figures and their attainability in reality, and discrepencies in how Nvidia/AMD report their numbers. Maxwell doesn't lack in raw theoretical hardware power at all, Nvidia just undersell it oddly enough and people generalize the theoretical peak rates without considering clock speed differences and architectural specifics (like GCN TMUs' half-rate with FP16) to reality too much.

Maxwell's performance doesn't come from black magic, it comes from real, quantifiable, and measurable hardware improvements which have been thoroughly documented.*
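The same back-of-the-envelope arithmetic works for the ROP comparison above: peak pixel fillrate is just ROP count times clock. A sketch (ROP counts are from the spec sheets; the clocks are the rough sustained/OC speeds mentioned above, so treat them as illustrative):

```python
# Peak pixel fillrate = ROPs x clock, in Gpixels/s.
def pixel_fillrate_gpix(rops: int, clock_mhz: float) -> float:
    return rops * clock_mhz / 1000

print(f"GK110 (780 Ti, 48 ROPs @ 1150 MHz): {pixel_fillrate_gpix(48, 1150):.1f}")  # 55.2
print(f"GM200 (980 Ti, 96 ROPs @ 1500 MHz): {pixel_fillrate_gpix(96, 1500):.1f}")  # 144.0
print(f"Fiji (Fury X, 64 ROPs @ 1050 MHz):  {pixel_fillrate_gpix(64, 1050):.1f}")  # 67.2
```

144 / 55.2 is about 2.6, which is where the "roughly 2.5 times more peak pixel fillrate" figure comes from.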


----------



## Blackops_2

Quote:


> Originally Posted by *provost*
> 
> oh, ok, I haven't been checking our own owners thread here, as I have not yet received my card, and very few people in the U.S. seem to have them compared to the UK. or may be because mine won't be delivered until 15th -20th, and I assumed that the others are in the same boat as me


Didn't realize purgatory was in the US lol. I kid. Let us know your findings


----------



## provost

Quote:


> Originally Posted by *Blackops_2*
> 
> Didn't realize purgatory was in the US lol. I kid. Let us know your findings


Since when has it not.









but, I am not good at sharing findings and all that good stuff. If I am happy with a card, then I like, if not, then I don't... ... lol


----------



## Blackops_2

Quote:


> Originally Posted by *provost*


> Since when has it not.
> 
> but, I am not good at sharing findings and all that good stuff. If I am happy with a card, then I like it; if not, then I don't... lol

At least OC the sucker







don't even have to take a screenshot just let me know if you can hit 1200, once voltage control is available.


----------



## provost

Quote:


> Originally Posted by *Blackops_2*
> 
> At least OC the sucker
> 
> 
> 
> 
> 
> 
> 
> don't even have to take a screenshot just let me know if you can hit 1200, once voltage control is available.


yeah, yeah, will give it a thought or two...


----------



## alawadhi3000

Quote:


> Originally Posted by *Themisseble*
> 
> You are wrong about that.
> Fury X is going to win in the long run. Just like the R9 290X destroyed the GTX 780 Ti... and now the R9 290X (new drivers + better cooling = R9 390X) is destroying the GTX 980. After two years AMD will still be improving their drivers for the GCN core, and it will destroy the GTX 980 Ti... maybe (both cards have 5.6 TFLOPS).
> 
> Look, most people don't get AMD, especially NVIDIA fanboys. AMD is doing its best for customers. GCN has been here for 4 years and is still battling the new Maxwell core. AMD's midrange cards are better priced than NVIDIA's. The 7970 GHz is now battling the GTX 780 in new games... why would I want a new GCN core? Why? To spend more of my money? I think I will let AMD optimize it.


R9 290X destroying GTX780Ti.






















R9 390X destroying GTX980.























Modern comparison @ TPU.
NVIDIA 353.06 vs AMD 15.5 Beta

1080p:
GTX980>R9 390X>GTX970=GTX780Ti>R9 390>R9 290X>R9 290.

1440p:
GTX980>R9 390X>R9 390=GTX780 Ti=GTX970=R9 290X>R9 290.


----------



## Ha-Nocri

In recent games (Batman and The Witcher 3), the 780 Ti seems slower than the 290. I don't think NV will optimize their drivers for Kepler cards as they did before.




And these two games are NV-optimized. Expect an even bigger difference in other games that will be released.


----------



## alawadhi3000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> In recent games (Batman and witcher 3) 780ti seems slower than 290. I don't think NV will optimize their drivers for Kepler cards as they did before.
> 
> 
> 
> 
> And these 2 games are NV optimized. Expect even bigger difference in other games that will be released.


You are right about Batman, but Witcher 3 was on medium settings.


----------



## Ha-Nocri

Quote:


> Originally Posted by *alawadhi3000*
> 
> You are right about Batman, but Witcher 3 was on medium settings.


Because on Ultra they turned NV features on. Not rly fair to compare with that.


----------



## Themisseble

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Because on Ultra they turned NV features on. Not rly fair to compare with that.


AMD fixed that with the new Catalyst.
The R9 290 beats the GTX 780 Ti and the R9 280X beats the GTX 780.


----------



## provost

Quote:


> Originally Posted by *Themisseble*
> 
> AMD fixed that with new cat.
> R9 290 beats GTX 780Ti and R9 280X beats GTX 780.


Yep, and I will be watching to see how long the 980 Ti and Titan X last against AMD's architecture, especially as soon as the low-level Pascal card drops.
My purchase of the Fury X is based on my conviction that Nvidia has chosen to control more and more performance through drivers and software rather than hardware, so that it can shorten the upgrade cycle of its install base by limiting optimizations, and I am very interested to see how my theory pans out over time.

It's fun to take a position and speculate, or else what is there to do on the good ole 28nm.


----------



## Themisseble

Quote:


> Originally Posted by *provost*
> 
> Yep, and I will be watching to see how long the 980ti and Titanx lasts against AMD's architecture, , and especially as soon as the low level Pascal card drops.
> Since my purchase of Furyx is based on my conviction that Nvidia has chosen to control more and more performance through drivers and software , rather than hardware, so that it can shorten the upgrade cycle of its install base by limiting optimizations, I am very interested to see how my theory pans out over time.
> 
> it's fun to take a position and speculate, or else what is there to do on the good ole 28nm.


Eh, Fury is still a very good card. If the Fury X had OC potential (1300 MHz to 1400 MHz) then it would be even better than the GTX 980 Ti. First taste of HBM, maybe you will try VR, and maybe it will be even faster in DX12 games.
Yes, GCN cannot OC well, so even with unlocked voltage it may not OC well, but maybe it will. You may still hit 1350 MHz...
Personally I think that AMD will improve performance through drivers.


----------



## provost

Quote:


> Originally Posted by *Themisseble*
> 
> Eh, fury is still very good card. If Fury X had OC potential ( 1300Mhz or to 1400 Mhz) then it would be even better than GTX 980Ti. First taste of HBM, maybe you will try VR and maybe it will be even faster in DX12 games.
> Yes GCN cannot OC well, so even with unlocking V it may not OC good, maybe it will. You still may hit 1350MHz...
> Personally I think that AMD will improve performance trough drivers.


I am not fussed by current benchmark differences of a few fps here or there, as my bet is that Fiji will outlast Maxwell in performance just as Hawaii has outlasted GK110. And it wouldn't be because Maxwell could not be further optimized (I don't like the Maxwell cards to begin with, so I don't know for sure), but because Maxwell would not be optimized, to maintain Nvidia's SKU management strategy. Again, it's just a theory, so time will tell... lol


----------



## gamervivek

HBM 'overclocking' is now breaking 20k in firestrike.



Mantle review.

http://www.golem.de/news/grafikkarte-auch-fury-x-rechnet-mit-der-mantle-schnittstelle-flotter-1507-115005.html


----------



## Silent Scone

Quote:


> Originally Posted by *gamervivek*
> 
> HBM 'overclocking' is now breaking 20k in firestrike.
> 
> 
> 
> Mantle review.
> 
> http://www.golem.de/news/grafikkarte-auch-fury-x-rechnet-mit-der-mantle-schnittstelle-flotter-1507-115005.html


That's only at 1130 core, isn't it? I score around 16,900 on my i5 at 1110 core and stock memory. So that's all being reaped from memory bandwidth?? Jesus Christ. It is a very sensitive synthetic test, especially at 1080p, but even so that's massively impressive.

Under one condition... that it's with tessellation on.


----------



## provost

Quote:


> Originally Posted by *Silent Scone*
> 
> That's only at 1130 core isn't it? I score around 16900 on my i5 at 1110 core and stock memory. So that's all being reaped from memory bandwidth?? Jesus Christ. It is a very sensitive synthetic test especially at 1080p, but even so that's massively impressive.
> 
> If under one circumstance...that it's with tessellation on


Then Futuremark will invalidate the benchmark.... lol
Although my die has already been cast, and my purchase decision has nothing to do with synthetic benchmarks, I certainly hope that AMD is not making up cheesy benchmarks through shady reviewers to justify this card.


----------



## hamzta09

UPDATE : We've confirmed with Robert Hallock, technical PR lead at AMD, that while the GPU-Z tool is reporting an increase in memory frequency in reality the frequency did not change. As HBM's frequency in the Radeon R9 Fury X is determined in hardware and cannot be changed through software.

Interestingly enough we've also heard that R9 Fury X over-volting might be coming sooner rather than later. In fact we've found out that some users have already managed to unlock voltage control on their R9 Fury X cards. And we'll be covering it in detail very soon in a forthcoming article.

http://wccftech.com/amd-radeon-r9-fury-memory-oveclocked-20/


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> UPDATE : We've confirmed with Robert Hallock, technical PR lead at AMD, that while the GPU-Z tool is reporting an increase in memory frequency in reality the frequency did not change. As HBM's frequency in the Radeon R9 Fury X is determined in hardware and cannot be changed through software.
> 
> Interestingly enough we've also heard that R9 Fury X over-volting might be coming sooner rather than later. In fact we've found out that some users have already managed to unlock voltage control on their R9 Fury X cards. And we'll be covering it in detail very soon in a forthcoming article.
> 
> http://wccftech.com/amd-radeon-r9-fury-memory-oveclocked-20/


How many times are you going to post that WCCF article despite all of the owners posting benchmark improvements from OCing the HBM and web reviewers testing it and confirming it?


----------



## sugarhell

Quote:


> Originally Posted by *Ganf*
> 
> How many times are you going to post that WCCF article despite all of the owners posting benchmark improvements from OCing the HBM and web reviewers testing it and confirming it?


Until he believes it


----------



## p4inkill3r

Quote:


> Originally Posted by *Ganf*
> 
> How many times are you going to post that WCCF article despite all of the owners posting benchmark improvements from OCing the HBM and web reviewers testing it and confirming it?


Until we all bow down in silent acquiescence of his superior knowledge.


----------



## hamzta09

Quote:


> Originally Posted by *Ganf*
> 
> How many times are you going to post that WCCF article despite all of the owners posting benchmark improvements from OCing the HBM and web reviewers testing it and confirming it?


It's the first time I linked it. Smartass.

What reviewers? I've not seen an official review overclock the HBM. And HBM alone.

It's also easy to Photoshop 3DMark results.

And why would AMD say the HBM is hard-locked in hardware when they also claim the Fury X is an overclocker's dream? Doesn't make any sense.

Quote:


> Originally Posted by *sugarhell*
> 
> Until he believes it


Quote:


> Originally Posted by *p4inkill3r*
> 
> Until we all bow down in silent acquiescence of his superior knowledge.


Did I just upset all the fanbots?


----------



## Tivan

Quote:


> Originally Posted by *hamzta09*
> 
> I've not seen an official review overclock the HBM. And HBM alone.


hardware.fr did one recently with only mem oc, only core oc, and combined oc.
http://www.hardware.fr/articles/937-26/overclocking-gpu-fiji.html

edit: though we don't truly know whether this is really a memory OC, or whether trying to OC the memory simply overclocks some other part that affects performance.

Regardless, I care about performance c:


----------



## hamzta09

Quote:


> Originally Posted by *Tivan*
> 
> hardware.fr did one recently with only mem oc, only core oc, and combined oc.
> http://www.hardware.fr/articles/937-26/overclocking-gpu-fiji.html
> 
> edit: though we don't truly know if this is really a mem oc or simply overclocking some part that affects performance, that gets affected when trying to OC the memory.
> 
> Regardless, I care about performance c:


+ ~1 frame in some titles? Could easily be due to conditions of a run.


----------



## Tivan

Quote:


> Originally Posted by *hamzta09*
> 
> + ~1 frame in some titles? Could easily be due to conditions of a run.


Sure, nobody will disagree with that, I hope. Regardless, there are cases where a performance gain beyond statistical uncertainty was achieved. So that's curious!

edit: It's just an 8% OC on the supposed memory, too. Up to 4% more performance for that in some cases, and that's at 1440p; with speculation that the impact would be more pronounced at 1080p (going by that Firestrike run), it strikes me as very intriguing.

I definitely also want more benchmarks!
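To put numbers on that: if the Fury X's HBM really is being overclocked, the scaling efficiency in the hardware.fr run can be sketched like this (bus width and base clock are the published HBM1 figures; the 4% gain is the best case in their results):

```python
# Fury X HBM1: 4096-bit bus at 500 MHz, double data rate.
base_bw = 4096 / 8 * 500e6 * 2 / 1e9   # bytes/s -> GB/s = 512.0
oc_pct = 0.08                          # the ~8% memory OC tested
oc_bw = base_bw * (1 + oc_pct)         # ~553 GB/s

perf_gain = 0.04                       # ~4% fps gain in the best cases
scaling = perf_gain / oc_pct           # 0.5 -> partially bandwidth-bound
print(base_bw, round(oc_bw, 1), scaling)
```

A scaling factor of 0.5 would suggest those titles are partially bandwidth-limited at 1440p, which is consistent with the bigger swings seen in the 1080p Firestrike run.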


----------



## hamzta09

Quote:


> Originally Posted by *Tivan*
> 
> Sure, nobody will disagree with that I hope. Regardless, there's cases where beyond statistical uncertainty, a performance gain was achieved. So that's curious!


I wonder why AMD keeps saying its locked in HW though?


----------



## Tivan

Quote:


> Originally Posted by *hamzta09*
> 
> I wonder why AMD keeps saying its locked in HW though?


Maybe to avoid warranty liability (vs. their AIB partners) and to dissuade people from it, since the memory controller is new and not well tested for degradation.

And you have to enable unofficial OC mode in MSI Afterburner after all, so whatever hardware mechanism would allow overclocking the memory in regular mode might be disabled. So they might be stretching their words a little there.

Of course it's still not 100% confirmed that it's really a memory OC, regardless (though it seems likely).


----------



## Ganf

Quote:


> Originally Posted by *hamzta09*
> 
> + ~1 frame in some titles? Could easily be due to conditions of a run.


That article has only been linked and laughed at at least 7 or 8 times in the last day. I could have sworn half of them came from you. If not, I apologize.

Of course. No one ever expected it to benefit games. It's a pretty big improvement for compute though. One of the OCN members posted OpenCL tests which showed 20% improvement for 20% OC in the memory tasks. I'm trying to find them again.


----------



## mav451

Quote:


> Originally Posted by *hamzta09*
> 
> UPDATE : We've confirmed with *Robert Hallock*, technical PR lead at AMD, that while the GPU-Z tool is reporting an increase in memory frequency in reality the frequency did not change. As HBM's frequency in the Radeon R9 Fury X is determined in hardware and cannot be changed through software.
> 
> Interestingly enough we've also heard that R9 Fury X over-volting might be coming sooner rather than later. In fact we've found out that some users have already managed to unlock voltage control on their R9 Fury X cards. And we'll be covering it in detail very soon in a forthcoming article.
> 
> http://wccftech.com/amd-radeon-r9-fury-memory-oveclocked-20/


Looks like Hallock is indeed the right guy to ask about Nano.
I know I know, that's the other thread, but now I'm definitely curious how Robert responds to that question.

That said, I have to think that many outlets are asking him the same thing I did


----------



## sugarhell

Oh, people are like iSheep. They can't understand PR damage control. AMD doesn't want mainstream people to overclock the HBM. Overclockers will find a way no matter what.


----------



## hamzta09

Quote:


> Originally Posted by *sugarhell*
> 
> Oh people are like isheeps. They cant understand PR damage control. Amd doesnt want mainstream people to overclock the HBM. Overclockers will find a way no matter what


Damage Control.

"Its an Overclockers dream!"
"YOU CANT OVERCLOCK HBM BECAUSE ITS LOCKED IN HARDWARE!"

Makes sense.


----------



## iSlayer

Quote:


> Originally Posted by *toncij*
> 
> As a programmer doing rendering too, you're spot-on, but I'd rather choose another word rather than "issue". It is a design "issue", a design perk let's say. AMD and NVidia have a different design regarding core grouping, units number, task scheduling perks, etc. For some workloads AMD had always had an upper hand, while NVidia did for other. Then a driver also plays its part, but in general with certain conditions AMD may win over. May... since in current gen games NVidia still wins. AMD started showing advantage with some games that can put their architecture at work...


It's best then to take a look at the professional segment and how AMD and Nvidia stack up.

It's very lopsided depending on the time put into the drivers. Those peak performance capabilities ultimately don't matter a whole lot between different architectures.

From software to software the performance varies widely. For those in IT picking out workstations, the benches are very important.
Quote:


> Originally Posted by *Themisseble*
> 
> You are wrong about that.
> Fury X is going to win in the long run. Just like the R9 290X destroyed the GTX 780 Ti... and now the R9 290X (new drivers + better cooling = R9 390X) is destroying the GTX 980. After two years AMD will still be improving their drivers for the GCN core, and it will destroy the GTX 980 Ti... maybe (both cards have 5.6 TFLOPS).
> 
> Look, most people don't get AMD, especially NVIDIA fanboys. AMD is doing its best for customers. GCN has been here for 4 years and is still battling the new Maxwell core. AMD's midrange cards are better priced than NVIDIA's. The 7970 GHz is now battling the GTX 780 in new games... why would I want a new GCN core? Why? To spend more of my money? I think I will let AMD optimize it.
> 
> http://www.techspot.com/review/917-far-cry-4-benchmarks/page3.html
> 
> Look at FC4 - great game. The 7970 GHz is battling the TITAN.
> 
> I care mostly about midrange GPUs - I could go with a GTX 660 or R9 270X - now the R9 270X is about 50% faster in FC4. I don't use MSAA; I prefer screen scaling or higher res with high-medium settings. How well does the GTX 660 do vs the 7870 at higher res? Those who went GTX 660 SLI over 7870 CF were fools. 7750 CF (2x 55W) would beat a GTX 660.


I don't buy for theoretically greater performance 3 years later, I buy for greater performance now. Two years from today it won't matter which performs better, because both will be seriously dropping quality settings to maintain playable frames. Buying based on future prospects and theoretical performance is a TERRIBLE idea.

And the Titan against a 7970 is probably a bad comparison, since Titans were voltage-unlocked and had low stock clocks, besides Kepler overclocking better than Tahiti.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> This driver argument is a bit silly imo.
> 
> I'll preface this by saying in hindsight, the 290x was probably the best buy for a GPU in recent memory (assuming you picked one up for a reasonable price).
> 
> Even though i disagree about the 290x "destroying" the 780Ti, i would say the 290x is the better card. But by the time the 290x caught up to the 780Ti, (and they're still within reach of each other in performance), does anyone even care?
> 
> It took what, over a year for the 290x to match/beat the 780Ti? At this stage, i think the owners of those cards have already gotten what they wanted out of them, and now everyone's looking at the 980Ti/FX. So in the end, i don't think the 290x "won" anything. It's now comparable to a card that had the same performance for over a year.
> 
> Not to mention that the same scenario might not even play out this time with the Fury and the Ti. Last time AMD had more vram, now the opposite is true. I don't think picking a Fury over a Ti based on future drivers is a wise choice.


This, so much this.

GK110 also OCs better than Hawaii, so that also puts a dent in the 290x being better in perf.

As far as value goes aftermarket 290xs haven't been bad. Even now for $250-280 they're a good deal.
Quote:


> Originally Posted by *infranoia*
> 
> Linear Crossfire scaling is going to play right into a very strong dual-Fiji card. I keep harping on about it when it sounds like most mooks around here have sworn off multi-GPUs, but that's only due to Nvidia's 80% market and sucky SLI scaling.
> 
> The 295x2 has been a crazy good performer far longer than it deserved, and I'm not sure anyone should write off the Fiji X2 until it rears its head. Even at 295x2 launch prices it would be a good investment, the final huge re-entry burn for 28nm to last us through the first couple crappy generations of a new process.


The 295X2 was a horrible deal at launch. You could pick up aftermarket 290Xs for $500-550; even Lightnings would leave you hundreds left over. And you could then WC them. Or go 780 Ti x2.

Even as recently as the 295X2 going for $660 it made no sense. Two 290Xs with decent aftermarket coolers won't throttle and you'd save $60. Or two 970s for the same price, and you could still overclock with plenty of headroom to spare. It was only when the 295X2 hit sub-$600, like at $550, that it became worthwhile.

Agreed though, CF does have a scaling edge from what we have seen.
Quote:


> Originally Posted by *Darkwizzie*
> 
> If Fury X beats 980ti at 4k, then 4k DSR/etc down to 1080p would mean Fury X still beats 980ti, right?


It should? DSR and VSR are different, I think, so we would need to see testing; though in VSR vs. DSR comparisons AMD has the edge, and I think VSR would maintain that with the Fury X.
Quote:


> Originally Posted by *maltamonk*
> 
> So you are alluding that they did that on purpose?


Whether or not it's purposeful, it doesn't speak well of AMD. That said, I doubt it's a conspiracy.


----------



## provost

Quote:


> Originally Posted by *Serandur*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Just a little explanation of Maxwell's perceived lacking specs yet high performance and how it really relates to GCN/Fiji and Kepler(*warning, long; TLDR in bold at bottom summarizing, which is also kinda long
> 
> 
> 
> 
> 
> 
> 
> )*.
> 
> GM200 does not have less ROPs than GK110; 96 vs 48 (double the ROPs) plus GM200 clocks way higher than GK110 hence the ROPs, TMUs, shaders, etc. all get a large percentage boost from those alone. For example, pretend Kepler and Maxwell shader cores, ROPs, and TMUs are identical. 3072 of those hypothetical Kepler shaders/TMUs/ROPs at 1000 MHz would be theoretically equal to 2048 Maxwell shaders/TMUs/ROPs at 1500 MHz. "Compute" performance (in TFLOPS) is proportional to raw shader count and clock speed, though the real deal is defined by far more than that theoretical number.
> 
> Disregarding architectural enhancements for now, Maxwell hits 1500-1550 MHz with the same ease GK110 got to 1150-1200 MHz. That's ~30% alone on top of twice the ROPs, not less (96 on GM200 vs 48 on GK110). That's roughly 2.5 times more peak pixel fillrate just looking at it on paper and easily the highest of any GPU on the market (Fiji having 64 ROPs at a lower clock speed).
> 
> At stock, even before GM200's massive current OC advantage versus Fiji:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The lesser amount of TMUs isn't an issue partially for the same reason (clock speeds) and partially for another in the case of GCN. GCN TMUs (including Fiji) also only have half their speed with fp16 textures instead of their peak (256 Gtexels/s at 1000 MHz) whereas Maxwell retains its peak rate, putting GM200 behind Fiji at peak TMU performance with int8 textures (theoretically, not by much either at both cards' current typical OC speeds), but significantly better with more complex fp16 textures despite the lesser amount of TMUs. Maxwell's actual Texel/s rate is even higher than the 780 Ti's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shaders also benefit directly from raw clockspeed as does peak single-precision compute performance. GM200's reported peak "compute" performance is only measured at the base clock whereas GM200 actually boosts much higher. In contrast, GCN's stated base speed at which "compute" is calculated is its max.
> 
> The 980 Ti at 1500 MHz is a nearly 9 TFLOPS peak GPU in single-precision whereas Fiji at 1150 MHz is nearly 10. The difference in theoretical peak rate at both cards' rough OC capabilities isn't that big and the actual, effective compute performance varies significantly at stock with Fury winning some things (ray tracing), the 980 Ti winning others (particle simulation), and a lot of close calls/ties in between. GM200 crushes GK110 in actual compute performance regardless.
> 
> Ray tracing:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Particle simulation:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Then GM200 has twice the L2 cache (basically on-die memory) of GK110 and 50% more than Fiji which helps a lot, it's got that delta color compression, and it's got improvements to the shader core design such as reduced arithmetic latencies and improved scheduling to let Maxwell get closer to its peak performance rates than Kepler.
> 
> *TLDR: The specs people compare between Fiji, GK110, and GM200 are directly tied to clock speeds and are also quite superficial as there are many other specifications and details of specifications your typical spec sheet doesn't show you like cache.
> 
> Maxwell's single-precision compute/shader performance (even the theoretical TFLOPS figure), TMU performance, and ROP performance are way higher than people think because of clock speed, discrepencies between reported theoretical figures and their attainability in reality, and discrepencies in how Nvidia/AMD report their numbers. Maxwell doesn't lack in raw theoretical hardware power at all, Nvidia just undersell it oddly enough and people generalize the theoretical peak rates without considering clock speed differences and architectural specifics (like GCN TMUs' half-rate with FP16) to reality too much.
> 
> Maxwell's performance doesn't come from black magic, it comes from real, quantifiable, and measurable hardware improvements which have been thoroughly documented.*


[/spoiler]

yeah, TLDR; in case you missed my context playa, I was calling bull on GM204 beating 780 Ti and Titan on the same node.


----------



## Liranan

Quote:


> Originally Posted by *hamzta09*
> 
> Damage Control.
> 
> "Its an Overclockers dream!"
> "YOU CANT OVERCLOCK HBM BECAUSE ITS LOCKED IN HARDWARE!"
> 
> Makes sense.


Wasn't that in reference to the Nano?


----------



## hamzta09

Quote:


> Originally Posted by *Liranan*
> 
> Wasn't that in reference to the Nano?


What was? Overclockers dream or cant OC HBM?


----------



## Serandur

Quote:


> Originally Posted by *provost*
> 
> [/spoiler]
> 
> yeah, TLDR; in case you missed my context playa, I was calling bull on GM204 beating 780 Ti and Titan on the same node.


I'm not sure what that has to do with choosing Fiji over any Maxwell product to replace GK110, as per your words, since GM204 isn't Fiji's Maxwell competitor or GK110's successor.

But the same reasons apply regardless. GM204 has more ROPs at a higher clock speed than GK110 (64 vs 48, ~30% higher frequency), more powerful and much higher-clocked shaders, comparable TMU performance, more cache, delta color compression techniques like AMD's, stronger actual FP32 compute performance, etc.

Nvidia saved a lot of die space vs Kepler simply by cutting out FP64 hardware and packing the transistors closer together. GM204 is not that much stronger, but it is more advanced in all kinds of areas, and synthetic tests corroborate that.


----------



## provost

Quote:


> Originally Posted by *Serandur*
> 
> I'm not sure what that has to do with choosing Fiji over any Maxwell product to replace GK110, as per your words, since GM204 isn't Fiji's Maxwell competitor or GK110's succesor.
> 
> But the same reasons apply regardless. GM204 has more ROPs at a higher clock speed than GK110 (64 vs 48, ~30% higher frequency), more powerful and much higher-clocked shaders, comparable TMU performance, more cache, delta color compression techniques like AMD, stronger actual FP32 compute performance, etc.
> 
> Nvidia saved a lot of die space vs Kepler simply by cutting out FP64 stuff and packing the transistors closer together. GM204's not that much stronger, but still more advanced in all kinds of areas and corroborating synthetic tests.


I am guessing you have been an AMD man until now and did not participate in the great GK110 hype... Well, good on you.
I am gonna pass on drinking the Nvidia Kool-Aid of "it's all about high frequency and texture fill rates" on the 970 vs. CUDA cores on the 780 Ti and Titan, and that's the only reason I need not to buy any Maxwell cards.


----------



## blue1512

Quote:


> Originally Posted by *hamzta09*
> 
> Its the first time I linked it. Smartass.
> 
> What reviewers? I've not seen an official review overclock the HBM. And HBM alone.
> 
> Its also easy to Photoshop 3dmark results.
> 
> And why would AMD say the HBM is hardlocked in hardware when they also claim the Fury X is an Overclockers dream? Doesnt make any sense.
> 
> Did I just upset all the fanbots?


Here it is for you. I posted this in the HBM unlock thread ages ago.
http://www.overclock.net/t/1562593/hardware-info-hbm-on-furyx-can-be-overclocked-after-all
http://www.hardware.fr/articles/937-26/overclocking-gpu-fiji.html


----------



## gamervivek

Hallock can say whatever he wants. I believe in Dave.








Quote:


> MCLK overclocking via afterburner is working, but apparently the granularity of the steps is a little coarser than the software tools will state, with the final value being rounded to the closest step.


https://forum.beyond3d.com/posts/1858255/
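If the Beyond3D note is right, the slider value and the applied MCLK can differ: the hardware snaps the request to the nearest supported step. A tiny sketch of that rounding (the 25 MHz step size here is purely hypothetical, just to illustrate the behavior):

```python
def actual_mclk(requested_mhz: float, step_mhz: float = 25.0) -> float:
    # The applied memory clock is the requested value rounded to the
    # nearest supported step; step_mhz is a made-up granularity.
    return round(requested_mhz / step_mhz) * step_mhz

print(actual_mclk(545))  # -> 550.0 (not the 545 the tool reports)
print(actual_mclk(512))  # -> 500.0
```

So a small requested bump may apply as either nothing or a full step, which would explain benchmark results moving in coarser increments than the software suggests.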


----------



## Liranan

Quote:


> Originally Posted by *provost*
> 
> .
> 
> I am guessing you have been an AMD man until now. and did not participate in the great GK110 hype ... Well good on you
> I am gonna pass on drinking the Nvidia Kool aid of "it's all about high frequency and texture fill rates" on the 970 vs cuda cores on 780 Ti and Titan, and that's the only reason I need not to buy any Maxwell cards.


Everyone slurping whatever nVidia is serving will continue to claim how great nVidia is, even when the Fury outdoes their Titan Xs in a few months, once nVidia stops optimizing for their chips after the release of the next ones.


----------



## Liranan

Quote:


> Originally Posted by *blue1512*
> 
> Here it is for you. I posted this in the HBM unlock thread ages ago.
> http://www.overclock.net/t/1562593/hardware-info-hbm-on-furyx-can-be-overclocked-after-all
> http://www.hardware.fr/articles/937-26/overclocking-gpu-fiji.html


Performance increases linearly with the clock increase.

Edit: Have nVidia dropped 980Ti prices? I see stock Ti's going for less than Fury X's.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Liranan*
> 
> Performance increases linear with clock increase.
> 
> Edit: Have nVidia dropped 980Ti prices? I see stock Ti's going for less than Fury X's.


Prices look normal to me.


----------



## hamzta09

Quote:


> Originally Posted by *blue1512*
> 
> Here it is for you. I posted this in the HBM unlock thread ages ago.
> http://www.overclock.net/t/1562593/hardware-info-hbm-on-furyx-can-be-overclocked-after-all
> http://www.hardware.fr/articles/937-26/overclocking-gpu-fiji.html


I already replied to you, or whoever linked that French article...

Why repeat?

And 1 fps isn't much of an improvement and could be entirely coincidental.


----------



## Liranan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Prices look normal to me.


Mostly they're equal in price, but some reference-model Tis are considerably cheaper, while the ones with aftermarket coolers are the same price as the Fury X.

AMD needs to drop prices on these things, but I bet they can't due to the investment in HBM.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Liranan*
> 
> Mostly they're equal in price but some reference model Ti's are considerably cheaper but after market coolers are the same price as Fury X.
> 
> AMD need to drop prices on these things but I bet they can't due to the investment in HBM.


Where are you looking? I just checked Newegg USA and they are the same price as always here. ~$650-$700


----------



## blue1512

Quote:


> Originally Posted by *hamzta09*
> 
> I already replied to you or whoever linked that french article...
> 
> Why repeat?
> 
> And 1 FPS isn't much of an improvement; it could be entirely coincidental.


Great coincidence, by the way. An 8% memory OC offers more than an 8% core OC in Far Cry 4 and Batman: AO.


----------



## Klocek001

Actually, given the bus width, 8% would be a whole ton of extra bandwidth.
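For reference, a rough sketch of what 8% on the memory clock means in raw numbers, assuming the commonly cited stock Fiji figures (4096-bit bus, 500 MHz HBM, double data rate):

```python
# Peak memory bandwidth = bus width x clock x transfers/clock, in GB/s.
# Stock Fiji figures assumed: 4096-bit bus, 500 MHz HBM, DDR (2 transfers/clock).
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock=2):
    return bus_bits * clock_mhz * 1e6 * transfers_per_clock / 8 / 1e9

stock = bandwidth_gbs(4096, 500)        # 512.0 GB/s
oc = bandwidth_gbs(4096, 500 * 1.08)    # ~553 GB/s with an 8% memory OC
print(stock, oc, oc - stock)            # the OC adds roughly 41 GB/s
```

So an 8% memory OC is on the order of 40 GB/s of extra peak bandwidth, more than some whole midrange cards have in total.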


----------



## iSlayer

Quote:


> Originally Posted by *provost*
> 
> Ok, I will be honest about the reasons why I purchased the Furyx over any Maxwell card.
> 
> Other than being jaded by Nvidia's lack of driver optimizations for GK110 cards
> 
> 
> 
> 
> 
> 
> 
> 
> , the more I looked at the hardware specs of Maxwell (you know, the meat and potatoes, the nuts and bolts... Lol) , the more I felt uncomfortable with having an inferior card to my GK110s strictly from a hardware perspective (fewer of everything/performance: rops, TMUs, cores , etc. including compute), and less trust in this "vaporware software based performance enhancements" that doesn't correlate to the hardware on the pcb when comparing my GK110s and Maxwell. I know people would argue "architectural efficiency", but after being jaded by my experience of seeing Nvidia drop the GK110 like a hot potato when it has to sell another inferior hardware sku, I am a lot less trusting of this black magic called architectural efficiency that seemingly creates performance out of thin air. I wish well to anyone that bought a Maxwell. But, for me, I may not be able to put a finger on exactly what bothers me about Maxwell, but something just ain't right with that "Maxwell boy" , in my gut....


This isn't how architectures work.
Quote:


> Originally Posted by *Serandur*
> 
> Just a little explanation of Maxwell's perceived lacking specs yet high performance, and how it really relates to GCN/Fiji and Kepler *(warning, long; TLDR in bold at the bottom, which is also kinda long)*.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GM200 does not have fewer ROPs than GK110; it has 96 vs. 48 (double the ROPs), plus GM200 clocks way higher than GK110, hence the ROPs, TMUs, shaders, etc. all get a large percentage boost from that alone. For example, pretend Kepler and Maxwell shader cores, ROPs, and TMUs are identical. 3072 of those hypothetical Kepler shaders/TMUs/ROPs at 1000 MHz would be theoretically equal to 2048 Maxwell shaders/TMUs/ROPs at 1500 MHz. "Compute" performance (in TFLOPS) is proportional to raw shader count and clock speed, though the real deal is defined by far more than that theoretical number.
> 
> Disregarding architectural enhancements for now, Maxwell hits 1500-1550 MHz with the same ease GK110 got to 1150-1200 MHz. That's ~30% alone on top of twice the ROPs, not less (96 on GM200 vs 48 on GK110). That's roughly 2.5 times more peak pixel fillrate just looking at it on paper and easily the highest of any GPU on the market (Fiji having 64 ROPs at a lower clock speed).
> 
> At stock, even before GM200's massive current OC advantage versus Fiji:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The lower TMU count isn't an issue, partially for the same reason (clock speeds) and partially for another in the case of GCN. GCN TMUs (including Fiji's) also run at only half their peak rate (256 GTexels/s at 1000 MHz) with fp16 textures, whereas Maxwell retains its peak rate. That puts GM200 behind Fiji in peak TMU performance with int8 textures (theoretically, and not by much at both cards' current typical OC speeds), but significantly ahead with more complex fp16 textures despite having fewer TMUs. Maxwell's actual texel/s rate is even higher than the 780 Ti's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shaders also benefit directly from raw clockspeed as does peak single-precision compute performance. GM200's reported peak "compute" performance is only measured at the base clock whereas GM200 actually boosts much higher. In contrast, GCN's stated base speed at which "compute" is calculated is its max.
> 
> The 980 Ti at 1500 MHz is a nearly 9 TFLOPS peak GPU in single-precision whereas Fiji at 1150 MHz is nearly 10. The difference in theoretical peak rate at both cards' rough OC capabilities isn't that big and the actual, effective compute performance varies significantly at stock with Fury winning some things (ray tracing), the 980 Ti winning others (particle simulation), and a lot of close calls/ties in between. GM200 crushes GK110 in actual compute performance regardless.
> 
> Ray tracing:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Particle simulation:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Then GM200 has twice the L2 cache (basically on-die memory) of GK110 and 50% more than Fiji which helps a lot, it's got that delta color compression, and it's got improvements to the shader core design such as reduced arithmetic latencies and improved scheduling to let Maxwell get closer to its peak performance rates than Kepler.
> 
> 
> 
> *TLDR: The specs people compare between Fiji, GK110, and GM200 are directly tied to clock speeds and are also quite superficial as there are many other specifications and details of specifications your typical spec sheet doesn't show you like cache.
> 
> Maxwell's single-precision compute/shader performance (even the theoretical TFLOPS figure), TMU performance, and ROP performance are way higher than people think because of clock speed, discrepancies between reported theoretical figures and their attainability in reality, and discrepancies in how Nvidia/AMD report their numbers. Maxwell doesn't lack raw theoretical hardware power at all; Nvidia just undersell it, oddly enough, and people generalize the theoretical peak rates to reality too much without considering clock speed differences and architectural specifics (like GCN TMUs' half rate with FP16).
> 
> Maxwell's performance doesn't come from black magic, it comes from real, quantifiable, and measurable hardware improvements which have been thoroughly documented.*


An excellent post. Rep+
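The peak-rate arithmetic in that post is easy to sanity-check; here's a rough sketch (assuming 2 FLOPs per shader per clock for FMA, and the stock shader/ROP counts quoted above):

```python
# Peak single-precision throughput: 2 FLOPs (fused multiply-add) per shader per clock.
def peak_tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# Peak pixel fillrate: one pixel per ROP per clock.
def pixel_fill_gpix(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9

print(peak_tflops(2816, 1500))  # 980 Ti at 1500 MHz -> ~8.4 TFLOPS ("nearly 9")
print(peak_tflops(4096, 1150))  # Fury X at 1150 MHz -> ~9.4 TFLOPS ("nearly 10")
# GM200 vs. GK110 peak fillrate: 96 ROPs at 1500 MHz vs. 48 at 1150 MHz -> ~2.6x
print(pixel_fill_gpix(96, 1500) / pixel_fill_gpix(48, 1150))
```

Which lines up with the "nearly 9 vs. nearly 10 TFLOPS" and "roughly 2.5x fillrate" figures in the post.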
Quote:


> Originally Posted by *provost*
> 
> .
> 
> I am guessing you have been an AMD man until now. and did not participate in the great GK110 hype ... Well good on you
> I am gonna pass on drinking the Nvidia Kool aid of "it's all about high frequency and texture fill rates" on the 970 vs cuda cores on 780 Ti and Titan, and that's the only reason I need not to buy any Maxwell cards.


The 970 has more powerful CUDA cores than the 780 Ti with a considerable improvement to effective memory bandwidth, tessellation and other arch improvements. It's not really news.

This is akin to thinking the i7 990X is superior to an i7 4790K because "I don't care about that cache-and-IPC kool-aid, it's all about core count".

OCN pls
Quote:


> Originally Posted by *Liranan*
> 
> Everyone slurping whatever nVidia is serving will keep claiming how great nVidia is, even when the Fury outdoes their Titan Xs in a few months, once nVidia stops optimising for these chips after their next release.


On OCN, TX owners probably won't care, since Titan Xs OC so well that the perf deficit doesn't matter.

That said, Titan X and 980 Ti owners knew what they were getting into when they purchased. Most want the extra VRAM that the Fury X doesn't offer, the OCing headroom, or other.

Also, Pascal isn't due for a year. It's on 16nm and it's a new architecture with HBM support. Maxwell isn't going away any time soon; neither is Fiji, for that matter.


----------



## Klocek001

Quote:


> Originally Posted by *iSlayer*
> 
> That said, Titan X and 980 Ti owners knew what they were getting into when they purchased. Most want the extra VRAM that the Fury X doesn't offer, the OCing headroom, or other.


cause they live by nvidia's code?


----------



## Ceadderman

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iSlayer*
> 
> That said, Titan X and 980 Ti owners knew what they were getting into when they purchased. Most want the extra VRAM that the Fury X doesn't offer, the OCing headroom, or other.
> 
> 
> 
> cause they live by nvidia's code?
Click to expand...

Odd how we're getting good solid reviews and team Green only concentrates on the VRAM. Never mind that stock vs. stock consistently has Fury competing well with their top-end cards.

That cinched it for me. Fury X crossfire or x2. Either way, I will be $1300 poorer and Gaming richer.









~Ceadder


----------



## provost

Quote:


> Originally Posted by *iSlayer*
> 
> This isn't how architectures work.
> 
> *The 970 has more powerful CUDA cores than the 780 Ti* with a considerable improvement to effective memory bandwidth, tessellation and other arch improvements. It's not really news.
> 
> This is akin to thinking the i7 990X is superior to an i7 4790K because "I don't care about that cache-and-IPC kool-aid, it's all about core count".


Ermm... you are comparing Intel's 32nm node to its 22nm node, disregarding the more powerful cores enabled by 3D tri-gate transistors and the 2x transistor-density improvement over 32nm. What new tech was introduced with Maxwell, and what node shrink, that created "more powerful" CUDA cores? Nice try, but try again...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *iSlayer*
> 
> I don't buy for theoretically greater performance 3 years later, I buy for greater performance now. *2 years today it won't matter which performs better because both will be seriously dropping quality settings to maintain playable frames.* Buying based off of future prospects and theoretical performance is a TERRIBLE idea.


What on earth are you talking about? I can play every game I own on absolute maxed settings and 1440p with my two year old Titans at stock clocks, never mind at the 1320MHz they are capable of running at. I have absolutely no doubt that Fury and Titan X/980Ti will still be plenty capable of doing the same two years from now; its just that Fury X might actually be significantly faster than the 980Ti by then if history is any guide...


----------



## Ceadderman

I think that's what he was saying









~Ceadder


----------



## Rei86

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What on earth are you talking about? I can play every game I own on absolute maxed settings and 1440p with my two year old Titans at stock clocks, never mind at the 1320MHz they are capable of running at. I have absolutely no doubt that Fury and Titan X/980Ti will still be plenty capable of doing the same two years from now; its just that Fury X might actually be significantly faster than the 980Ti by then if history is any guide...


But that's the point. No one is gonna give a damn when its "better" than the 980Ti, because we're all gonna be jizzing over the newer 2080/Fury X the 3rd/whateverthehelltheywannacallit X.

At the "rapid" pace of newer products arriving, and with everyone having a boner for the newest and greatest tech, the Fury X and nVidia's Maxwell line will become irrelevant in that discussion.

No one is gonna care that some dude holding out onto their Fury X is getting XXfps at XX x XX resolution at that point.


----------



## iSlayer

@Majin SSJ Eric wow you have 2 Titans ($1k cards) not including watercooling and you're not having performance issues 2 years down the line with voltage unlocked overclocking?

Who would have freaking thought







. Tell us more about how the sky is blue or water is wet!

@provost I'm not sure what you just said to me, but I worry that if I expend any more time thinking on it I may develop some kind of brain aneurysm.

Tri-gate technology is to reduce leakage. Transistor shrinking doesn't increase performance directly.

Wow that was a mess. Are you aware of the tick-tock cycle at all?


----------



## Blackops_2

Quote:


> Originally Posted by *iSlayer*
> 
> @Majin SSJ Eric wow you have 2 Titans ($1k cards) not including watercooling and you're not having performance issues 2 years down the line with voltage unlocked overclocking?
> 
> Who would have freaking thought
> 
> 
> 
> 
> 
> 
> 
> . Tell us more about how the sky is blue or water is wet!
> 
> @provost I'm not sure what you just said to me, but I worry that if I expend any more time thinking on it I may develop some kind of brain aneurysm.
> 
> Tri-gate technology is to reduce leakage. Transistor shrinking doesn't increase performance directly.
> 
> Wow that was a mess. Are you aware of the tick-tock cycle at all?


You did notice he said he could run any game at 1440p maxed at stock clocks? SLI 780s, 780 Tis, and CF 290/Xs will max out any game available today at 1440p. Hell, my 780 on air is maxing every single game out at 1080p without issue. It struggles some with DSR, but that is to be expected. The increase in graphical fidelity, and the tech required to produce it, is nowhere near where it used to be. I get what you're saying, though, that no one wants to wait a year to see if Fury pans out. That is understandable. I don't think they'll be seriously dropping quality settings in two years, though. I haven't yet with my 780s, and didn't with my 7970. I'll likely have to this time around when I get the 7970 back up and running; at 1200/1700 I should get some good performance out of it.

Titans were/are expensive, but to act as if we didn't later have the 780s and then the 780 Ti is trivializing it. My point is that GK110 is still performing alright up to this point, and Hawaii is performing better than it ever has. Longevity has its place for buyers too.


----------



## provost

Quote:


> Originally Posted by *iSlayer*
> 
> @Majin SSJ Eric wow you have 2 Titans ($1k cards) not including watercooling and you're not having performance issues 2 years down the line with voltage unlocked overclocking?
> 
> Who would have freaking thought
> 
> 
> 
> 
> 
> 
> 
> . Tell us more about how the sky is blue or water is wet!
> 
> @provost I'm not sure what you just said to me, but I worry that if I expend any more time thinking on it I may develop some kind of brain aneurysm.
> 
> Tri-gate technology is to reduce leakage. Transistor shrinking doesn't increase performance directly.
> 
> Wow that was a mess. Are you aware of the tick-tock cycle at all?


Again , nice try.....

Node shrink + higher transistor count/density + arch enhancements such as tri-gate = higher performance.

I will dig up a table tomorrow comparing Gk110 specs with Maxwell, if I get the time, so that you can stop misleading with your comments.

Maxwell on the same 28nm node = only some arch enhancements mainly related to better perf/watt , great for cross leveraging the tech with Tegra, etc., - take out more hardware to free pcb space + majority of driver controlled performance over Gk110 to make this turd shine

And, that's it....









I am not interested in discussing maxwell anymore in the furyx thread...


----------



## iSlayer

@2010rig we MENSA now?

@provost I don't think you understand how any of this works. At all. In the slightest. I'm not even sure how you could be so misled.

Tri-gate relates to voltage leakage; that doesn't assist performance. Transistor density itself doesn't aid performance either. It will reduce voltage needs, but not inherently increase performance. It's how you make use of the increased transistor density that matters to performance.

http://www.overclock.net/t/1561860/various-amd-radeon-r9-fury-x-reviews#post_24122133

You do know the 285/380, which matches the 280(X) with fewer resources, basically does what Maxwell does, right? It isn't freaking witchcraft; architectures matter. This is how GCN and Maxwell can offer similar performance at drastically different power usage.


----------



## Tivan

Quote:


> Originally Posted by *iSlayer*
> 
> Titan X and 980 Ti owners knew what they were getting into when they purchased. Most want the extra VRAM that the Fury X doesn't offer, the OCing headroom, or other.


Perfectly reasonable. Just like FuryX owners know what they are getting when they purchase. Most want the HBM that the TX doesn't offer, or the included AIO, or other.


----------



## Ceadderman

Or both HBM and CLC system. Those unsure of the CLC will likely swap out for a Custom Block made for it. Which is likely what I will do when I finally take the plunge for one of the new HBM cards.








Not because I am unsure of CLC but because I have a custom loop.









~Ceadder


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Or both HBM and CLC system. Those unsure of the CLC will likely swap out for a Custom Block made for it. Which is likely what I will do when I finally take the plunge for one of the new HBM cards.
> 
> 
> 
> 
> 
> 
> 
> 
> Not because I am unsure of CLC but because I have a custom loop.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Indeed, EK is already making a Fury X block.









I can't afford a custom loop til next year but it is nice to see the option out there.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ceadderman*
> 
> Odd how we're getting good solid reviews and team Green only concentrates on the VRAM. Never mind that stock vs. stock consistently has Fury competing well with their top-end cards.
> 
> That cinched it for me. Fury X crossfire or x2. Either way, I will be $1300 poorer and Gaming richer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I think people on both sides are getting entirely too hung up on VRAM. 4 GB is enough if the damn developers would manage their memory usage properly. VRAM is a software thing right now, not a hardware thing; at least that is the way I look at it. GPU manufacturers (all of them) push VRAM because it is cheap, easy to stack on a GPU, and good hype for marketing.

If anything, the 4GB Fury X at 4K keeping up with the 12GB Titan X shows that. If VRAM at 4K were REALLY an issue, you would see something. And now, if the review is to be believed, Fury X CF is handily outpacing the Titan X.

Wouldn't do that if it didn't have enough VRAM!


----------



## toncij

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I think people on both sides are getting entirely too hung up on VRAM. 4 GB is enough if the damn developers would manage their memory usage properly. VRAM is a software thing right now, not a hardware thing; at least that is the way I look at it. GPU manufacturers (all of them) push VRAM because it is cheap, easy to stack on a GPU, and good hype for marketing.
> 
> If anything, the 4GB Fury X at 4K keeping up with the 12GB Titan X shows that. If VRAM at 4K were REALLY an issue, you would see something. And now, if the review is to be believed, Fury X CF is handily outpacing the Titan X.
> 
> Wouldn't do that if it didn't have enough VRAM!


Well, it would be fine if you were not wrong. It has absolutely nothing to do with "memory usage being proper". Also, VRAM is not cheap, it is actually expensive and not easy to "stack on".

Jesus, where do you pull this out from? Some dark place?

Developers can't do much more than they do. VRAM usage has more to do with GPU driver than game developers who actually, for a whole bunch of other reasons, manage those resources with more care than you would ever guess.


----------



## PostalTwinkie

Quote:


> Originally Posted by *toncij*
> 
> Well, it would be fine if you were not wrong. It has absolutely nothing to do with "memory usage being proper". Also, VRAM is not cheap, it is actually expensive and not easy to "stack on".
> 
> Jesus, where do you pull this out from? Some dark place?
> 
> Developers can't do much more than they do. VRAM usage has more to do with GPU driver than game developers who actually, for a whole bunch of other reasons, manage those resources with more care than you would ever guess.


Yes, VRAM is easy to throw on a card, especially GDDR5. When you compare the cost of the GPU R&D and production itself to that of the individual RAM, it is very cheap. RAM in general is a low cost product.

Yes - software memory heuristics are crap in development. There is next to no memory management in software today, even in drivers. Hell, AMD just now (with Fury X) brought on engineers specifically for memory management within their driver team.

A person is a fool to think that software today has anywhere near optimal memory management methodology in use.

EDIT:










Developers acting with care!

PLEASE!!!!

I just had to come back and laugh at that.


----------



## toncij

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yes, VRAM is easy to throw on a card, especially GDDR5. When you compare the cost of the GPU R&D and production itself to that of the individual RAM, it is very cheap. RAM in general is a low cost product.
> 
> Yes - software memory heuristics are crap in development. There is next to no memory management in software today, even in drivers. Hell, AMD just now (with Fury X) brought on engineers specifically for memory management within their driver team.
> 
> A person is a fool to think that software today has anywhere near optimal memory management methodology in use.
> 
> EDIT:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Developers acting with care!
> 
> PLEASE!!!!
> 
> I just had to come back and laugh at that.


VRAM is expensive in total production cost. Also, there are very specific technical reasons (cost included) why AMD did not add another 4GB of VRAM on top of the current four 1GB stacks.

AMD's optimizations are what NVidia already made. And they did it as a last resort, not as something "long overdue". What they did also doesn't really help much - you can't compress memory to some exorbitant extent, just be more aggressive about offloading to RAM.

"Heuristics" - sure, use some more terms that sound cool.







Your comments on development show pretty much you have absolutely no idea what you're talking about.


----------



## Tivan

Quote:


> Originally Posted by *toncij*
> 
> AMD's optimizations are what NVidia already made.


Compare memory usage on the Fury vs. nVidia cards. I think they specifically optimize with HBM in mind, not whatever nVidia is doing to get some more bandwidth (though AMD has been catching up on that, I guess).
Quote:


> Your comments on development show pretty much you have absolutely no idea what you're talking about.


Why go so low as to resort to ad hominem while claiming to be knowledgeable in the same breath? c; (edit: You could just show us your knowledge by attacking Twinkie's arguments with numbers and facts. = D Something I, as a curious bystander, would appreciate.)
Quote:


> VRAM is expensive in total production cost. Also, there are very specific technical reasons (cost included) why AMD did not add another 4GB of VRAM on top of the current four 1GB stacks.


I think you're really just disagreeing with Twinkie on the numbers here, honestly. Say a (very hypothetical) product's price aims to recover R&D with 50% and cover production with the other 50%; the cost of the RAM, or of the R&D, within that might look like a lot or like very little, depending on one's opinion.


----------



## iLeakStuff

KitGuru's review of the Fury X is up.
980 Ti a good deal ahead of Fury X at 1440p, like so many reviews show.

I agree with their conclusion:
Quote:


> Both myself and my colleague Allan agree that if AMD can drop the price more towards £449.99 inc vat, we do feel the R9 Fury X has a logical place in the market.


An R9 Nano at $450, or a price drop for the Fury X to $550, and I will once again agree with AMD's positioning. $649 is a horrible price for this card.


----------



## 364901

Quote:
Originally Posted by *provost* 


> What new tech was introduced with Maxwell, and what node shrink, that created "more powerful" CUDA cores? Nice try, but try again...


He might just not be putting the point across that Nvidia's decision to sub-divide Maxwell into 128 CUDA cores per SMM yielded higher efficiency and throughput than Kepler's 192 CUDA cores per SMX. If that allows Nvidia to eke out more performance because they've cut out any chance of CUDA cores being idle, then that's definitely a welcome change. I recall that people grumbled about this exact thing back in early 2013 with the HD7970 launch.


----------



## iSlayer

Quote:


> Originally Posted by *Ceadderman*
> 
> Odd how we're getting good solid reviews and team Green only concentrates on the VRAM. Never mind that stock vs. stock consistently has Fury competing well with their top-end cards.
> 
> That cinched it for me. Fury X crossfire or x2. Either way, I will be $1300 poorer and Gaming richer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


As I mentioned, things like overclocking headroom, feature sets, power usage, aftermarket variants, no AIO, other... matter. It isn't just VRAM. Certainly not for me, 4GBs would be plenty since I do 1080p and mainly need high frame rates.

And most don't care about stock vs. stock, as the reference 980 Ti is already winning, and aftermarket variants, at the same price or slightly more than a reference 980 Ti, will offer a 10-15% lead over the Fury X. The question is how Fiji overclocks.
Quote:


> Originally Posted by *Blackops_2*
> 
> You did notice he said he could run any game at 1440p maxed at stock clocks? SLI 780s, 780 Tis, and CF 290/Xs will max out any game available today at 1440p. Hell, my 780 on air is maxing every single game out at 1080p without issue. It struggles some with DSR, but that is to be expected. The increase in graphical fidelity, and the tech required to produce it, is nowhere near where it used to be. I get what you're saying, though, that no one wants to wait a year to see if Fury pans out. That is understandable. I don't think they'll be seriously dropping quality settings in two years, though. I haven't yet with my 780s, and didn't with my 7970. I'll likely have to this time around when I get the 7970 back up and running; at 1200/1700 I should get some good performance out of it.
> 
> Titans were/are expensive, but to act as if we didn't later have the 780s and then the 780 Ti is trivializing it. My point is that GK110 is still performing alright up to this point, and Hawaii is performing better than it ever has. Longevity has its place for buyers too.


I'm aware. Again, $2,000 in GPUs. Even on OCN that's not exactly the mainstream. And the 780 was still a $650 GPU at launch which is quite a lot of money.

Regardless, the nature of technology is advancement. Games increase in graphical quality, pursue new tech, etc...
Quote:


> Originally Posted by *Tivan*
> 
> Perfectly reasonable. Just like FuryX owners know what they are getting when they purchase. Most want the HBM that the TX doesn't offer, or the included AIO, or other.


Sames. That HBM is tasty.
Quote:


> Originally Posted by *Ceadderman*
> 
> Or both HBM and CLC system. Those unsure of the CLC will likely swap out for a Custom Block made for it. Which is likely what I will do when I finally take the plunge for one of the new HBM cards.
> 
> 
> 
> 
> 
> 
> 
> 
> Not because I am unsure of CLC but because I have a custom loop.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I'm not a fan of CLCs; if I want water, I'm doing it custom. Paying extra for the CLC when I'd just put it under a custom block isn't great to me.

That said, AMD's temps are undeniably tasty. And the benefits of that CLC actually matter for reducing voltage leakage and keeping power consumption down; I'm surprised it makes such a big difference, and I think that added benefit makes it a really cool design decision on AMD's part. Anandtech's review once again gives us information no one else offers.


----------



## PostalTwinkie

Quote:


> Originally Posted by *iSlayer*
> 
> Sames. That HBM is tasty.


......

LN2 it - someone unlocked it already.


----------



## Themisseble

Quote:


> Originally Posted by *iLeakStuff*
> 
> KitGuru's review of the Fury X is up.
> 980 Ti a good deal ahead of Fury X at 1440p, like so many reviews show.
> 
> I agree with their conclusion:
> An R9 Nano at $450, or a price drop for the Fury X to $550, and I will once again agree with AMD's positioning. $649 is a horrible price for this card.


AMD is clearly having problems in DX11 benchmarks... it cannot utilize the extra GCN cores. I wonder what happens when DX12 gets here, and how well a GTX 980 Ti OC'd to 1400/7500 will do against it.

Today's reviews are really bad... nobody asks why the Fury X does so badly at 1080p!

Nobody even uses Win 10 yet... of course we are going to play games on Win 7.


----------



## EpicOtis13

Quote:


> Originally Posted by *PostalTwinkie*
> 
> ......
> 
> LN2 it - someone unlocked it already.


Even with the little 100 MHz OC the memory has received so far, the card seems to do much better in benches with OC'd VRAM.


----------



## NuclearPeace

I don't get all the hype over HBM. The one thing I do find fascinating is that by locating the memory so close to the GPU you save a lot of energy. Someone pointed out that a Fury X only has a whopping 5% more GB/s per SP than a 290X, and that is probably negated by the 390X's 6 Gbps stock memory. As of right now, it's interesting and fantastic as a technological leap forward, but strictly from the numbers nothing much changes.
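For what it's worth, that per-SP figure is easy to sanity-check with commonly cited stock numbers (assumed: Fury X 512 GB/s over 4096 SPs, 290X 320 GB/s over 2816, 390X 384 GB/s over 2816; the exact percentage depends on which figures you use):

```python
# Bandwidth per stream processor, using commonly cited stock figures (assumed).
cards = {
    "Fury X": (512.0, 4096),  # GB/s, stream processors
    "290X":   (320.0, 2816),
    "390X":   (384.0, 2816),  # 6 Gbps GDDR5 on a 512-bit bus
}
per_sp = {name: bw / sp for name, (bw, sp) in cards.items()}
for name, value in sorted(per_sp.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value * 1000:.1f} MB/s per SP")
```

With these figures the 390X actually comes out ahead of the Fury X per SP, which supports the "negated by the 390X" point.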

Other than that I don't like how AMD went about making the Fury X and that comes down to the mandated AIO. I don't like AIOs and the small controversy about release samples having pump whine and buzzing and reports of the 980 Ti Hybrid from EVGA having the same problems definitely didn't help my perception. It feels like they are trying to emulate the "luxury" feeling of the Titan and I really wish AMD would not because the Titan to me isn't something you want to emulate (locked down to a blower cooler, $1000 robbery, etc...). I get that they were trying to keep it in a smaller form factor and keep the power consumption low by keeping it cool (which lowers leakage), but I also want choice. If they offered a Fury X with aftermarket air coolers for $550 this card would be amazing. However at $650, it feels like I am forced to pay extra for an AIO cooler which is something I don't like in a card.


----------



## STEvil

You save power and decrease PCB complexity.

HBM is the future; just wait until it arrives on CPUs and in laptops.


----------



## Forceman

Quote:


> Originally Posted by *NuclearPeace*
> 
> Other than that I don't like how AMD went about making the Fury X and that comes down to the mandated AIO. I don't like AIOs and the small controversy about release samples having pump whine and buzzing and reports of the 980 Ti Hybrid from EVGA having the same problems definitely didn't help my perception. It feels like they are trying to emulate the "luxury" feeling of the Titan and I really wish AMD would not because the Titan to me isn't something you want to emulate (locked down to a blower cooler, $1000 robbery, etc...). I get that they were trying to keep it in a smaller form factor and keep the power consumption low by keeping it cool (which lowers leakage), but I also want choice. *If they offered a Fury X with aftermarket air coolers for $550 this card would be amazing.* However at $650, it feels like I am forced to pay extra for an AIO cooler which is something I don't like in a card.


Have you not heard of the Fury non-X? Air cooled, $550, coming in two weeks?


----------



## Liranan

Quote:


> Originally Posted by *NuclearPeace*
> 
> I don't get all the hype over HBM. The one thing I do find fascinating is that by locating the memory so close to the GPU you save a lot of energy. Someone pointed out that a Fury X only has a whopping 5% more GB/s per SP than a 290X, and that is probably negated by the 390X's 6 Gbps stock memory. As of right now, it's interesting and fantastic as a technological leap forward, but strictly from the numbers nothing much changes.
> 
> Other than that I don't like how AMD went about making the Fury X and that comes down to the mandated AIO. I don't like AIOs and the small controversy about release samples having pump whine and buzzing and reports of the 980 Ti Hybrid from EVGA having the same problems definitely didn't help my perception. It feels like they are trying to emulate the "luxury" feeling of the Titan and I really wish AMD would not because the Titan to me isn't something you want to emulate (locked down to a blower cooler, $1000 robbery, etc...). I get that they were trying to keep it in a smaller form factor and keep the power consumption low by keeping it cool (which lowers leakage), but I also want choice. If they offered a Fury X with aftermarket air coolers for $550 this card would be amazing. However at $650, it feels like I am forced to pay extra for an AIO cooler which is something I don't like in a card.


I'm sure you'd have said the same thing when GDDR5 was introduced by AMD. The 4870 was equal to the GTX 260 at launch, but by the end of its life it was equal to the GTX 280, because nVidia had moved on and stopped caring about their older generation.

Of course this doesn't excuse AMD, and I am disappointed in the Fluffy X because on paper it's spectacular. That said, I had a 4870 and was very pleased with it for the years I had it.


----------



## SpeedyVT

Quote:


> Originally Posted by *Liranan*
> 
> I'm sure you'd have said the same thing when DDR5 was released by AMD. The 4870 was equal to the 260 at launch but by the end of its life it was equal to the 280 because nVidia had moved on and had stopped caring about their older generation.
> 
> Of course this doesn't excuse AMD and I am disappointed in the Fluffy X because on paper it's spectacular. Saying that I had a 4870 and was very pleased with it for the years I had it.


Fluffy X Crossfire is destroying both 980 Ti SLI and Titan SLI at the moment.


----------



## EpicOtis13

Quote:


> Originally Posted by *SpeedyVT*
> 
> Fluffy X Crossfire is destroying both 980 ti and Titan atm sli.


Which is exactly why I'm probably buying two Fury non-X's over 980 Tis.


----------



## Ceadderman

We'll see if team Green has something to say when the X2 launches. So far they're all parroting the same incredible BS.

Frame rates higher? Sure. But you won't notice it without some counter software. Drop the counter and tell me if you can *SEE* a difference. Truthfully you can't.

More VRAM on the 980/980 Ti and Titan? Check. But again you won't see much of a difference, if any, compared to Fury. So in effect, who gives a rip?

If you're a fan of team Green, that's fine with me. The Titan Z is currently a $1600 card, the 980 Ti is at the same price point as Fury, and stock vs. stock Fury is just fine. It's early yet and AMD is still on their first beta driver.

So imho, they're at the correct price point, because while nVidia has them beat in epeen statistics, Fury is competing well against them in IRL gaming situations and *that's* how games are meant to be played.









~Ceadder


----------



## NuclearPeace

Why does everyone need to be put into "team red" or "team green"? The amount of fanboyism and us vs them on this forum is getting to absurd levels. It's becoming less and less enjoyable to discuss hardware on this forum now.


----------



## Ceadderman

Nothing wrong with it imho. If "team" doesn't apply to you because you're using both at some time or other, simply ignore it and carry on.

I am not saying it to be disrespectful toward anybody. I am saying it because my bleeding autocorrect on my S4 is a *********. Case in point nVidia becomes "no idea" because sometimes I forget to tap the arrow before hitting the space.

Sometimes "team" applies though because some people show their stripes the way a skunk shows its stripes.

Apologies if this offends you. It's certainly not meant to. If I wanted to offend, I have other terms that start with nV.


















~Ceadder


----------



## Slink3Slyde

Quote:


> Originally Posted by *Ceadderman*
> 
> Nothing wrong with it imho. If "team" doesn't apply to you because you're using both at some time or other, simply ignore it and carry on.
> 
> I am not saying it to be disrespectful toward anybody. I am saying it because my bleeding autocorrect on my S4 is a *********. Case in point nVidia becomes "no idea" because sometimes I forget to tap the arrow before hitting the space.
> 
> Sometimes "team" applies though because some people show their stripes the way a skunk shows its stripes.
> 
> Apologies if this offends you. It's certainly not meant to. If I wanted to offend, I have other terms that start with nV.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I was hoping to be on Team Fury, hoping that it would blow Titan X out of the water.

Instead I'm on Team Underwhelmed: less overall performance and less VRAM for the same price, and being forced to use an AIO I don't like. Granted, it's got better dual-card scaling and may improve over time, relatively speaking, but by then Pascal will most likely have made that irrelevant.

Team Stark Reality over here. I don't hold affection for corporate entities. I do hope for competition though.

If it had been $100 cheaper all over the world it would have been a win. And why they had to make it reference-only I have no idea.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ceadderman*
> 
> We'll see if team Greenies have something to say when x2 launches. So far they're all parroting the same incredible BS.
> 
> Frame rates higher? Sure. But you won't notice it without some software for counter. Drop the counter and tell me if you can *SEE* a difference. Truthfully you can't.
> 
> More VRAM on 980/980ti and Titan? Check. But again you won't see much of a difference if any compared to Fury. So in effect, who gives a rip?
> 
> If you're a fan of team Green, that's fine with me. Has Titan Z is currently a $1600 card. 980ti is in the same price point and stock vs stock Fury is just fine. It's early yet and AMD is still on their first Beta driver.
> 
> So imho, they're at the correct price point because while nVidia has them in epeen statistically, Fury is competing well against them IRL gaming situations and *that's* how games are meant to be played.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder












Imagine the X2, but instead of two sets of HBM each with its own GPU...

One set of HBM2 with two GPUs going right to it. With DX12 being able to pool VRAM, this should be doable...

That would probably be a pretty crazy card. I am fairly certain HBM at higher clocks could feed two GPUs off the same memory pool. Just some wild guessing and fun theory crafting.

EDIT:

Back in the day 3DFX was prototyping a GPU that had VRAM and GPU expansion slots on it. Where you would (in theory, it never came to market) just add GPUs and VRAM as you needed it.

So multiple GPUs on one VRAM pool isn't new by any means.


----------



## pengs

Quote:


> Originally Posted by *NuclearPeace*
> 
> Why does everyone need to be put into "team red" or "team green"? The amount of fanboyism and us vs them on this forum is getting to absurd levels. It's becoming less and less enjoyable to discuss hardware on this forum now.


Yeah, it's bad. Part of it is the price of hardware these days, GPUs alone. With such an investment, people feel they need to adopt more than the hardware, and some of it is adolescence. It reminds me of my childhood and fighting with my friends about Sega vs. Nintendo on the bus ride home (every single day).


----------



## BoredErica

I'm on team "better performance".


----------



## toncij

Quote:


> Originally Posted by *Tivan*
> 
> Compare memory usage on the fury vs nvidia cards. I think they specifically optimize with hbm in mind, not whatever nvidia is doing to get some more bandwidth. (though AMD has been catching up on that I guess.)
> Why go so low as to resort to ad hominem and claim being knowledgeable in the same breath. c; (edit: You could just show us your knowledge to attack twinkies arguments with numbers and facts. = D Something I, as curious bystander, would appreciate. )
> I think you're just disagreeing on numbers very slightly with Twinkie here, honestly. say a (very hypothetical) product price is aiming to recover RnD at 50% the price, 50% covering production of the item, the cost of RAM or RnD in that, either might look like a lot, or like very little, depending on one's opinion.


a) because he keeps pushing the idea of "bad memory management" and shifting the "guilt" for AMD's memory being somewhat low in size onto developers, while he (to me it is apparent) knows jack about it. Game developers in all cases tend to compress, combine and minimise resources for performance reasons already. The same goes for memory usage. Games are probably the only software where developers invest so much time in memory optimisation, exactly because of performance. It is manufacturers that keep VRAM amounts at bay, due to cost and the technical problems more would generate. And still, VRAM is, most of the time and for almost all users, not a problem.
Quote:


> Originally Posted by *CataclysmZA*
> 
> He might just not be putting the point across that Nvidia's decision to sub-divide Maxwell into 128 CUDA cores per SMM yielded higher efficiency and throughput than Kepler's 192 CUDA cores per SMX. If that allows Nvidia to eke out more performance because they've cut out any chance of CUDA cores being idle, then that's definitely a welcome change. I recall that people grumbled about this exact thing back in early 2013 with the HD7970 launch.


Latency is what keeps a lot of AMD's theoretical performance low in some segments.
Quote:


> Originally Posted by *STEvil*
> 
> You save power and decrease PCB complexity.
> 
> HBM is the future, just wait until it arrives on CPU's and in laptops.


That I want to see. Will be very, very nice.


----------



## iLeakStuff

Quote:


> Originally Posted by *NuclearPeace*
> 
> I don't get all the hype over HBM. The one that I find fascinating is that by locating the memory so close to the GPU you are saving a lot of energy. Someone pointed out that a Fury X only has a whopping 5% more GB/s per SP than a 290X, and that is probably negated by the 390X 6Gbps stock memory. As of right now, its interesting and fantastic as a technological leap forward but strictly from the numbers nothing much changes.
> 
> Other than that I don't like how AMD went about making the Fury X and that comes down to the mandated AIO. I don't like AIOs and the small controversy about release samples having pump whine and buzzing and reports of the 980 Ti Hybrid from EVGA having the same problems definitely didn't help my perception. It feels like they are trying to emulate the "luxury" feeling of the Titan and I really wish AMD would not because the Titan to me isn't something you want to emulate (locked down to a blower cooler, $1000 robbery, etc...). I get that they were trying to keep it in a smaller form factor and keep the power consumption low by keeping it cool (which lowers leakage), but I also want choice. If they offered a Fury X with aftermarket air coolers for $550 this card would be amazing. However at $650, it feels like I am forced to pay extra for an AIO cooler which is something I don't like in a card.


HBM didn't turn out to be quite the GDDR5 killer it was supposed to be at 4K resolutions and such. It failed to deliver there, which goes to show that current graphics cards, even the GM200/Fury X behemoths, don't have enough brute power to fill up the roads and don't need bigger roads.
The benefit I see so far is reduced power consumption, but that's only about 20W or so vs. GDDR5, and a smaller PCB, which doesn't cut costs much for us customers anyway because PCB is dirt cheap. On the other hand you can build smaller graphics cards, which lets you build micro-ATX beasts, so that's one positive outcome.

I think we need Pascal or the 400 series on 16nm FinFETs, with vastly more transistors than today and the brute power to really start harvesting the power of HBM. Those graphics cards may even have 8192-bit HBM2, which will perhaps put out roughly 1000GB/s.

That said, HBM is the natural successor to GDDR5 no matter how you look at it. Not just consumer GPUs will benefit from it, but also enterprise products, which run much more bandwidth-intensive tasks and need both the bandwidth and the power savings. And products like APUs, where you get much faster memory than sharing DDR3 system memory or using a small 512kB L2 cache.
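As a sanity check on those bandwidth figures: peak memory bandwidth is just bus width times per-pin data rate, divided by 8 bits per byte. A minimal sketch (the 8192-bit HBM2 configuration is the post's speculation, not an announced product):

```python
def mem_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Peak bandwidth in GB/s: total pins * per-pin rate (Gbit/s) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# Fury X: 4096-bit HBM1 at 1 Gbps per pin (500 MHz DDR)
print(mem_bandwidth_gbs(4096, 1.0))  # 512.0 GB/s

# Speculative 8192-bit HBM2 at the same per-pin rate, as in the post
print(mem_bandwidth_gbs(8192, 1.0))  # 1024.0 GB/s
```

Which is roughly where the "1000GB/s" guess comes from; HBM2 could also get there by doubling the per-pin rate instead of the bus width.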


----------



## Casey Ryback

Quote:


> Originally Posted by *iLeakStuff*
> 
> HBM didnt turn out to be quite the GDDR5 killer it was suppose to be in 4K resolutions and such. It failed to deliver there
> 
> The benefit I see so far is reducing power consumption, but thats only like 20W or so vs GDDR5.


Shouldn't have jumped on board the hype train then, lol. AMD never said it was a GDDR5 'killer'; they said it was the way of the future, and it is clearly superior to GDDR5, having proven to be smaller and more efficient.

How would the 4096-shader Fury X use only 10W more while gaming than a 290X (2816 shaders) if HBM only saved 20W?

http://hexus.net/tech/reviews/graphics/84170-amd-radeon-r9-fury-x/?page=13

Where did you get the 20W figure from?


----------



## iLeakStuff

Quote:


> Originally Posted by *Casey Ryback*
> 
> Shouldn't have jumped on board the hype train then lol, AMD never said it was a DDR5 'killer', they said it was the way of the future and it is clearly superior to DDR5, it's been proven by being smaller and more efficient.
> 
> How would the 4096 processor fury, only use 10W more whilst gaming over a 290X (2816 processors), if it only saved 20W?
> 
> http://hexus.net/tech/reviews/graphics/84170-amd-radeon-r9-fury-x/?page=13
> 
> Where did you get the 20W info from?


Try to keep up, Casey.

Here, just this one page will explain where the power reductions came from and what AMD gained by using HBM instead of GDDR5:
http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/5


----------



## provost

Quote:


> Originally Posted by *CataclysmZA*
> 
> He might just not be putting the point across that Nvidia's decision to sub-divide Maxwell into 128 CUDA cores per SMM yielded higher efficiency and throughput than Kepler's 192 CUDA cores per SMX. If that allows Nvidia to eke out more performance because they've cut out any chance of CUDA cores being idle, then that's definitely a welcome change. I recall that people grumbled about this exact thing back in early 2013 with the HD7970 launch.


I am not denying that there are architectural improvements over GK110. What I am calling "black magic" is that there is no way a smaller Maxwell chip beats the top-end GK110 Kepler on the same 28 nm node without controlled driver performance artificially skewing the gap.

This is my judgement and opinion, based on how I understand it, however limited that understanding may be on a technical level. It just does not pass the smell test, and thus I call it "voodoo magic", or creating performance out of thin air... Lol
I will stick to my opinion and others can believe whatever they want. Time always proves to be a fair arbiter of such judgment calls, and my bet is that Maxwell's performance will drop like a rock once Pascal hits. Why? Because so much of the performance is being controlled by drivers rather than the hardware itself.
As they used to say, software comes and goes, but hardware is forever...









So, let's see how this turns out, shall we... Lol


----------



## Casey Ryback

Quote:


> Originally Posted by *iLeakStuff*
> 
> Try to keep up Casey


I already knew about the power-limiting tweaks and of course the thermal savings on a heavily cooled card. I didn't know the actual numbers though; a little surprising actually.

Pretty amazing they saved that much power, with only 20-30W of it directly HBM-related.

Still not sure why people are so disappointed with AMD; they are doing good things.

Thanks for the link.


----------



## magnek

There's more to it than just power saved from going with HBM:
Quote:


> Another advantage of HBM is that it requires substantially less die space on the host GPU than GDDR5. The physical interfaces, or PHYs, on the chip are simpler, saving space. The external connections to the interposer are arranged at a much finer pitch than they would be for a conventional organic substrate, which means a more densely packed die. Macri hinted that even the data flow inside the GPU itself could be optimized to take advantage of data coming in "in a very concentrated hump."
> 
> Macri did say that GDDR5 consumes roughly one watt per 10 GB/s of bandwidth. That would work out to about 32W on a Radeon R9 290X. If HBM delivers on AMD's claims of more than 35 GB/s per watt, then Fiji's 512 GB/s subsystem ought to consume under 15W at peak. A rough savings of 15-17W in memory power is a fine thing, I suppose, but it's still only about five percent of a high-end graphics cards's total power budget. Then again, the power-efficiency numbers Macri provided only include the power used by the DRAMs themselves. The power savings on the GPU from the simpler PHYs and such may be considerable.


http://techreport.com/review/28294/amd-high-bandwidth-memory-explained/2
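Macri's rules of thumb in that quote are easy to check with a quick sketch. These are the article's figures for DRAM power only, not total board power:

```python
def gddr5_power_w(bandwidth_gbs):
    # Quoted rule of thumb: GDDR5 burns roughly 1 W per 10 GB/s of bandwidth
    return bandwidth_gbs / 10

def hbm_power_w(bandwidth_gbs):
    # AMD's claim: HBM delivers more than 35 GB/s per watt
    return bandwidth_gbs / 35

print(gddr5_power_w(320))           # R9 290X at 320 GB/s -> 32.0 W
print(round(hbm_power_w(512), 1))   # Fury X at 512 GB/s  -> 14.6 W
```

That reproduces the article's "about 32W" and "under 15W" numbers, a saving of roughly 15-17W before counting the simpler PHYs on the GPU side.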


----------



## 364901

Quote:


> Originally Posted by *iLeakStuff*
> 
> HBM didnt turn out to be quite the GDDR5 killer it was suppose to be in 4K resolutions and such. It failed to deliver there and goes to show that current graphic cards, even the GM200/FuryX behemoths doesnt have enough brute power to fill up the roads and doesnt need bigger roads.
> The benefit I see so far is reducing power consumption, but thats only like 20W or so vs GDDR5. And reducing PCB size, which doesnt cut cost much for us customers anyway because PCB is dirt cheap. On the other hand you can build smaller graphic cards though which let you build micro ATX beasts, so thats one positive outcome from it.


What HBM ultimately showed is that it allows AMD to build a very big, wide and dense engine without increasing the die size beyond Nvidia's max for GM200. Fiji demonstrates that the GCN architecture first seen in Hawaii isn't ROP-limited, nor was it bandwidth constrained. Kepler, on the other hand, clearly benefited from having more bandwidth and we saw this with the GTX 770 launch. Having memory with lower latency is what GCN needs to shine brighter. Also, you'll notice that Fury X's performance at 1080p and below doesn't stack up very well with GTX 980, which is down to the card simply not having a big enough workload to put all its execution units to work.

IMHO, this is probably why Crossfire scaling is so magnificent. Not only is AMD close to saturating a PCI-Express 3.0 PLX chip on dual Fiji, they also have both GPUs with low latency memory connections. We don't need HBM v2 to realise the benefits of HBM, we just need GPUs that have wider engines and more horsepower.


----------



## Themisseble

Fury X stacking VRAM in CF mode??!! Can someone do a test

https://www.youtube.com/watch?v=XJYWXHOUoFY


----------



## iLeakStuff

Quote:


> Originally Posted by *magnek*
> 
> There's more to it than just power saved from going with HBM:
> http://techreport.com/review/28294/amd-high-bandwidth-memory-explained/2


Quote:


> Originally Posted by *CataclysmZA*
> 
> What HBM ultimately showed is that it allows AMD to build a very big, wide and dense engine without increasing the die size beyond Nvidia's max for GM200. Fiji demonstrates that the GCN architecture first seen in Hawaii isn't ROP-limited, nor was it bandwidth constrained. Kepler, on the other hand, clearly benefited from having more bandwidth and we saw this with the GTX 770 launch. Having memory with lower latency is what GCN needs to shine brighter. Also, you'll notice that Fury X's performance at 1080p and below doesn't stack up very well with GTX 980, which is down to the card simply not having a big enough workload to put all its execution units to work.
> 
> IMHO, this is probably why Crossfire scaling is so magnificent. Not only is AMD close to saturating a PCI-Express 3.0 PLX chip on dual Fiji, they also have both GPUs with low latency memory connections. We don't need HBM v2 to realise the benefits of HBM, we just need GPUs that have wider engines and more horsepower.


Thanks guys. This is something I havent read about before. Very interesting


----------



## sugarhell

Quote:


> Originally Posted by *Themisseble*
> 
> Fury X stacking VRAM in CF mode??!! Can someone do a test
> 
> https://www.youtube.com/watch?v=XJYWXHOUoFY


It is not. Crossfire always doubles the reported VRAM but just mirrors everything. It is just a sensor "bug".


----------



## Themisseble

Quote:


> Originally Posted by *sugarhell*
> 
> It is not. Always crossfire doubles the vram but just mirror everything. It is just a sensor "bug"


Sensor bug?! Probably a bug, but not a sensor bug. The only thing I noticed was no stuttering in 4K Shadow of Mordor and a few other titles. Why is it showing memory usage higher than 4GB, and why is it combining both of them together? So let's say it's a software bug; it should be researched. But then again, the software recognizes 2 cards, so why is it showing 1 memory usage?


----------



## sugarhell

Quote:


> Originally Posted by *Themisseble*
> 
> sensor bug?!


Every single Crossfire setup doubles the VRAM on the sensors of AB or FRAPS.


----------



## Themisseble

Quote:


> Originally Posted by *sugarhell*
> 
> Every single crossfire doubles the vram on the sensors of AB or fraps.


Nope.
Afterburner shows 2 GPUs, just like a GTX 980 Ti setup shows 2 VRAM usages, or an R9 290X does.

You can see it with GTX 980 Ti SLI showing 2x the VRAM usage, same as any other CF/SLI setup. This is the first time I've seen that "bug".
I think, same as you, that it has to be a bug.


----------



## sugarhell

Quote:


> Originally Posted by *Themisseble*
> 
> Nope.
> Afternburner shows 2 GPUs- same like GTX 980Ti showing 2 VRAms usages or same as R9 290X showing 2 VRAMs usages.
> 
> You can see it with GTX 980Ti SLI.. showing 2x of VRAM usage or any other CF/SLI. This is first time that i see that "bug".
> I think same as you it has to be a bug.


I have run Crossfire systems for over 6 years. For nVidia it shows the SLI limit, meaning the normal usage. For AMD it shows double the VRAM.

For example, a 980 SLI system will show 4GB of VRAM.
For a 290X Crossfire setup it will show 8GB.

VRAM stacking with AFR is impossible.
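The doubled readings are consistent with how AFR works: each GPU renders alternate frames and keeps a full mirror of the working set, so an overlay that sums per-GPU counters double-counts. A hypothetical sketch (function names and numbers are illustrative, not any tool's actual API):

```python
def overlay_reading_mb(per_gpu_usage_mb):
    """What a naive overlay shows if it sums the per-GPU usage counters."""
    return sum(per_gpu_usage_mb)

def usable_pool_mb(per_gpu_capacity_mb):
    """Under AFR every GPU mirrors the full working set, so the usable
    pool is the smallest single card's VRAM, not the sum."""
    return min(per_gpu_capacity_mb)

# Two Fury X cards, ~3900 MB in use on each:
print(overlay_reading_mb([3900, 3900]))  # 7800 -> looks like "stacking"
print(usable_pool_mb([4096, 4096]))      # 4096 -> what games can actually use
```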


----------



## Spenning

Hmm, I'm not sure which would be best to go for, Fury X or 980 Ti. The 980 Ti is 7300 NOK (around 915 USD) but the Fury X is 6999 NOK (876 USD). Also I'm not sure the 980 Ti will fit in my case, so which should I go for?


----------



## p4inkill3r

Quote:


> Originally Posted by *Spenning*
> 
> Uhm so I'm not sure what would be the best to go for, Fury x or 980 ti? the 980ti is 7300NOK (around 915 USD) but the Fury x is 6999 nok (876 USD) . Also I'm not sure if the 980 ti will fit in my case, so what to go for?


Sounds like the Fury would be the correct choice, Tjommi


----------



## ToTheSun!

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spenning*
> 
> Uhm so I'm not sure what would be the best to go for, Fury x or 980 ti? the 980ti is 7300NOK (around 915 USD) but the Fury x is 6999 nok (876 USD) . Also I'm not sure if the 980 ti will fit in my case, so what to go for?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sounds like the Fury would be the correct choice, Tjommi
Click to expand...

Or not









Perhaps he should check if the 980 Ti actually fits in his case first. Then he should decide if the increased performance and features are worth the $40. The way I see it, if you're spending $800+ on a GPU, the extra performance is well worth $40, but that's my opinion.


----------



## Tivan

Quote:


> Originally Posted by *ToTheSun!*
> 
> Or not
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Perhaps he should check if the 980ti actually fits in his case first. Then, he should decide if the increased performance and features are worth $40. The way i see it, if you're spending 800+ on a GPU, the extra performance is well worth $40, but that's my opinion.


Not to forget the AMD features.

But yeah performance is a pretty big factor!


----------



## mav451

Quote:


> Originally Posted by *iLeakStuff*
> 
> Try to keep up Casey
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here, just this one page will explain where power reductions came from and what they gained using HBM instead of GDDR5
> http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/5


I glossed over this briefly, but after reading it again, I noticed how Anandtech just casually highlights a forum member's work on a main page article








Pretty cool.
Quote:


> An example of the temperature versus power consumption principle on an Intel Core i7-2600K. Image Credit: AT Forums User "Idontcare"


So when are OCN users going to get credited as part of hardware reviews? This can't be the first time a forum user has had their work highlighted on front page articles haha.


----------



## blue1512

Quote:


> Originally Posted by *Spenning*
> 
> Uhm so I'm not sure what would be the best to go for, Fury x or 980 ti? the 980ti is 7300NOK (around 915 USD) but the Fury x is 6999 nok (876 USD) . Also I'm not sure if the 980 ti will fit in my case, so what to go for?


You should also consider that buying nVidia now means a G-Sync monitor later ($200 more for 4K). Sticking with AMD and helping FreeSync win the standards war seems the better option. FreeSync will most likely prevail with VESA backing it, btw.

And if you want to keep the card for more than 2 years, the Fury X is the better option. Current nVidia offerings only support resource binding tier 2 at best, and tier 3 is a big advantage in DX12. Also, when Pascal cards with tier 3 and HBM roll out next year, Maxwell will definitely be "Keplered"; not future-proof at all.


----------



## Ceadderman

Considering my 6870s are only now reaching their expiration date, the choice is simple for anyone looking for solid performance while limited by budgetary constraints.

Fury @ $650
980 @ $650 + monitor(s)

I purchase for reasonable or better gaming quality. FPS is great and all, but anything over 60fps is not gonna be missed, especially if there is zero stuttering above that. The human eye won't notice the difference. That's not in any way suggesting that one should intentionally limit one's gaming experience and stick to 1080p gaming.

If you can afford 4K gaming and that's your goal, then nVidia may be the way to go, given the extra performance for $40.

But AMD put their Fury X in a good position to make the choice a difficult one, from what I've read, since 2 cards should give one a reasonably solid 4K experience.









~Ceadder


----------



## dir_d

Has there been a review with 1080p VSR to 4k vs native 4k? I'd like to see how each performs.


----------



## Sheyster

Quote:


> Originally Posted by *Ceadderman*
> 
> I purchase for reasonable or better gaming quality. FPS is great an all but anything over 60fps is not gonna be missed. Especially if there is zero stuttering above that. The human eye won't notice the difference. That's not in any way suggesting that one should intentionally limit ones gaming experience and stick to 1080p gaming.


I'm not sure if I'm understanding you correctly. I can certainly tell the difference between 60 and 120/144 Hz refresh gaming at high FPS. Many others here can as well.


----------



## dmasteR

Quote:


> Originally Posted by *Ceadderman*
> 
> Considering my 6870s are only now reaching their expiration date, the choice is simple for anyone looking for solid performance but are limited by their budgetary constraints.
> 
> Fury @$650
> 980 @$650 + monitor(s)
> 
> I purchase for reasonable or better gaming quality. FPS is great an all but anything over 60fps is not gonna be missed. Especially if there is zero stuttering above that. The human eye won't notice the difference. That's not in any way suggesting that one should intentionally limit ones gaming experience and stick to 1080p gaming.
> 
> If you can afford 4k gaming and that's your goal, then nVidia may be the way to go, given the extra performance for $40.
> 
> But AMD put their Fury X in good position to make the choice a difficult one, from what I've read. Since 2 cards should give one a reasonably solid 4k experience.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Human eyes do notice above 60fps. There's a reason people like myself, after playing at 100+ Hz/fps for so long, would gouge our eyes out if we had to play at 60fps/60Hz. It feels awful in terms of input lag, and it's very blurry when panning your mouse.
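The input-lag side of that is simple arithmetic: each refresh interval lasts 1000 / Hz milliseconds, so higher refresh rates shrink the worst-case wait for a new frame to reach the screen:

```python
def frame_time_ms(refresh_hz):
    # Time between display refreshes, in milliseconds
    return 1000.0 / refresh_hz

print(round(frame_time_ms(60), 1))   # 16.7 ms per refresh at 60 Hz
print(round(frame_time_ms(144), 1))  # 6.9 ms per refresh at 144 Hz
```

Going from 60 Hz to 144 Hz cuts roughly 10 ms off each refresh interval, which is part of why high-refresh panels feel more responsive even before motion clarity is considered.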


----------



## iSlayer

Quote:


> Originally Posted by *PostalTwinkie*
> 
> ......
> 
> LN2 it - someone unlocked it already.


HOT
Quote:


> Originally Posted by *NuclearPeace*
> 
> I don't get all the hype over HBM. The one that I find fascinating is that by locating the memory so close to the GPU you are saving a lot of energy. Someone pointed out that a Fury X only has a whopping 5% more GB/s per SP than a 290X, and that is probably negated by the 390X 6Gbps stock memory. As of right now, its interesting and fantastic as a technological leap forward but strictly from the numbers nothing much changes.
> 
> Other than that I don't like how AMD went about making the Fury X and that comes down to the mandated AIO. I don't like AIOs and the small controversy about release samples having pump whine and buzzing and reports of the 980 Ti Hybrid from EVGA having the same problems definitely didn't help my perception. It feels like they are trying to emulate the "luxury" feeling of the Titan and I really wish AMD would not because the Titan to me isn't something you want to emulate (locked down to a blower cooler, $1000 robbery, etc...). I get that they were trying to keep it in a smaller form factor and keep the power consumption low by keeping it cool (which lowers leakage), but I also want choice. If they offered a Fury X with aftermarket air coolers for $550 this card would be amazing. However at $650, it feels like I am forced to pay extra for an AIO cooler which is something I don't like in a card.


It's one potential future for memory tech. For GPUs, it's likely THE future of tech.
Quote:


> Originally Posted by *STEvil*
> 
> You save power and decrease PCB complexity.
> 
> HBM is the future, just wait until it arrives on CPU's and in laptops.


Also die size is saved from not requiring GDDR5 memory controllers.
Quote:


> Originally Posted by *Liranan*
> 
> I'm sure you'd have said the same thing when DDR5 was released by AMD. The 4870 was equal to the 260 at launch but by the end of its life it was equal to the 280 because nVidia had moved on and had stopped caring about their older generation.
> 
> Of course this doesn't excuse AMD and I am disappointed in the Fluffy X because on paper it's spectacular. Saying that I had a 4870 and was very pleased with it for the years I had it.


The 4870 was one sexy thing. Really needed 1GB variant though, that one shined brightest.
Quote:


> Originally Posted by *Ceadderman*
> 
> We'll see if team Greenies have something to say when x2 launches. So far they're all parroting the same incredible BS.
> 
> Frame rates higher? Sure. But you won't notice it without some software for counter. Drop the counter and tell me if you can *SEE* a difference. Truthfully you can't.


Wot
Quote:


> More VRAM on 980/980ti and Titan? Check. But again you won't see much of a difference if any compared to Fury. So in effect, who gives a rip?


Well, we've already seen the Fury X running out of VRAM and taking significant dips in minimum FPS, which doesn't bode well for 4K. That said, I'm curious how it does at 1080p/1440p; if stock CF Fury X's have the edge on the 980 Ti there, they'd be sweet for people pushing 1080p+/120Hz+.
Quote:


> If you're a fan of team Green, that's fine with me. Has Titan Z is currently a $1600 card. 980ti is in the same price point and stock vs stock Fury is just fine. It's early yet and AMD is still on their first Beta driver.


And 980 Ti isn't even a month old yet.
Quote:


> So imho, they're at the correct price point because while nVidia has them in epeen statistically, Fury is competing well against them IRL gaming situations and *that's* how games are meant to be played.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


The Fury X is performing (slightly) worse but that doesn't matter.

u wot
Quote:


> Originally Posted by *NuclearPeace*
> 
> Why does everyone need to be put into "team red" or "team green"? The amount of fanboyism and us vs them on this forum is getting to absurd levels. It's becoming less and less enjoyable to discuss hardware on this forum now.


Ceadder is...something. Let's just say he's either not thinking this through or his thoughts are being influenced.
Quote:


> Originally Posted by *Ceadderman*
> 
> Nothing wrong with it imho. If "team" doesn't apply to you because you're using both at some time or other, simply ignore it and carry on.


You do know that AMD and Nvidia want us to do this stupid Green/Red team nonsense, right? It encourages brand loyalty, so you'll just buy from them regardless of whether they're best.
Quote:


> I am not saying it to be disrespectful toward anybody. I am saying it because my bleeding autocorrect on my S4 is a *********. Case in point nVidia becomes "no idea" because sometimes I forget to tap the arrow before hitting the space.


Google Keyboard
Quote:


> Originally Posted by *magnek*
> 
> There's more to it than just power saved from going with HBM:
> http://techreport.com/review/28294/amd-high-bandwidth-memory-explained/2


The GDDR5 memory controllers are where the bulk of the energy savings come from.
Quote:


> Originally Posted by *dir_d*
> 
> Has there been a review with 1080p VSR to 4k vs native 4k? I'd like to see how each performs.


http://www.gamersnexus.net/news-pc/1727-amd-catalyst-omega-update-vsr-vs-dsr

Best I can do, though from what I've seen VSR generally scales better than DSR. Both cost more performance than native 4K.


----------



## magnek

Man do you enjoy writing books or something?









You very much remind me of another poster on a certain laptop oriented forum, although I'm quite sure you're not that guy.


----------



## 364901

Quote:


> Originally Posted by *dir_d*
> 
> Has there been a review with 1080p VSR to 4k vs native 4k? I'd like to see how each performs.


I can do one once I have a UHD 4K monitor to work with. That won't come soon or cheap though.


----------



## iSlayer

Quote:


> Originally Posted by *magnek*
> 
> Man do you enjoy writing books or something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You very much remind me of another poster on a certain laptop oriented forum, although I'm quite sure you're not that guy.


I'm not.

I haven't been around to make posts, so i'm doing a bunch of catch up in a lot of places.


----------



## dir_d

Quote:


> Originally Posted by *CataclysmZA*
> 
> I can do one once I have a UHD 4K monitor to work with. That won't come soon or cheap though.


That would be nice, thx


----------



## Themisseble

VR
https://www.youtube.com/watch?v=fLlzb9KEbDY


----------



## BoredErica

Quote:


> Originally Posted by *mav451*
> 
> I glossed over this briefly, but after reading it again, I noticed how Anandtech just casually highlights a forum member's work on a main page article
> 
> 
> 
> 
> 
> 
> 
> 
> Pretty cool.
> So when are OCN users going to get credited as part of hardware reviews? This can't be the first time a forum user has had their work highlighted on front page articles haha.


God dam.

That just reminds me how I should've been on an article for punching people who believe in 1:1 cache ratio parity for Haswell OCing in the face!

Quote:


> Originally Posted by *magnek*
> 
> Man do you enjoy writing books or something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You very much remind me of another poster on a certain laptop oriented forum, although I'm quite sure you're not that guy.


wat

Combined, his posts fit in one paragraph. Contrast: I read all 600 pages in some news thread and replied to every post I wanted to reply to, and not a single person responded to those posts. Or read them.


----------



## rt123

Quote:


> Originally Posted by *mav451*
> 
> I glossed over this briefly, but after reading it again, I noticed how Anandtech just casually highlights a forum member's work on a main page article
> 
> 
> 
> 
> 
> 
> 
> 
> Pretty cool.
> So when are OCN users going to get credited as part of hardware reviews? This can't be the first time a forum user has had their work highlighted on front page articles haha.


If you had been on the AT forums for a while, you would know that the "user" in the spotlight here has been a very well-known member of that community for years (10+ years, from what I remember), and did I mention that he actually works in the semiconductor industry?

He always uses detailed, scientific methods in the tests he conducts, and some of his work is arguably superior to the work done by the "journalists" of various review sites. So using his work in the article was fine. Very few members here on OCN have that kind of background.


----------



## SpeedyVT

Quote:


> Originally Posted by *Casey Ryback*
> 
> Shouldn't have jumped on board the hype train then lol, AMD never said it was a DDR5 'killer', they said it was the way of the future and it is clearly superior to DDR5, it's been proven by being smaller and more efficient.
> 
> How would the 4096 processor fury, only use 10W more whilst gaming over a 290X (2816 processors), if it only saved 20W?
> 
> http://hexus.net/tech/reviews/graphics/84170-amd-radeon-r9-fury-x/?page=13
> 
> Where did you get the 20W info from?


He's wrong to say it isn't quite the killer; someone already overclocked the memory by 20% alongside a 9% core OC, gaining a whopping 20% over stock. It's a killer in the sense that it allows all parts to sit under one hood.

Most of the power consumption is in the pump + cores. When we get a fanned model we'll see it drop well below the 290X; I'm guessing 10-20% less.


----------



## Apokalipse

According to the Anandtech article, the watercooling is there partially to reduce power consumption: higher temperatures mean more current leakage, which means more power draw.
The Nano cards will likely be highly binned dies.


----------



## Sashimi

Quote:


> Originally Posted by *Sheyster*
> 
> I'm not sure if I'm understanding you correctly. I can certainly tell the difference between 60 and 120/144 Hz refresh gaming at high FPS. Many others here can as well.


Thread's been quiet for a while so I thought I'd reply. Yes, human eyes can definitely see beyond 60 fps. It is especially easy to notice on fast movements. In fact, the difference is apparent if you simply move your mouse left to right on the desktop on a high refresh rate monitor as opposed to a normal 60Hz monitor. Obviously it also depends on how sensitive the user's eyes are to movement, but I dare say most people can perceive beyond 60Hz even at just medium speeds.
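The math behind that is simple enough to sketch (an illustrative script; the 1920 px/s cursor speed is just an assumed example, not a figure from any post here):

```python
# Frame time and per-frame cursor jump at various refresh rates.
# The cursor speed (pixels/second) is an assumed example value.

def frame_time_ms(hz):
    """Time each refresh stays on screen, in milliseconds."""
    return 1000.0 / hz

def cursor_jump_px(hz, speed_px_per_s=1920):
    """How far a moving cursor travels between consecutive refreshes."""
    return speed_px_per_s / hz

for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: {frame_time_ms(hz):5.1f} ms/frame, "
          f"cursor jumps {cursor_jump_px(hz):5.1f} px/frame")
```

At 60Hz a cursor crossing a 1920px screen in one second jumps 32px per refresh; at 144Hz only ~13px, which is part of why the motion looks so much smoother.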


----------



## tconroy135

Quote:


> Originally Posted by *Sashimi*
> 
> Thread's been quiet for a while so I thought I'll reply. Yes human eyes can definitely see beyond 60 fps. It is especially easy to notice on fast movements. In fact the difference is apparent if you simply move your mouse left to right on the desktop on a high refresh rate monitor as oppose to a normal 60hz monitor. Obviously it also depends on how sensitive the user's eyes are to movements, but I dare say most people can perceive beyond 60hz even when viewing just medium speed.


I think people who say Humans can't see past 60fps are playing at 100fps on a 60Hz monitor...


----------



## iSlayer

Quote:


> Originally Posted by *tconroy135*
> 
> I think people who say Humans can't see past 60fps are playing at 100fps on a 60Hz monitor...


I can definitely see a difference between 100 fps and 60 fps @ 60Hz.


----------



## Sashimi

Quote:


> Originally Posted by *tconroy135*
> 
> I think people who say Humans can't see past 60fps are playing at 100fps on a 60Hz monitor...


Lol good one.

I should point out alcohol also affects perceivable refresh rate not to mention input lag hahahaha...


----------



## Thoth420

Quote:


> Originally Posted by *Sashimi*
> 
> Thread's been quiet for a while so I thought I'll reply. Yes human eyes can definitely see beyond 60 fps. It is especially easy to notice on fast movements. In fact the difference is apparent if you simply move your mouse left to right on the desktop on a high refresh rate monitor as oppose to a normal 60hz monitor. Obviously it also depends on how sensitive the user's eyes are to movements, but I dare say most people can perceive beyond 60hz even when viewing just medium speed.


I can't believe this is still a debate....and looking at my sig rig...you know my standpoint.


----------



## Casey Ryback

Quote:


> Originally Posted by *Thoth420*
> 
> I can't believe this is still a debate....and looking at my sig rig...you know my standpoint.


MG279Q









And I agree this debate sucks it always goes on forever.


----------



## Sashimi

Quote:


> Originally Posted by *Thoth420*
> 
> I can't believe this is still a debate....and looking at my sig rig...you know my standpoint.


Great buy on the Asus MG279Q man. I bought a Yamakasi Catleap 2B extreme OC a while back. I don't mind doing the OC myself, but I still wish it were a mainstream brand with a proper warranty; none were available at the time. Nevertheless, it's one of my most treasured pieces of hardware.
Quote:


> Originally Posted by *Casey Ryback*
> 
> MG279Q
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I agree this debate sucks it always goes on forever.


Lol any debate is better than a dead thread. Been 2 days since it was updated haha...just keeping the conversation going.


----------



## Thoth420

Thanks guys. I went through a few of the Swifts on release and never got a good one. Then got a cherry Acer G-Sync XB270HU, but decided I wanted to try the Fury X. Still have yet to test the MG but can't wait.

I can only stand 60hz for media and maybe light browsing...


----------



## edo101

And now I hear from users here that they are getting 15% more performance from Fury cards with the latest 15.7. Oh well, it's not like the cards were supposed to get better with drivers. We have to buy our next cards now, so we can get that performance boost, which AMD will match a couple of weeks later, with better driver support down the road.









OCN logic here is stupefying when it comes to upgrades


----------



## BoredErica

Quote:


> Originally Posted by *Thoth420*
> 
> Thanks guys. I went through a few of the swifts on release ended up never getting a good one. Then got a cherry Acer G sync xb270hu but decided I wanted to try the fury x. Still have yet to test the mg yet but can't wait.
> 
> I can only stand 60hz for media and maybe light browsing...


I've never seen above 60hz before.


----------



## Sashimi

Quote:


> Originally Posted by *Darkwizzie*
> 
> I've never seen above 60hz before.


Time to take the plunge
Quote:


> Originally Posted by *edo101*
> 
> And now I hear from users here that they are getting 15% more performance from Fury cards with the latest 15.7. Oh well its not like the cards were supposed to get better with drivers. We have to buy our next cards now so we can get that performance boost that will be matched by AMD a couple of weeks later and with better driver support down the road
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OCN logic here is stupefying when it comes to upgrades


15%? That would be pretty awesome!


----------



## edo101

Yeah, but don't take my word for it, check it out. CFX is fixed (no more CFX flickering, especially in Witcher 3), VSR is available, and a bunch of other cool stuff.

And the driver updates will become more steady as I predicted because Fury X is now out and AMD engineers can catch a break.

But 15% is what I am hearing all around. And all you had to do was wait two weeks after the release.


----------



## Blackops_2

I'm almost more interested in DX12 results coming at the end of the month honestly. Though i hope they did get a 15% increase somehow. Still waiting on OC results with volts.


----------



## edo101

Quote:


> Originally Posted by *Blackops_2*
> 
> I'm almost more interested in DX12 results coming at the end of the month honestly. Though i hope they did get a 15% increase somehow. Still waiting on OC results with volts.


Not sure if that'll help with all the non DX 12 games we have. I do know DX 11 performance has increased and lower res performance has also increased with 15.7


----------



## Sashimi

If 15% is true then that would firmly place the Fury X ahead of the 980 Ti wouldn't it? It's good news all around.


----------



## mcg75

Quote:


> Originally Posted by *Sashimi*
> 
> If 15% is true then that would firmly place the Fury X ahead of the 980 Ti wouldn't it? It's good news all around.


15% would put the Fury X ahead of Titan X in most cases not just 980 Ti.

But don't you think if this were true, AMD would have at least noted Fury X performance gains in the 15.7 driver notes?

15% gain would completely change the high end gpu landscape in AMD's favor. We wouldn't be talking about hearsay from random forum users, AMD would be asking reviewers to retest this and it would be a huge news story.


----------



## BoredErica

Quote:


> Originally Posted by *mcg75*
> 
> 15% would put the Fury X ahead of Titan X in most cases not just 980 Ti.
> 
> But don't you think if this were true, AMD would have at least noted Fury X performance gains in the 15.7 driver notes?
> 
> 15% gain would completely change the high end gpu landscape in AMD's favor. We wouldn't be talking about hearsay from random forum users, AMD would be asking reviewers to retest this and it would be a huge news story.


True. And if it really is 15% gains on average, AMD dun goofed by either releasing the Fury X so early that their drivers weren't done, or releasing the drivers so late that the Fury X had already been released and benchmarked. Most people have already formed their opinion.


----------



## Outlaw4lf

It could have been in one game, and maybe at 4K, something like Crysis 3... maybe it went from 18 fps to 21 fps, which is ~17% at 4K. Has anybody done before-and-after benchmarks from Omega or 15.2 to 15.7?


----------



## iLeakStuff

Quote:


> Originally Posted by *edo101*
> 
> And now I hear from users here that they are getting 15% more performance from Fury cards with the latest 15.7. Oh well its not like the cards were supposed to get better with drivers. We have to buy our next cards now so we can get that performance boost that will be matched by AMD a couple of weeks later and with better driver support down the road
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OCN logic here is stupefying when it comes to upgrades


15%+ performance? Yeah right.

If you are referring to the latest driver, it improves CPU overhead. Nothing to do with the GPU.









The GPU score from Firestrike improved 1% from the last driver.








https://www.reddit.com/r/AdvancedMicroDevices/comments/3cmnl5/catalyst_157_overhead_improvements_in_dx11_over/csx3emk


----------



## p4inkill3r

Quote:


> Originally Posted by *edo101*
> 
> And now I hear from users here that they are getting 15% more performance from Fury cards with the latest 15.7. Oh well its not like the cards were supposed to get better with drivers. We have to buy our next cards now so we can get that performance boost that will be matched by AMD a couple of weeks later and with better driver support down the road
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OCN logic here is stupefying when it comes to upgrades


If you're speaking of the Shadow of Mordor benches I posted, I believe I borked the settings on my follow up runs. 15.7 results are the same in SOM as they were on the previous drivers @ 1440p.


----------



## Redwoodz

Quote:


> Originally Posted by *mcg75*
> 
> 15% would put the Fury X ahead of Titan X in most cases not just 980 Ti.
> 
> But don't you think if this were true, AMD would have at least noted Fury X performance gains in the 15.7 driver notes?
> 
> 15% gain would completely change the high end gpu landscape in AMD's favor. We wouldn't be talking about hearsay from random forum users, AMD would be asking reviewers to retest this and it would be a huge news story.


Quote:


> Originally Posted by *iLeakStuff*
> 
> 15%+ performance? Yeah right.
> 
> If you are referring to the latest driver it improve CPU overhead. Nothing to do with the GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The GPU score from Firestrike improved 1% from the last driver.
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.reddit.com/r/AdvancedMicroDevices/comments/3cmnl5/catalyst_157_overhead_improvements_in_dx11_over/csx3emk


I seem to remember reading the driver release notes with this clearly stated on the page


http://support.amd.com/en-us/download/desktop?os=Windows+8.1+-+64

And furthermore, EK has released the FuryX full cover waterblock.

http://www.techpowerup.com/214114/ek-radeon-r9-fury-x-water-blocks-now-available.html

Nvidia has no answer for this. Unrivaled performance in a single-slot card, best multi-GPU performance. FreeSync and crossfire all working with the latest 15.7 drivers released yesterday. No more pump noise, and when the Fury Pro hits it will be the way to go for multi-GPU setups, for hardly anything compared to the competition, in small form factors even.


----------



## blue1512

5% increase for FuryX in Firestrike graphics score with the new driver
http://www.3dmark.com/compare/fs/5348117/fs/5234795


----------



## Casey Ryback

Quote:


> Originally Posted by *blue1512*
> 
> 5% increase for FuryX in Firestrike graphics score with the new driver
> http://www.3dmark.com/compare/fs/5348117/fs/5234795


Nice.


----------



## mcg75

Quote:


> Originally Posted by *Redwoodz*
> 
> I seem to remember reading the driver release notes with this clearly stated on the page.


Where do you see 15% performance improvements on the release notes?

And no, making crossfire profiles work doesn't count.


----------



## blue1512

Quote:


> Originally Posted by *mcg75*
> 
> Where do you see 15% performance improvements on the release notes?
> 
> And no, making crossfire profiles work doesn't count.


Given that the driver can make a 5% boost for FuryX in Firestrike, a 15% boost in 1080p is achievable for certain titles. FuryX did have strange behavior in lower res at launch after all. However 15% in every circumstance seems unrealistic.


----------



## tconroy135

Quote:


> Originally Posted by *blue1512*
> 
> Given that the driver can make a 5% boost for FuryX in Firestrike, a 15% boost in 1080p is achievable for certain titles. FuryX did have strange behavior in lower res at launch after all. However 15% in every circumstance seems unrealistic.


Quote:


> Originally Posted by *mcg75*
> 
> Where do you see 15% performance improvements on the release notes?
> 
> And no, making crossfire profiles work doesn't count.


I think the real issue is: why release a new product without a solid driver configuration for it? Imagine if the Fury X were beating the Titan X (even without boost) by 2-5%; it would be a boon for AMD. Instead they released a product that can't even be overclocked or have voltage added to it properly.


----------



## Casey Ryback

Quote:


> Originally Posted by *tconroy135*
> 
> I think the real issue is why do release a new product without a solid driver configuration for said product. Imagine if the Fury X was beating the Titan X (even without boost) by 2-5% it would be a boon for AMD. Instead they release a product that can't even be overclocked or have voltage added to it properly.


AMD never releases a card with solid drivers.

Yet their GPUs are always competitive. At one point I almost believed the trolls on OCN who were saying the Fiji cards were garbage; shame on me.

Thinking back, when has AMD actually released a bad-performing GPU in the last 5 years? Fact is, they haven't.

There has been some controversial cooling (R9 200 series, lol), but at the root these products are based on a powerful graphics solution.

There is no doubt in my mind Fury is following suit; there are already driver gains after a couple of weeks, and they are really competing in crossfire scenarios already.

The voltage is not locked on the Fiji cards, never has been. They had to release the card ASAP because the 980 Ti was running amok out there with zero competition.

It's more work to update the overclocking tools to handle GCN voltage control, but it will be unlocked in the coming week or two, from what I've heard about Sapphire TriXX (and I assume MSI AB).


----------



## blue1512

Well, it looks like the boost really is HUGE for the Fury X. An owner on OCN got a ~22% boost in Far Cry 4 at 1440p:
Quote:


> Originally Posted by *royfrosty*
> 
> BOOM!
> 
> This is awesome. The new drivers really worked.
> 
> Farcry 4 on 1440p with all Preset Ultra settings.
> 
> Note that it is just on stock clock. No OC, nor HBM OC.
> 
> 2015-07-09 22:44:11 - FarCry4
> Frames: 7056 - Time: 87953ms - Avg: 80.225 - Min: 62 - Max: 98
> 
> Before on 15.15 drivers.
> 
> 1440P with all Preset Ultra settings.
> 
> 2015-06-26 16:04:54 - FarCry4
> Frames: 5810 - Time: 88234ms - Avg: 65.848 - Min: 51 - Max: 95
> 
> EDIT: Note that the benchmark was taken during the first time when you met Sabal and he asked you to run through the doors to the truck. The benchmark ends when the truck was hit and went off the cliff.
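For reference, the gain in those two FRAPS logs works out to about 22% (a quick check script; the frame counts and times are taken straight from the quoted post):

```python
# Percent FPS gain between the two FRAPS runs quoted above.

def avg_fps(frames, time_ms):
    """Average FPS from a FRAPS 'Frames'/'Time' pair."""
    return frames / (time_ms / 1000.0)

def percent_gain(old_fps, new_fps):
    """Relative improvement of new over old, in percent."""
    return (new_fps - old_fps) / old_fps * 100.0

old = avg_fps(5810, 88234)  # 15.15 driver run
new = avg_fps(7056, 87953)  # 15.7 driver run
print(f"old {old:.1f} fps, new {new:.1f} fps, gain {percent_gain(old, new):.1f}%")
# -> old 65.8 fps, new 80.2 fps, gain 21.8%
```

That matches the averages in the log (65.8 to 80.2 fps), so ~22% rather than 25%, and only for this one scene in this one game.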


----------



## tconroy135

Quote:


> Originally Posted by *Casey Ryback*
> 
> AMD never release a card with solid drivers.
> 
> Yet their GPU's are always competitive. At one point I almost believed the trolls on OCN that were saying the fiji cards were garbage, shame on me.
> 
> Yet I think back to when has AMD actually released bad performing gpu's in the last 5 years? Fact is they haven't.
> 
> There has been some controversial cooling (R9 200 series lol), but at the root of the product they are based upon a powerful graphics solution.
> 
> There is no doubt in my mind fury is following suit and there's already driver gains after a couple of weeks, and they are really competing in crossfire scenarios already.
> 
> The voltage is not locked on the fiji cards, never has been. They had to release the card asap because the 980ti was running a muck out there with zero competition.
> 
> It's more work to update overclocking tools to be able to adjust GCN voltage control, and it will be unlocked in the coming week or two is what I've heard about sapphire trixx (and I assume MSI AB)


I think that by releasing the Fury X without all of these solutions, and especially in the limited quantities that have been sold, all AMD has done is bolster the NVIDIA propaganda that AMD cannot compete at any price point.


----------



## sugalumps

Quote:


> Originally Posted by *blue1512*
> 
> Well, look like the boost is really HUGE for FuryX. An owner on OCN has a 25% boost in Farcry 4 1440p


Proof?


----------



## Casey Ryback

Quote:


> Originally Posted by *tconroy135*
> 
> I think though releasing the Fury X without all of these solutions and especially in the limited quantites that have been sold all AMD has done is bolster that NVIDIA propaganda that AMD cannot compete at any price point.


Yes and no.

On websites like OCN where expectations are huge, you're right.

On others I've noticed different views though.

They are also sold out, but that doesn't really mean much when in such limited supply of course.

AMD has been competing at price points for a long time, even against maxwell (not so much the 980/980ti though)

Nvidia propaganda has spread far and wide, and thanks to good products, a good driver team, excellent shills, and marketing, it's going to be a hard act to follow regardless of having competitive products.


----------



## Liranan

Quote:


> Originally Posted by *tconroy135*
> 
> I think the real issue is why do release a new product without a solid driver configuration for said product. Imagine if the Fury X was beating the Titan X (even without boost) by 2-5% it would be a boon for AMD. Instead they release a product that can't even be overclocked or have voltage added to it properly.


nVidia have done this often enough too so I blame both sides for shoddy release drivers.


----------



## iSlayer

Nvidia and AMD have been dropping the ball in drivers as of late. Slow on the AMD side, catastrophic problems on the Nvidia side.
Quote:


> Originally Posted by *iLeakStuff*
> 
> 15%+ performance? Yeah right.
> 
> If you are referring to the latest driver it improve CPU overhead. Nothing to do with the GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The GPU score from Firestrike improved 1% from the last driver.
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.reddit.com/r/AdvancedMicroDevices/comments/3cmnl5/catalyst_157_overhead_improvements_in_dx11_over/csx3emk


Uh, Nvidia saw ~10% boosts from reducing CPU overhead in DX11. Remember 337.52?
Quote:


> Originally Posted by *Redwoodz*
> 
> I seem to remember reading the driver release notes with this clearly stated on the page
> 
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+8.1+-+64
> 
> And furthermore, EK has released the FuryX full cover waterblock.
> 
> http://www.techpowerup.com/214114/ek-radeon-r9-fury-x-water-blocks-now-available.html
> 
> Nvidia has no answer for this. Unrivaled performance in a single slot card,best multi-gpu performance.Freesync and crossfire all working with the latest 15.7 drivers released yesterday.No more pump noise,and when the Fury Pro hits it will be the way to go for multi-gpu set-ups for hardly nothing compared to the competition,in small-form factors even.


That is one beautiful looking card.

Looking forward to comprehensive testing on these improvements.


----------



## Thoth420

I have to agree with Casey. AMD does not make crap and I have most likely played with more gpus from both camps in the past years than most people in this thread.

I can list them if needed...


----------



## Blackops_2

I will say I had a crash while taking a full-length DAT yesterday, yet again on 353.xx, and I was already pissed. Need to just roll back. I probably average 3-5 crashes a week across both PCs.


----------



## blue1512

Quote:


> Originally Posted by *sugalumps*
> 
> Proof?


Wait..... You didn't care to read my post, right?
Quote:


> Originally Posted by *royfrosty*
> 
> BOOM!
> 
> This is awesome. The new drivers really worked.
> 
> Farcry 4 on 1440p with all Preset Ultra settings.
> 
> Note that it is just on stock clock. No OC, nor HBM OC.
> 
> 2015-07-09 22:44:11 - FarCry4
> Frames: 7056 - Time: 87953ms - Avg: 80.225 - Min: 62 - Max: 98
> 
> Before on 15.15 drivers.
> 
> 1440P with all Preset Ultra settings.
> 
> 2015-06-26 16:04:54 - FarCry4
> Frames: 5810 - Time: 88234ms - Avg: 65.848 - Min: 51 - Max: 95
> 
> EDIT: Note that the benchmark was taken during the first time when you met Sabal and he asked you to run through the doors to the truck. The benchmark ends when the truck was hit and went off the cliff.


Here is where he first posted his number. This guy is the one who released the first benchmark of FuryX.
http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633-59.html


----------



## Thoth420

Quote:


> Originally Posted by *Blackops_2*
> 
> I will say i had a crash while taking a full length DAT yesterday yet again from 353.xx and i was already pissed. Need to just roll back. I probably average 3-5 crashes a week on both PCs.


Nvidia drivers have been so fail lately. I was using a wicked old one with my 780 Ti, as it was netting the best performance and stability. The GTA/Witcher branch wasn't even finished, yet somehow they all managed to pass cert.

Since they stopped releasing betas on their site and started pushing drivers through GeForce Experience (installed or not, just as a point-in-time reference), and started with these "Game NOT Ready" drivers, the quality has basically equated to that of release-day games.

People bash AMD for not having a driver for a broken port on day 1, as if that is a sensible way to spend time... I don't even play games until they're patched to final. Patience has proven a virtue.


----------



## Blackops_2

Quote:


> Originally Posted by *Thoth420*
> 
> Nvidia drivers have been so fail lately. I was using a wicked old one with my 780ti as it was netting best performance and stability. The Gta/witcher branch wasn't even finished yet somehow managed to all pass cert.
> 
> When they stopped releasing betas on their site and started with geforce experience (installed or not just giving a point in time reference) and this game NOT ready driver the quality has basically equated to release day games.
> 
> People bash AMD for not having a driver for a broken port day 1 as if that is a sensible way to spend time...I don't even play games til they are patched to final. Patience has proven a virtue.


I had never had a problem with either until now. I don't recall ever having an issue with my 7970 or 4890, or any of my previous Nvidia GPUs except now. Which is quite annoying honestly.


----------



## Thoth420

Quote:


> Originally Posted by *Blackops_2*
> 
> I had never had a problem with either until now. I don't recall ever having an issue with my 7970 or 4890, or any of my previous Nvidia GPUs except now. Which is quite annoying honestly.


I can imagine. Nvidia is touted for their superior drivers, but lately they can't even manage stability... if it's not stable, performance is 0 in my book.


----------



## gamervivek

Amusing to watch the previous couple of pages.

Someone says Fury improves 15%, when the guy who benchmarked clarifies that it doesn't.









Someone posts benches of 3xx cards to disprove that Fury's scores improved.









And then someone posts an increase in Firestrike of 5%, except the CPU is overclocked 200 MHz higher in the new-driver test.









A 20% boost with just a driver update? I'll keep my fingers crossed, as long as he's 100% sure he didn't forget to turn on some setting this time around.


----------



## Redwoodz

Quote:


> Originally Posted by *mcg75*
> 
> Where do you see 15% performance improvements on the release notes?
> 
> And no, making crossfire profiles work doesn't count.


Doesn't say 15%, but 10% in Tomb Raider and 7% in Far Cry. Results will vary, but it proves the driver improvements are there, and substantial.


----------



## sugalumps

Quote:


> Originally Posted by *blue1512*
> 
> Wait..... You didn't care to read my post, right?
> Here is where he first posted his number. This guy is the one who released the first benchmark of FuryX.
> http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633-59.html


Again, proof?

Not someone saying they're getting some random percentage better. You can't just claim anything and expect everyone to believe it; give me screenshots of matching settings before and after, then screenshot/video proof of both benchmark runs to show the difference.

Also, the person you are quoting: "*roy*frosty"

I can guess who that is LOL


----------



## blue1512

Quote:


> Originally Posted by *sugalumps*
> 
> Again, proof?
> 
> Not someone saying they are getting a random number percentage better. You cant just claim anything and expect everyone to believe it, *give me screen shots of the settings before and after matching. Then screenshots/video proof of the benchmark run both before and after to see the difference*.
> 
> Also person you are quoting - "*roy*frosty"
> 
> I can guess who that is LOL


Pay me and I will do it for you. Am I that bored???


----------



## Redwoodz

Quote:


> Originally Posted by *sugalumps*
> 
> Again, proof?
> 
> Not someone saying they are getting a random number percentage better. You cant just claim anything and expect everyone to believe it, give me screen shots of the settings before and after matching. Then screenshots/video proof of the benchmark run both before and after to see the difference.
> 
> Also person you are quoting - "*roy*frosty"
> 
> I can guess who that is LOL


The drivers came out yesterday. I'm absolutely positive you will have all the proof you need in the next few days.


----------



## GorillaSceptre

Quote:


> Originally Posted by *gamervivek*
> 
> Amusing to watch the previous couple of pages.
> 
> Someone says Fury improves 15%, when the guy who benchmarked clarifies that it doesn't.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Someone posts benches of 3xx cards to disprove that Fury's scores improved.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And then someone posts an increase in Firestrike of 5%, except the CPU is overclocked 200 MHz more in the new driver test
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A 20% boost with just a driver update, I'll keep my fingers crossed unless he is 100% sure he didn't forget to turn on some setting this time around.


This.


----------



## Ha-Nocri

Well, only ~4 more days till Fury, when all reviewers will use 15.7 drivers, so we'll know for sure. If they don't use the drivers, I won't even bother reading the reviews.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Darkwizzie*
> 
> I've never seen above 60hz before.


Oh boy...


----------



## Forceman

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, only ~4 more days till Fury, when all reviewers will use 15.7 drivers, so we'll know for sure. If they don't use the drivers, I won't even bother reading the reviews.


New rumor is that Fury is launching tomorrow.

http://videocardz.com/57177/amd-radeon-r9-fury-launches-tomorrow


----------



## PostalTwinkie

Quote:


> Originally Posted by *Thoth420*
> 
> I can imagine. Nvidia is touted for their superior drivers, but lately they can't even manage stability... if it's not stable, performance is 0 in my book.


I haven't had stability issues.............


----------



## mcg75

Quote:


> Originally Posted by *Redwoodz*
> 
> Doesn't say 15%, but 10% in Tomb Raider and 7% in Far Cry. Results will vary, but it proves driver improvements are there, and substantial.


Nobody disputed there were improvements.

But just as it's not fair to downplay them as nothing, it's also not fair to overplay them as something they aren't.

Besides, if they can wring out 5% in each of the next few drivers, that speaks for itself as a great improvement.


----------



## fcman

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I haven't had stability issues.............


People will make up whatever they can to justify a $650 purchase.


----------



## Ganf

Quote:


> Originally Posted by *fcman*
> 
> People will make up whatever they can to justify a $650 purchase.




Just sayin'.....


----------



## fcman

Quote:


> Originally Posted by *Ganf*
> 
> 
> 
> Just sayin'.....


Not sure what you're trying to prove. I never felt the need to talk badly and make up lies about AMD after I bought my card. In fact, I am hoping the Fury X ends up getting better with maturity. I found the current nVidia lineup a bit underwhelming other than the 980ti (and only because of the 980ti's price).


----------



## Ceadderman

AMD has never put out a fully capable driver at launch. They want to get their cards in the hands of their buyers first. Anyone with any knowledge of this will wait patiently for the cards to fill out the market and for better results to come. Shoot, Fury X hasn't been on the market but a little under a month and here we go, a new, better driver.









~Ceadder


----------



## Ganf

Quote:


> Originally Posted by *fcman*
> 
> Never felt the need to talk badly and make up lies about AMD after I bought my card. In fact I am hoping the Fury X ends up getting better with maturity. I found the current nVidia lineup a bit underwhelming other than the 980ti (and only because of the 980ti's price)


Ahh, but everybody is fair game when it comes to drivers, because you don't have to talk about how there are hundreds of thousands of user configurations and millions of possible software conflicts that could be contributing to any *perceived* problems.









I try... I don't always succeed but I *try* to stay focused on what the two companies are doing right with their drivers instead of peeking over the fence to gloat over what the other guy is getting wrong. That's really the only way to be an adult about it unless you're using both brands in similar setups daily.


----------



## fcman

Quote:


> Originally Posted by *Ganf*
> 
> Ahh, but everybody is fair game when it comes to drivers, because you don't have to talk about how there are hundreds of thousands of user configurations and millions of possible software conflicts that could be contributing to any *perceived* problems.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I try... I don't always succeed but I *try* to stay focused on what the two companies are doing right with their drivers instead of peeking over the fence to gloat over what the other guy is getting wrong. That's really the only way to be an adult about it unless you're using both brands in similar setups daily.


I agree, and I won't claim that nVidia is perfect. I have been having some issues with my Titan for the past couple weeks that I believe are driver related. But Thoth420 makes it seem like nVidia drivers are in shambles, which is utterly ridiculous. For the most part they are extraordinarily stable, with a few bugs here and there.


----------



## Cakewalk_S

I wouldn't have expected this so soon but it now feels like all other cards launching without HBM are just old school tech... Obviously AMD can't match NVIDIA in their chip design, but this HBM thing is really cool. I can't wait for Nvidia to take advantage of this tech. And with AMD on the fritz...who knows how much longer they'll be around.


----------



## iinversion

Even if the Fury X was faster than the Titan X I would not be interested.

Until AMD can match or beat Nvidia on CPU overhead, I couldn't care less. Good to see they got a slight improvement in that regard with this driver, but there's still a long way to go.


----------



## iSlayer

Quote:


> Originally Posted by *Thoth420*
> 
> I can imagine. Nvidia is touted for their superior drivers, but lately they can't even manage stability... if it's not stable, performance is 0 in my book.


"A program that produces incorrect results twice as fast is infinitely slower."


----------



## sugalumps

Quote:


> Originally Posted by *blue1512*
> 
> Pay me and I'll do it for you. Am I that bored???


So you have zero proof atm is what you are saying?


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ganf*
> 
> Ahh, but everybody is fair game when it comes to drivers, because you don't have to talk about how there are hundreds of thousands of user configurations and millions of possible software conflicts that could be contributing to any *perceived* problems.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I try... I don't always succeed but I *try* to stay focused on what the two companies are doing right with their drivers instead of peeking over the fence to gloat over what the other guy is getting wrong. That's really the only way to be an adult about it unless you're using both brands in similar setups daily.


I like to just take down the fence and look them both in the face.


----------



## Ganf

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I like to just take down the fence and look them both in the face.


It would help if you didn't do this in the buff. And please stop scratching there....


----------



## knightsilver

Any reviews on the R9 Nano yet?


----------



## Forceman

Quote:


> Originally Posted by *knightsilver*
> 
> Any reviews on the R9 Nano yet?


Not even a release date yet.


----------



## Thoth420

Quote:


> Originally Posted by *Ganf*
> 
> 
> 
> Just sayin'.....












Pretty sure the Witcher driver was causing BSODs in something as light as Chrome with hardware acceleration on, and Nvidia confirmed the bug themselves. Glad to hear you are stable...

Also I don't need to justify my purchase especially since it was a gift. If I did have to justify it I would say that well....I paid the same for a 780Ti last year and just look at the staying power that the ....oh wait......









The original Titan holds its value and performance pretty well... I didn't have enough for a $1000 GPU then or now, though.
I just think for what I spent I should have gotten another year, but with new lines rolling out annually, I guess it just isn't in the cards to expect that.


----------



## SKYMTL

Quote:


> Originally Posted by *Redwoodz*
> 
> Doesn't say 15%, but 10% in Tomb Raider and 7% in Far Cry. Results will vary, but it proves driver improvements are there, and substantial.


Just to clarify before this gets out of hand.

The performance improvements are for 200-series cards versus the Omega driver.

After spending all night testing on Fury and Fury X, there is NO DIFFERENCE between the 15.15 and 15.7. As a matter of fact, 15.15 was perfectly stable while with 15.7 I experienced random Windows 8.1 boot locks and crashes. Witcher actually sees a ~5% performance reduction without Hairworks on, while Far Cry 4 sees a slight decrease as well.

It was a complete waste of time. Having this driver come out right before the Fury launch thus giving reviewers very little time to react is 100% marketing. It sows doubt about methodologies and will prompt the usual "well THAT'S why XXX performs so horribly, they used the wrong driver!".


----------



## Sheyster

Quote:


> Originally Posted by *SKYMTL*
> 
> Just to clarify before this gets out of hand.
> 
> The performance improvements are for 200-series cards versus the Omega driver.
> 
> After spending all night testing on Fury and Fury X, there is NO DIFFERENCE between the 15.15 and 15.7. As a matter of fact, 15.15 was perfectly stable while with 15.7 I experienced random Windows 8.1 boot locks and crashes. Witcher actually sees a ~5% performance reduction without Hairworks on, while Far Cry 4 sees a slight decrease as well.
> 
> It was a complete waste of time. Having this driver come out right before the Fury launch thus giving reviewers very little time to react is 100% marketing. It sows doubt about methodologies and will prompt the usual "well THAT'S why XXX performs so horribly, they used the wrong driver!".


Why am I not surprised?







Desperate times call for desperate measures...

Note to self:


----------



## BoredErica

Quote:


> Originally Posted by *Redwoodz*
> 
> Doesn't say 15%, but 10% in Tomb Raider and 7% in Far Cry. Results will vary, but it proves driver improvements are there, and substantial.


Since when are we trusting the performance improvement numbers from driver release notes? As far as I know, both sides blow up the numbers to unrealistic levels.


----------



## sugalumps

Quote:


> Originally Posted by *SKYMTL*
> 
> Just to clarify before this gets out of hand.
> 
> The performance improvements are for 200-series cards versus the Omega driver.
> 
> After spending all night testing on Fury and Fury X, there is NO DIFFERENCE between the 15.15 and 15.7. As a matter of fact, 15.15 was perfectly stable while with 15.7 I experienced random Windows 8.1 boot locks and crashes. Witcher actually sees a ~5% performance reduction without Hairworks on, while Far Cry 4 sees a slight decrease as well.
> 
> It was a complete waste of time. Having this driver come out right before the Fury launch thus giving reviewers very little time to react is 100% marketing. It sows doubt about methodologies and will prompt the usual "well THAT'S why XXX performs so horribly, they used the wrong driver!".


Why am I not surprised. Not a single bit of proof was given for the original claim. I think it's hilarious that the forum member blue quoted for the "15%" improvement is called Roy. I can take a very easy guess at who that is.


----------



## tconroy135

Quote:


> Originally Posted by *sugalumps*
> 
> Why am I not surprised. Not a single bit of proof was given for the original claim. I think it's hilarious that the forum member blue quoted for the "15%" improvement is called Roy. I can take a very easy guess at who that is.


Bruce Willis a.k.a John McClain from Die Hard?


----------



## BoredErica

Quote:


> Originally Posted by *blue1512*
> 
> Pay me and I'll do it for you. Am I that bored???


If you're going to claim something huge like 15%, you need to bring good evidence to back up that claim. Something like "pay me and I'll show you the proof" is just dumb and it looks very bad on you if you have to resort to that kind of comment.

Quote:



> Originally Posted by *sugalumps*
> 
> Again, proof?
> 
> Not someone saying they are getting a random number percentage better. You can't just claim anything and expect everyone to believe it, give me screenshots of the settings before and after matching. Then screenshots/video proof of the benchmark run both before and after to see the difference.
> 
> Also person you are quoting - "*roy*frosty"
> 
> I can guess who that is LOL


I did a little digging and I'm not 100% sure RoyFrost is AMD Roy.

http://www.overclock.net/t/1561790/hi-ocn-im-roy-from-singapore/0_100#post_24081178

Look at this guy. ^

Also has reviewed an Nvidia card.

Also saw a post from RoyFrosty on a Singapore forum, which matches where RoyFrosty says he is from in his intro thread on OCN.

http://boardreader.com/thread/GPU_Review_MSI_GTX960_Gaming_Red_Dragon_puua8X302bg.html

AMD's Roy isn't from Singapore and doesn't live in Singapore.

Well, here are his posts on OCN:

http://www.overclock.net/forums/posts/by_user/id/464806


----------



## Blackops_2

Quote:


> Originally Posted by *tconroy135*
> 
> Bruce Willis a.k.a John McClain from Die Hard?


LOL

Well that sucks


----------



## Redwoodz

Quote:


> Originally Posted by *SKYMTL*
> 
> Just to clarify before this gets out of hand.
> 
> The performance improvements are for 200-series cards versus the Omega driver.
> 
> After spending all night testing on Fury and Fury X, there is NO DIFFERENCE between the 15.15 and 15.7. As a matter of fact, 15.15 was perfectly stable while with 15.7 I experienced random Windows 8.1 boot locks and crashes. Witcher actually sees a ~5% performance reduction without Hairworks on, while Far Cry 4 sees a slight decrease as well.
> 
> It was a complete waste of time. Having this driver come out right before the Fury launch thus giving reviewers very little time to react is 100% marketing. It sows doubt about methodologies and will prompt the usual "well THAT'S why XXX performs so horribly, they used the wrong driver!".


So where is your "PROOF"?







The release notes clearly state improvements for R7 and R9 series 200 and up.

Quote:


> Originally Posted by *Darkwizzie*
> 
> Since when are we trusting the performance improvement numbers from driver release notes? As far as I know, both sides blow up the numbers to unrealistic levels.


Quote:


> Originally Posted by *mcg75*
> 
> Nobody disputed there were improvements.
> 
> But just as it's not fair to downplay them as nothing, it's also not fair to overplay them as something they aren't.
> 
> Besides, if they can wring out 5% in each of the next few drivers, that speaks for itself as a great improvement.


Quote:


> Originally Posted by *sugalumps*
> 
> Why am I not surprised. Not a single bit of proof was given for the original claim. I think it's hilarious that the forum member blue quoted for the "15%" improvement is called Roy. I can take a very easy guess at who that is.


Why is everyone all hyped up over performance increase claims? Like I stated before, the driver came out yesterday; we will all know the exact extent in a few days. Chill!


----------



## royfrosty

Quote:


> Originally Posted by *sugalumps*
> 
> Again, proof?
> 
> Not someone saying they are getting a random number percentage better. You can't just claim anything and expect everyone to believe it, give me screenshots of the settings before and after matching. Then screenshots/video proof of the benchmark run both before and after to see the difference.
> 
> Also person you are quoting - "*roy*frosty"
> 
> I can guess who that is LOL


It's totally fine if you do not believe me. I do not have much rep here, unlike on HWZ in Singapore.

I posted some screenshots back then, during the first few days after I got my card.

Note that there isn't any built-in benchmark tool in FC4, so there are many ways one could benchmark it.

But I chose to be consistent by benchmarking from the first time Sabal meets up with you and asks you to run for the truck, once you run out the door.

http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633.html#post95077300

Read the first 2 pages of my thread. That was my first finding.


----------



## Forceman

Quote:


> Originally Posted by *Redwoodz*
> 
> So where is your "PROOF"
> 
> 
> 
> 
> 
> 
> 
> The release notes clearly state improvements for R7 and R9 series 200 and up.


My guess is his proof will come in the Fury review he is probably finishing at this moment. The performance claims for the driver are clearly stated as being over the previous Omega drivers, so any gains the 15.15 (or any other beta drivers) made are already baked in.


----------



## SKYMTL

Quote:


> Originally Posted by *Forceman*
> 
> My guess is his proof will come in the Fury review he is probably finishing at this moment. The performance claims for the driver are clearly stated as being over the previous Omega drivers, so any gains the 15.15 (or any other beta drivers) made are already baked in.


Ding ding!









They've had six MONTHS since Omega. If there weren't any benefits in single GPU performance, I'd be really worried.


----------



## friend'scatdied

Kepler hasn't had any _benefits_ in single GPU performance over the past 6 months.









Hoping Fury non-X turns out OK.


----------



## SKYMTL

Quote:


> Originally Posted by *friend'scatdied*
> 
> Kepler hasn't had any _benefits_ in single GPU performance over the past 6 months.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping Fury non-X turns out OK.


I have a 780Ti in one of my systems. There were serious performance increases in GTAV with 350.12 I believe it was and in Witcher 3 with their launch-day driver. Not sure what you are talking about....


----------



## Sheyster

Quote:


> Originally Posted by *SKYMTL*
> 
> I have a 780Ti in one of my systems. There were serious performance increases in GTAV with 350.12 I believe it was and in Witcher 3 with their launch-day driver. Not sure what you are talking about....


Yep, this is indeed true. That driver improved Kepler performance in many other games too.


----------



## Liranan

Quote:


> Originally Posted by *SKYMTL*
> 
> I have a 780Ti in one of my systems. There were serious performance increases in GTAV with 350.12 I believe it was and in Witcher 3 with their launch-day driver. Not sure what you are talking about....


That only happened when it turned out that AMD's cards had improved in performance while nVidia's had stagnated.


----------



## BoredErica

PCper's review of 2 and 3 way scaling between 980ti and FuryX is up in case you don't know.






Surprisingly, the frame consistency of CrossFired Furies seems to match that of the 980ti. (Reading the article from the link I posted below shows that it's very game dependent though. I would say it's a 50/50 win for either setup, but both the FuryX and 980ti have very poor scaling in a select 1-2 games, so watch out.) I guess I was used to seeing the 295x2 with very bad frame consistency, so I was expecting Fury crossfire to be bad as well. With the chart at the bottom of each page for the title they are benchmarking, you can clearly see FPS for single, double, and triple GPU along with the scaling percentage. You can see how Fury's slower stock performance vs the stock 980ti compares with generally better scaling.

Much more info in their written article here:

http://www.pcper.com/reviews/Graphics-Cards/AMD-Fury-X-vs-NVIDIA-GTX-980-Ti-2-and-3-Way-Multi-GPU-Performance/


----------



## rt123

Wasn't AMD's XDMA CrossFire always better than Nvidia's SLI?


----------



## KyadCK

Quote:


> Originally Posted by *rt123*
> 
> Wasn't AMD's XDMA crossfire always better than Nvidia's SLi. ?


It was the fix for their >1600p problems, but not an nVidia killer. It has served them very well though.


----------



## Blackops_2

I wanted to hear a mortal kombat "fight" initiation lol.

Impressed by the frame variance. AMD has come a long way. With the way Fury scales, a multi-GPU setup would be nice, assuming they can get the drivers out in decent time.

Is Vega still running his Titans? Just curious.


----------



## Ceadderman

Nope he kicked them like a can down the street and bought a couple Fury X's.









I keed I keed









~Ceadder


----------



## Blackops_2

Quote:


> Originally Posted by *Ceadderman*
> 
> Nope he kicked them like a can down the street and bought a couple Fury X's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keed I keed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Get me all excited like that









I'd like to see the results he could put out on his rig with 4 Fury Xs.


----------



## BoredErica

I keep typing Furies and thinking about Furries, lol.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> PCper's review of 2 and 3 way scaling between 980ti and FuryX is up in case you don't know.
> 
> 
> 
> 
> 
> *Surprisingly the frame consistency of Crossfired Furies seem to match that of 980ti. (Reading the article from the link I posted below shows that it's very game dependent though. I would say it's a 50/50 win for either setup but both FuryX and 980ti have very poor scaling on a select 1-2 games so watch out.) I guess I was used to seeing the 295x2 with very bad frame consistency so I was expecting Fury crossfire to be bad as well*. With the chart at the bottom of each page for the title they are benchmarking you can clearly see FPS for single, double, and triple GPU along with scaling percentage. You can see how Fury's slower stock vs 980ti stock performance compares with generally better scaling.
> 
> Much more info in their written article here:
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Fury-X-vs-NVIDIA-GTX-980-Ti-2-and-3-Way-Multi-GPU-Performance/


Then you are entitled to your opinion; of course, the article you posted differs from your opinion.
Quote:


> But what about that direct AMD and NVIDIA comparisons? Despite what we might have expected going in, the AMD Radeon R9 Fury X actually scaled in CrossFire better than the NVIDIA GeForce GTX 980 Ti. This comes not only in terms of average frame rate increases, but also in lower frame time variances that result in a smoother gaming experience. In several cases the extra scalability demonstrated by the Fury X allowed its dual-GPU performance to surpass a pair of GTX 980 Ti cards even though in a single GPU configuration the GeForce card was the winner. GRID 2 at 4K is one example of this result as is Bioshock Infinite at 4K. And even in a game like Crysis 3 at 4K where we saw NVIDIA's card scale by a fantastic 84%, AMD's Fury X card scaled by 95%!


Too bad he used the old 15.15 drivers. The new drivers released 2 days ago (15.20, or CCC 15.7) list those games as having improved CrossFire support. It would be interesting to see if there are any results, negative or positive, with the new drivers.


----------



## Blackops_2

Quote:


> Originally Posted by *SKYMTL*
> 
> I have a 780Ti in one of my systems. There were serious performance increases in GTAV with 350.12 I believe it was and in Witcher 3 with their launch-day driver. Not sure what you are talking about....


Witcher 3 was so-so. I got better averages out of older drivers but had to put up with stuttering. 353.30 is worse than 353.06 for TW3 so far. Frames have dropped. I should've kept both my rigs at 350.12.


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> Then you are entitled to your opinion of course the article you posted differs from your opinion.


How so?


----------



## Slaughterem

Quote:


> the AMD Radeon R9 Fury X actually scaled in CrossFire better than the NVIDIA GeForce GTX 980 Ti. This comes not only in terms of average frame rate increases, but also in lower frame time variances that result in a smoother gaming experience.


And your opinion
Quote:


> I would say it's a 50/50 win for either setup


----------



## Thoth420

Quote:


> Originally Posted by *SKYMTL*
> 
> I have a 780Ti in one of my systems. There were serious performance increases in GTAV with 350.12 I believe it was and in Witcher 3 with their launch-day driver. Not sure what you are talking about....


And that driver crashes if your system is on too long. I would say google it but nvidia nukes its driver feedback threads as soon as a new one comes out.

There might be remnants in other threads; the issue was pretty widespread and even affected Titan X owners all the way down to old Fermis.


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> And your opinion


That is the conclusion of Ryan Shrout, which is his opinion. But looking at the data for 4k I would say it's 50-50. Both Ryan and I are looking at the same data. You have access to the same data, you can draw your own conclusions.



Spoiler: GTA V









In 1440p 980ti wins. In 4k Fury loses, by a lot. Possibly due to lack of vram? Look at the chart axis: The 4k one goes up to 50ms.



Spoiler: BF4









In 1440p, 980ti wins. In 4k 980ti wins.



Spoiler: Bioshock Infinite









In 1440p, Fury wins. In 4k, it's a little hard to tell at the top of that graph there, but I'm going to give the edge to the Fury looking at the 80 to 95 percentile timings.



Spoiler: Crysis 3









In 1440p, 980ti wins. In 4k, 980ti wins.



Spoiler: Grid 2









In 1440p, 980ti wins. In 4k, 980ti wins, but why the heck is the consistency with the single 980ti lower than sli 980ti lol. What the heck.



Spoiler: Metro









In 1440p, Fury wins. In 4k, Fury wins.

I tallied up the number of wins vs losses:

1440: Fury wins 2/6 cases.

4k: Fury wins 2/6 cases.

That's actually less than 50-50.

Metro Last Light is very good for AMD, very bad for Nvidia.

Grid is very good for Nvidia.

GTA V is very good for Nvidia, especially at 4k where the difference is massive.

The remaining 3 games are close enough for either party.
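If anyone wants to tally frame consistency from their own logs instead of eyeballing PCPer's charts, here's a rough Python sketch. Everything in it is my own stand-in: the one-frame-time-per-line log format and the 95th-percentile-minus-median metric are assumptions for illustration, not PCPer's actual FCAT pipeline.

```python
# Rough sketch: score frame-time consistency from a FRAPS/FCAT-style log
# (assumed format: one frame time in ms per line). The metric here --
# 95th-percentile frame time minus the median -- is a stand-in, not
# PCPer's actual methodology.
import statistics

def load_frametimes(path):
    """Read one frame time (ms) per line, skipping headers/blank lines."""
    times = []
    with open(path) as f:
        for line in f:
            try:
                times.append(float(line.strip()))
            except ValueError:
                continue  # header or junk line
    return times

def consistency(times_ms):
    """Gap between the 95th-percentile and median frame time, in ms.
    Smaller is smoother; spikes in the worst 5% of frames blow it up."""
    ordered = sorted(times_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return p95 - statistics.median(ordered)

# Dummy numbers: a steady 16.7 ms run vs. one with periodic 49 ms spikes.
steady = [16.7] * 100
spiky = [15.0] * 90 + [49.0] * 10
print(consistency(steady))  # 0.0
print(consistency(spiky))   # 34.0
```

The point of a metric like this is that two runs with the same average FPS can score very differently once a few spiky frames show up, which is exactly the single-vs-multi-GPU difference those charts are showing.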


----------



## Blackops_2

Quad-Fire

Vs

Quad-Sli

Sorry if this has already been posted.


----------



## Ceadderman

I figured with no Tess enabled Fury would show really well, since Tess is nVidia's bag. But wow that performance gap is huge. And Fury did a reasonable job against Ti with it enabled.

Too bad this was a Maingear ad.









~Ceadder


----------



## Slaughterem

Quote:


> That is the conclusion of Ryan Shrout, which is his opinion. But looking at the data for 4k I would say it's 50-50. Both Ryan and I are looking at the same data. You have access to the same data, you can draw your own conclusions.


That is Ryan stating the facts, and the facts indicate that Fury X CrossFire is better than 980 Ti SLI. His opinion comes later on. But all of this does not matter, because he used the 15.15 drivers and not the most current ones, 15.20. So there's no need to continue this discussion.


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> That is Ryan stating the facts. And the facts indicate that Fury X crossfire is better than 980TI SLI. His opinion comes later on. But all of this does not matter because he used 15.15 drivers and not the most current ones 15.20. So no need to continue this discussion.


Wow, I can't believe I just wasted my time carefully putting together my thought process and you just write it off with your biased talk.









I was talking about frame consistency, not overall which is better. I hope you actually read the part of the post you bolded in your quote. You provide no evidence to prove that I am wrong. You say I'm wrong, then just say 'and we're done'. Great. If that's how you're going to roll, keep it to yourself next time.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Thoth420*
> 
> And that driver crashes if your system is on too long. I would say google it but nvidia nukes its driver feedback threads as soon as a new one comes out.
> 
> Might be remnants on other threads the issue was pretty widespread and even affected titan x owners all the way down to old Fermis.


Lol!

Edit: Not even worth it.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> Wow, I can't believe I just wasted my time carefully putting together my thought process and you just write it off with your biased talk.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was talking about frame consistency, not overall which is better. You provide no evidence to prove that I am wrong. You say I'm wrong, then just say 'and we're done'.


There is no need to discuss with you about your opinion; you are entitled to your thought process. There is no bias from me, even though you would like to accuse me of this. You are talking about frame consistency and Ryan stated the facts. I will place his quote here again; you can have your opinion, and I can have the latitude to trust Ryan's facts over your opinion.
Quote:


> But what about that direct AMD and NVIDIA comparisons? Despite what we might have expected going in, the AMD Radeon R9 Fury X actually scaled in CrossFire better than the NVIDIA GeForce GTX 980 Ti. This comes not only in terms of average frame rate increases, but also in lower frame time variances that result in a smoother gaming experience. In several cases the extra scalability demonstrated by the Fury X allowed its dual-GPU performance to surpass a pair of GTX 980 Ti cards even though in a single GPU configuration the GeForce card was the winner. GRID 2 at 4K is one example of this result as is Bioshock Infinite at 4K. And even in a game like Crysis 3 at 4K where we saw NVIDIA's card scale by a fantastic 84%, AMD's Fury X card scaled by 95%!


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> There is no need to discuss with you about your opinion; you are entitled to your thought process. There is no bias from me, even though you would like to accuse me of this. You are talking about frame consistency and Ryan stated the facts. I will place his quote here again; you can have your opinion, and I can have the latitude to trust Ryan's facts over your opinion.


No. All you've done is say "Ryan said Fury had better frame consistency and that is a fact. What you said is your opinion. Therefore, you are wrong because it is just your opinion and Ryan is stating facts." You have yet to prove that's actually the case. I copied every relevant chart for you to show you on a case by case basis which has a better frame consistency. Either you are biased or you have a very, very skewed logic.

If you can point out where my thought process is incorrect based upon the data Ryan and I both look at, I'll be happy to change my mind. But this 'Ryan states facts, you state opinion because Ryan has authority' doesn't fly with me.

The data is in front of you. I don't understand why you rely on what Ryan feels as opposed to the data...

Also I want to highlight, once again, that I never claimed Fury Crossfire is worse or better or equal to 980ti as a whole or in terms of fps. I was talking about frame consistency.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> Wow, I can't believe I just wasted my time carefully putting together my thought process and you just write it off with your biased talk.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was talking about frame consistency, not overall which is better. I hope you actually read the part of the post you bolded in your quote. You provide no evidence to prove that I am wrong. You *say I'm wrong, then just say 'and we're done'. Great. If that's how you're going to roll, keep it to yourself next time.*


Keep it to myself LMAO, typical childish response when one can't admit they are wrong about the facts in a review article.


----------



## BoredErica

Quote:


> Originally Posted by *Darkwizzie*
> 
> No. All you've done is say "Ryan said Fury had better frame consistency and that is a fact. What you said is your opinion. Therefore, you are wrong because it is just your opinion and Ryan is stating facts." You have yet to prove that's actually the case. I copied every relevant chart for you to show you on a case by case basis which has a better frame consistency. Either you are biased or you have a very, very skewed logic.
> 
> If you can point out where my thought process is incorrect based upon the data Ryan and I both look at, I'll be happy to change my mind. But this 'Ryan states facts, you state opinion because Ryan has authority' doesn't fly with me.
> 
> The data is in front of you. I don't understand why you rely on what Ryan feels as opposed to the data...
> 
> Also I want to highlight, once again, that I never claimed Fury Crossfire is worse or better or equal to 980ti as a whole or in terms of fps. I was talking about frame consistency.


Quote:


> Originally Posted by *Slaughterem*
> 
> Keep it to myself LMAO, typical child like response when one can't admit they are wrong about the facts in a review article.


Are you actually going to back up your claims with evidence? Argument from authority doesn't cut it, especially when I showed you the data.

If you don't show evidence, you've got nothing apart from your attitude.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> No. All you've done is say "Ryan said Fury had better frame consistency and that is a fact. What you said is your opinion. Therefore, you are wrong because it is just your opinion and Ryan is stating facts." You have yet to prove that's actually the case. I copied every relevant chart for you to show you on a case by case basis which has a better frame consistency. Either you are biased or you have a very, very skewed logic.
> 
> If you can point out where my thought process is incorrect based upon the data Ryan and I both look at, I'll be happy to change my mind. But this 'Ryan states facts, you state opinion because Ryan has authority' doesn't fly with me.
> 
> *The data is in front of you. I don't understand why you rely on what Ryan feels as opposed to the data...
> 
> Also I want to highlight, once again, that I never claimed Fury Crossfire is worse or better or equal to 980ti as a whole or in terms of fps. I was talking about frame consistency.
> *


So I should believe your opinion. I don't care if you change your mind. All I said is that the author of the article has a different opinion than you. I did not give you my opinion on the data, other than the fact he used 15.15 and not the latest drivers. So all your thought process and the data is moot anyway. Can we agree on the fact that he used old drivers? And consequently this does not reflect the performance of Fury X as of today?


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> So I should believe your opinion.


If I look more closely at the data than Ryan does, yes. Who cares where the opinion comes from? All that matters is whether the opinion matches up with the data.

Quote:


> I don't care if you change your mind.


Right back at you, because I know I won't.

Quote:


> All I said is that the author of the article has a different opinion than you.


You said multiple times that Ryan is factually correct, meaning I am factually incorrect because my statement contradicted his.

Quote:


> I did not give you my opinion on the data,


That's what I'm saying. You say I'm wrong, then refuse to look at the data to say why.

Quote:


> other than the fact he used 15.15 and not the latest drivers. So all your thought process and the data is moot anyway. Can we agree on the fact that he used old drivers? And consequently this does not reflect the performance of Fury X as of today?


That's actually a diversion. I'm asking you why you think I am wrong but Ryan is right. Whatever drivers he used is irrelevant. If it is true that the entire test Ryan did was irrelevant (IF I grant that, which I don't but let's ignore that), that still doesn't change whether your original position was correct or not.

If you go to such great lengths to dodge all responsibility for backing up your claims, from saying Ryan is just right because he's the head of PCPer and I'm just some random guy, to saying the entire thing doesn't matter, one really has to question just how much evidence you had in the first place.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> That's what I'm saying. You say I'm wrong, then refuse to look at the data to say why.
> *That's actually a diversion. I'm asking you why you think I am wrong but Ryan is right. Whatever drivers he used is irrelevant*.


Again, your opinion is different from Ryan's. If you feel that Ryan's opinion is wrong and you are correct, then fine, maybe take up your process and conclusions with him. My opinion sides with what Ryan concluded, and it does not matter why I believe this. A diversion? You can't even admit that drivers are relevant? Sad Sad


----------



## BoredErica

Quote:


> Originally Posted by *Slaughterem*
> 
> If you feel that Ryans opinion is wrong and you are correct then fine maybe take up your process and conclusions with him.


Wat?

Whether I complain to Ryan is my business. I'm talking to you, not Ryan. This is irrelevant. Ryan can try to justify his position on his own time, but you have to justify yours. This is, again, a diversion away from your burden of proof.

Quote:


> My opinion would be towards what Ryan concluded, and *it does not matter why I believe this.*


Actually it does. It's called a rational conversation, forming an opinion based upon reason and evidence. Both the conclusion and the way the conclusion was reached matters. I wouldn't care if you didn't just label me wrong outright.

Quote:


> A diversion? You can't even admit that drivers are relevant? Sad Sad


More diversion.

So you say I'm wrong, then I clearly list the evidence piece by piece for you, then when I press you for evidence, you handwave by talking about asking Ryan instead when I'm talking to you, not Ryan, and then talk about whether the conclusion matters or not instead of addressing whether it is correct.

You're a loonie, I'm done with you. Welcome to the block list.


----------



## iinversion

Quote:


> Originally Posted by *Slaughterem*
> 
> Again your opinion is different from Ryans. If you feel that Ryans opinion is wrong and you are correct then fine maybe take up your process and conclusions with him. My opinion would be towards what Ryan concluded, and it does not matter why I believe this. A diversion? You can't even admit that drivers are relevant? Sad Sad


Dude, the proof is right in front of you, how can you not see it? Opinions don't matter. The *facts* based on that article are:

In 4 of 6 tests the 980 Ti has better frame consistency than the Fury X. That is not an opinion.

Yes, it was not tested on the newest driver, but you cannot assume that it would be better on newer drivers. It could be exactly the same as far as frame consistency is concerned, for all you know.


----------



## Slaughterem

Quote:


> Originally Posted by *Darkwizzie*
> 
> Wat?
> Whether I complain to Ryan is my business. I'm talking to you, not Ryan. This is irrelevant. Ryan can try to justify his position, but you have to justify yours. This is again, a diversion away from your burden of proof.
> Actually it does. It's called a rational conversation, forming an opinion based upon reason and evidence. Both the conclusion and the way the conclusion was reached matters. I wouldn't care if you didn't just label me wrong outright.
> 
> So you say I'm wrong, then I clearly list the evidence piece by piece for you, then when I press you for evidence, you handwave by talking about asking Ryan instead when I'm talking to you, not Ryan, and then talk about whether the conclusion matters or not instead of addressing whether it is correct.
> 
> *You're a loonie, I'm done with you*.


That is a rational conversation, where you call me a loonie? Name-calling is childish. Obviously you do care that I think you are wrong, and you will go to no end to make me change my mind by saying "justify your position." I do not have a need or desire to justify myself to you. If that bothers you, then too bad LOL.


----------



## Blameless

Quote:


> Originally Posted by *Darkwizzie*
> 
> In 1440p 980ti wins. In 4k Fury loses, by a lot. Possibly due to lack of vram?


Either a lack of VRAM, or a side effect of AMD's drivers being too aggressive about caching and paging out data.

I noticed the 15.15 drivers on my Hawaii parts had noticeably worse consistency in some tests, unless the test was allowed to loop several times before being benched.
Quote:


> Originally Posted by *Darkwizzie*
> 
> In 4k, 980ti wins, but why the heck is the consistency with the single 980ti lower than sli 980ti lol. What the heck.


This is not unheard of. Presumably both NVIDIA and AMD have fairly aggressive frame pacing mechanisms in place for SLI/CFX which likely aren't active in single card setups.
Quote:


> Originally Posted by *Darkwizzie*
> 
> Metro Last Light is very good for AMD, very bad for Nvidia.
> Grid is very good for Nvidia.
> GTA V is very good for Nvidia, especially at 4k where the difference is massive.
> The rest of the 3 games are close enough ish for either party.


Very few people are going to notice variance below about 10% of total frame time, and only the worst titles are likely to have noticeable issues.


----------



## BoredErica

Quote:


> Originally Posted by *Blameless*
> 
> Either lack of VRAM, or a side effect of AMDs drivers being too aggressive about caching and paging out data.


You probably already know this, but PCPer's 4K GTA V tests seem to favor the 980 Ti more than the Fury, with the Fury having a very nasty FPS dip a bit after midway through their benchmark. It happens every time at 4K. I believe Ryan attributed it to VRAM, but I'm not certain he said that.

Quote:


> I noticed the 15.15 drivers on my Hawaii parts had noticeably worse consistency in some tests, unless the test was allowed to loop several times before being benched.


Interesting, never heard of that.

Quote:


> This is not unheard of. Presumably both NVIDIA and AMD have fairly aggressive frame pacing mechanisms in place for SLI/CFX which likely aren't active in single card setups.


I was worried that their GRID 2 benchmark was screwy somehow because of that. I guess it's nothing then.

Quote:


> Very few people are going to notice variance below about 10% of total frame time, and only the worst titles are likely to have noticeable issues.


Hmm... I'm not 100% sure what that means, can you explain?

So at higher framerates, a higher frame consistency is required for optimal smoothness. So by 10% of total frame time... 60 fps means each frame optimally takes 16.67 ms to draw. So a 10% deviation would mean 1.67 ms difference? Then looking at every single graph, aren't all of the charts showing over 4ms variance in the 90-99th percentile?
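The arithmetic above, as a tiny sketch (the 10% figure is Blameless's rule of thumb from earlier in the thread, not an established standard):

```python
# At 60 FPS each frame has a budget of 1000/60 ~= 16.67 ms,
# so a 10% deviation works out to ~1.67 ms.

def frame_time_ms(fps: float) -> float:
    """Average time budget per frame, in milliseconds."""
    return 1000.0 / fps

def variance_threshold_ms(fps: float, fraction: float = 0.10) -> float:
    """Variance considered imperceptible under the ~10% rule of thumb."""
    return frame_time_ms(fps) * fraction

print(round(frame_time_ms(60), 2))          # 16.67
print(round(variance_threshold_ms(60), 2))  # 1.67
```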

If you actually look at the PCper link I linked with the Fury vs 980ti multi-gpu scaling article, you can actually see me trying to ask Ryan this in the comments below.


----------



## Blameless

Quote:


> Originally Posted by *Darkwizzie*
> 
> So by 10% of total frame time... 60 fps means each frame optimally takes 16.67 ms to draw. So a 10% deviation would mean 1.67 ms difference?


That's my rule of thumb, yes.

I'm pretty sensitive to erratic frame times, and I generally find it impossible to perceive deviations this small, even unconsciously.
Quote:


> Originally Posted by *Darkwizzie*
> 
> Then looking at every single graph, aren't all of the charts showing over 4ms variance in the 90-99th percentile?


At some points the variance will be high, which usually occurs during scene changes, asset loading/streaming, or near minimum frame rates. Some of this is inevitable and can generally be ignored.

Real issues show up when you have high frame variance where there otherwise should not be any, or persistently inconsistent frame times. 2ms variance at the 50th percentile is much more worrying than 20ms at the 99th.


----------



## BoredErica

Quote:


> Originally Posted by *Blameless*
> 
> Real issues show up when you have high frame variance where it otherwise should not be, or persistently inconsistent in frame times. 2ms variance at the 50th percentile is much more worrying than 20ms at the 99th.


Oh, I looked at the frame consistency graphs only at the very end because I thought only the extremes are where it mattered. I will put more emphasis at the 50th percentile in the future, thanks.


----------



## Themisseble

http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html?hootPostID=644d57954978e4c605d0509f09f43555&start=14

The R9 Fury X does really well against an overclocked GTX 980 Ti at 1600p.


----------



## Blameless

Quote:


> Originally Posted by *Darkwizzie*
> 
> Oh, I looked at the frame consistency graphs only at the very end because I thought only the extremes are where it mattered. I will put more emphasis at the 50th percentile in the future, thanks.


The percentile values in these graphs indicate how much variance a given portion of frames shows. If you see 2ms at the 95th percentile, it means that 95% of frames have 2ms or less frame variance.

The 50th percentile was just an example. Even the 80-90th percentile is relevant. However, the 99th percentile is pretty meaningless, as it covers only the worst 1% of frames.
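To make that reading concrete, here's a quick sketch of pulling a variance percentile out of a list of frame times (made-up numbers, and a deliberately simplified percentile pick):

```python
# "2 ms at the 95th percentile" means 95% of frames deviate by 2 ms or less.
# The frame times below are hypothetical; real tools use captured FRAPS/FCAT data.

def variance_at_percentile(frame_times_ms, pct):
    """Sort the absolute frame-to-frame deltas and pick the given percentile."""
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    idx = min(len(deltas) - 1, int(pct / 100 * len(deltas)))
    return deltas[idx]

times = [16.7, 16.9, 16.5, 17.0, 16.6, 25.0, 16.8]  # one nasty spike
print(variance_at_percentile(times, 50))  # typical frame: sub-millisecond delta
print(variance_at_percentile(times, 99))  # worst 1%: catches the 8+ ms spike
```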


----------



## Themisseble

4K - R9 Fury

Why does Fiji scale so badly?
http://www.bit-tech.net/hardware/graphics/2015/07/10/sapphire-r9-fury-tri-x/3

Look at the GTX 980, which is clocked higher than the GTX 980 Ti.
Then look at the R9 390X and Fury X/non-X. Is AMD having problems with the architecture, drivers, or maybe HBM or DX11?

For perfect scaling, the Fury X should be around 50% faster than the R9 390X, but its actual scaling is only 1.1726x over the 390X.

So poor.
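The "should be ~50% faster" figure follows from the shader counts (the Fury X's 4096 stream processors vs. the 390X's 2816); a back-of-envelope sketch that ignores clock differences:

```python
# Ideal scaling from shader counts alone, vs. the observed figure
# quoted in the post above (clock speeds assumed roughly equal).

fury_x_sps = 4096
r9_390x_sps = 2816

ideal_scaling = fury_x_sps / r9_390x_sps   # ~1.45x
observed_scaling = 1.1726                  # figure from the post above

print(round(ideal_scaling, 4))                                   # 1.4545
print(round(observed_scaling / ideal_scaling * 100, 1), "% of ideal")
```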


----------



## mtcn77

Quote:


> Originally Posted by *Darkwizzie*
> 
> That is the conclusion of Ryan Shrout, which is his opinion. But looking at the data for 4k I would say it's 50-50. Both Ryan and I are looking at the same data. You have access to the same data, you can draw your own conclusions.
> 
> 
> Spoiler: GTA V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p 980ti wins. In 4k Fury loses, by a lot. Possibly due to lack of vram? Look at the chart axis: The 4k one goes up to 50ms.
> 
> 
> Spoiler: BF4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p, 980ti wins. In 4k 980ti wins.
> 
> 
> Spoiler: Bioshock Infinite
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p, Fury wins. In 4k, it's a little hard to tell at the top of that graph there, but I'm going to give the edge to the Fury looking at the 80 to 95 percentile timings.
> 
> 
> Spoiler: Crysis 3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p, 980ti wins. In 4k, 980ti wins.
> 
> 
> Spoiler: Grid 2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p, 980ti wins. In 4k, 980ti wins, but why the heck is the consistency with the single 980ti lower than sli 980ti lol. What the heck.
> 
> 
> Spoiler: Metro
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 1440p, Fury wins. In 4k, Fury wins.
> 
> I tallied up the number of wins vs losses:
> 
> 1440: Fury wins 2/6 cases.
> 4k: Fury wins 2/6 cases.
> 
> That's actually less than 50-50.
> 
> Metro Last Light is very good for AMD, very bad for Nvidia.
> Grid is very good for Nvidia.
> GTA V is very good for Nvidia, especially at 4k where the difference is massive.
> The rest of the 3 games are close enough ish for either party.


Mind you, frame variance is the 'SD', and you need to consider the averages too: (mean ± SD), not just |SD|. Frame percentiles tell the whole story.


Spoiler: PcPer 2K & 4K SLI-CF benchmarks






Spoiler: Grand Theft Auto 5










Spoiler: Battlefield 4










Spoiler: Bioshock Infinite










Spoiler: Crysis 3










Spoiler: Grid 2










Spoiler: Metro Last Light


----------



## Themisseble

1 question - offtopic

HBM + APU: would you even need DDR to run an HSA APU with 8GB or more of HBM on it? How small could an APU socket be?


----------



## mtcn77

Quote:


> Originally Posted by *Themisseble*
> 
> I question - offtopic
> 
> HBM + APU - Would you even need DDR to run APU HSA with 8GB or more HBM on it? How small could APU socket be?


1 answer - offtopic

HBM + APU - 'insta' purchase. Look at the Fury, it looks just the same as the X. My inference is bandwidth helps in minimum frame rates and cores elevate the average frame rate threshold.


----------



## Themisseble

Quote:


> Originally Posted by *mtcn77*
> 
> 1 answer - offtopic
> 
> HBM + APU - 'insta' purchase. Look at the Fury, it looks just the same as the X. My inference is bandwidth helps in minimum frame rates and cores elevate the average frame rate threshold.


APUs are heavily bottlenecked by DDR3, which is only 22-30 GB/s...
I just wonder how small the package could be with a very powerful APU inside and no DDR RAM, only HBM. Maybe next-gen consoles will arrive without DDR3. And the efficiency, especially with 14nm and a 300-400mm² chip, could be a big deal for APU users.
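For context on those bandwidth numbers: peak theoretical bandwidth is just bus width times transfer rate. A quick sketch using spec-sheet figures (dual-channel DDR3-1866 and Fiji's 4-stack HBM1, not numbers from the post):

```python
# Peak theoretical memory bandwidth = (bus width in bits / 8) * transfer rate in GT/s.

def bandwidth_gbs(bus_bits: int, gigatransfers_per_s: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_bits / 8 * gigatransfers_per_s

print(bandwidth_gbs(128, 1.866))  # ~29.9 GB/s: dual-channel (2x64-bit) DDR3-1866
print(bandwidth_gbs(4096, 1.0))   # 512.0 GB/s: Fiji HBM1 at 1 Gbps per pin
```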


----------



## NuclearPeace

APUs are more bottlenecked by their bad memory controllers than by DDR3. Their IMCs are about half as fast as Intel's.


----------



## mtcn77

Quote:


> Originally Posted by *NuclearPeace*
> 
> APUs are more bottlenecked by their bad memory controllers than DDR3. Their IMCs are about half as fast as Intel's.


The Intel Iris Pro line employs an L4 cache. It is a point of contention: more GPU cores on APUs versus more cache on Iris Pro, I'm afraid.


----------



## Sheyster

Quote:


> Originally Posted by *Slaughterem*
> 
> That is Ryan stating the facts. And the facts indicate that Fury X crossfire is better than 980TI SLI. His opinion comes later on. But all of this does not matter because he used 15.15 drivers and not the most current ones 15.20. So no need to continue this discussion.


Great, so it's marginally better in some aspects. Too bad you have to wait god knows how long to get new CFX game support.









I'm the kind of guy who takes the day off work when an exciting new game is released. I have no interest in waiting X weeks/months for AMD to update their ish.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Themisseble*
> 
> I question - offtopic
> 
> HBM + APU - Would you even need DDR to run APU HSA with 8GB or more HBM on it? How small could APU socket be?


Yes, we will get to this point eventually. The socket might get a little bigger, but the overall footprint would come down.


----------



## Woundingchaney

Quote:


> Originally Posted by *fewness*
> 
> Is this a truly representative scenario for TitanX and 980Ti SLI users? The cards often throttle due to temps to 1000MHz in SLI?


I can't speak for all users, but with a custom fan curve my cards do not throttle. I run games and benchmarks solidly at 1350MHz on both my cards.


----------



## Xuper

Here :

AMD Radeon R9 Fury X on triple 4K displays(11,520x2160)


----------



## Thoth420

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Lol!
> 
> Edit: Not even worth it.


^Ironic since you felt the need to comment about not commenting.


----------



## Siezureboy

Guys, stop arguing, you're making the stock drop


----------



## BoredErica

Your phone's picture is huge.


----------



## iLeakStuff

HOLY CRAP. ^

AMD is just $0.03 from being worth the least in *AMD's entire history*











http://finance.yahoo.com/echarts?s=AMD+Interactive#{%22range%22:%22max%22,%22allowChartStacking%22:true}


----------



## Ceadderman

After market reports came in low, I'm reasonably sure that it's affected everyone, not just AMD.









~Ceadder


----------



## iLeakStuff

You are right, but AMD has fallen 25% over the last month.

Intel has fallen 10%, Nvidia 10%.
Both have begun to climb back up, but not AMD.


----------



## Xuper

Quote:


> Originally Posted by *iLeakStuff*
> 
> HOLY CRAP. ^
> 
> AMD is just $0.03 from being worth the least in *AMD`s entire history*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{%22range%22:%22max%22,%22allowChartStacking%22:true}


So what? Are you really worried?


----------



## Casey Ryback

Quote:


> Originally Posted by *iLeakStuff*
> 
> You are right but AMD have fallen 25% over the last month.
> 
> Intel fallen 10%
> Nvidia 10%
> Both have begun to climb up but not AMD


Give it a rest dude, stop being a shill.


----------



## Sycksyde

Quote:


> Originally Posted by *iLeakStuff*
> 
> HOLY CRAP. ^
> 
> AMD is just $0.03 from being worth the least in *AMD`s entire history*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{%22range%22:%22max%22,%22allowChartStacking%22:true}


Yep AMD is teetering on the edge of oblivion, no way would I buy an AMD card then be stuck with no driver support at all because the company has gone bust.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Sycksyde*
> 
> Yep AMD is teetering on the edge of oblivion, no way would I buy an AMD card then be stuck with no driver support at all because the company has gone bust.


AMD is still safe.


----------



## AKA1

I am pretty happy with my Fury X. I wish it clocked a little higher, but 1125/555 isn't bad at all. Yes, I did OC the memory. I want my voltage unlocked.


----------



## dir_d

Quote:


> Originally Posted by *Sycksyde*
> 
> Yep AMD is teetering on the edge of oblivion, no way would I buy an AMD card then be stuck with no driver support at all because the company has gone bust.


That is one of the most uneducated thoughts I have ever read.


----------



## PostalTwinkie

Quote:


> Originally Posted by *AKA1*
> 
> I am pretty happy with my fury x. I wish it clocked a little higher. But 1125/555 isn't bad at all. Yes I did OC the memory. I want my voltage unlocked


Did you see noticeable gains with the memory OC?


----------



## Xuper

As I wrote in my post on the AMD Fury Strix review: Link

Quite different from this. Maybe AMD locked overclocking for the Fury X? I remember AMD saying the VRM can supply over 400 amps.


----------



## sugalumps

So now that the dust has settled, how much did the new drivers improve performance? Is it 15-20% like some people (blue/roy) claimed?


----------



## iinversion

Quote:


> Originally Posted by *sugalumps*
> 
> So now that the dust has settled how much did the new drivers improve performance? Is it 15%-20% like some people(blue/roy) claimed?


It seems to be 1-3% more performance, but from what I read they are an unstable mess.


----------



## curlyp

I hope I'm not too far off topic, but as a previous Fury X owner I wanted to reach out to others who have this card and other AMD cards.

I originally purchased the Fury X day one, but was not fully satisfied with it over my GTX 970. I've been slowly trying to get back to an AMD card since I had the MSI Lightning 7970s and tired of Nvidia's "nickel-and-diming" (just my opinion).

I want to upgrade my graphics card for the new 4K monitor I'm ordering; however, I'm not sure which route to take. Others on this thread are not happy about the overall performance of the card and the fact that it cannot be overclocked yet.

So, here's what I'm thinking. Should I wait to purchase dual non-X Furys when released, dual 390X, dual 290X, a 295X2, or stick with Nvidia?

I'm hoping the non-X Fury will be a better bang for the buck than the Fury X was.

Thanks!


----------



## maltamonk

Quote:


> Originally Posted by *curlyp*
> 
> I hope I'm not too far off topic, but as a previous Fury X owner I wanted to reach out to others who have this card and other AMD cards.
> 
> I originally purchased the Fury X day one, but was not fully satisfied with it over my GTX 970. I've been slowly trying to get back to an AMD card since I had the MSI Lightning 7970s and tired of Nvidia's "nickel and dimming" (just my opinion).
> 
> I want to up grade my Gcard for my new 4K monitoring I'm ordering;however I'm not sure which route to take. Others on this thread are not happy about the overall performance of the card and the fact that it cannot be overclocked yet.
> 
> So, here's what I'm thinking. Should I wait to purchase dual non-Fury X when released, dual 390X, dual 290X, 295X2, or stick with Nvidia?
> 
> I'm hoping the non-Fury X will be a better bang for the buck than the Fury X was.
> 
> Thanks!


Best advice is to not be an early adopter. They will always suffer the most bugs, pay the most, and have the best chance for disappointment.


----------



## curlyp

Quote:


> Originally Posted by *maltamonk*
> 
> Best advice is to not be an early adopter. They will always suffer the most bugs, pay the most, and have the best chance for disappointment.


Thanks for the advice. With that said, which one of these do you recommend?

dual 390X, dual 290X, 295X2, or stick with Nvidia?


----------



## Tivan

Quote:


> Originally Posted by *sugalumps*
> 
> So now that the dust has settled how much did the new drivers improve performance? Is it 15%-20% like some people(blue/roy) claimed?


15.7 are the drivers that bring all the features/improvements to the 200 series that were previously on 15.15/the 300 series only.
Still no fix for GTA5, etc.


----------



## Noufel

Quote:


> Originally Posted by *curlyp*
> 
> I hope I'm not too far off topic, but as a previous Fury X owner I wanted to reach out to others who have this card and other AMD cards.
> 
> I originally purchased the Fury X day one, but was not fully satisfied with it over my GTX 970. I've been slowly trying to get back to an AMD card since I had the MSI Lightning 7970s and tired of Nvidia's "nickel and dimming" (just my opinion).
> 
> I want to up grade my Gcard for my new 4K monitoring I'm ordering;however I'm not sure which route to take. Others on this thread are not happy about the overall performance of the card and the fact that it cannot be overclocked yet.
> 
> So, here's what I'm thinking. Should I wait to purchase dual non-Fury X when released, dual 390X, dual 290X, 295X2, or stick with Nvidia?
> 
> I'm hoping the non-Fury X will be a better bang for the buck than the Fury X was.
> 
> Thanks!


Fury CFX isn't a bad idea at all, especially the Tri-X one.


----------



## Sheyster

Quote:


> Originally Posted by *maltamonk*
> 
> Best advice is to not be an early adopter. They will always suffer the most bugs, pay the most, and have the best chance for disappointment.


Not always. My Titan X experience has been pristine since day 1. I bought the card on release day.

IMHO the card to get right now (since the bios has now been "cracked") is the MSI 6G. The Strix 980 Ti and Classy seem to be getting mixed reviews.


----------



## NiteNinja

I do think this'll pave the way for a new cooling style for GPUs. Let's face it, air cooling your graphics card is long overdue for retirement.

And look how small the card actually is! Those like myself who use mini-ITX form factor PCs will rejoice in this new design.

I do feel that the Fury X is not really a "flagship" per se, as the 295X2 is still AMD's king card; the Fury X is more of a test to see how well this cooling design takes off.

I do hope that they can also shrink a dual-GPU card such as the 295X2 to fit in smaller form factor PCs using this new design. Just imagine bringing SLI/Crossfire performance in a small box to a convention or LAN party.


----------



## curlyp

Quote:


> Originally Posted by *Sheyster*
> 
> Not so all the time. My Titan X experience has been pristine since day 1. I bought the card on release day.
> 
> IMHO the card to get right now (since the bios has now been "cracked") is the MSI 6G. The Strix 980 Ti and Classy seem to be getting mixed reviews.


Thank you for your opinion.


----------



## Ceadderman

If you're gonna buy a GPU you *could* get a 290 cheaper. But the 390, while the same card on its surface, is optimized to perform better.

Personally, if you've got the money to wait it out, I would wait til the Fury X2 launches and then see where it's at performance-wise. The 390 will still be available, and the Fury X as well. So unless there is a tugging on your pockets of epic proportions, that's likely the way to go.









~Ceadder


----------



## Sheyster

Quote:


> Originally Posted by *Ceadderman*
> 
> Personally, if you've got the money to wait it out, I would wait til Fury x2 launches and then see where it's at performancewise. 390 will be available and fury X as well. So unless there is a tugging on your pockets of epic proportions that's likely the way to go.


This is coming from a previous AMD dual-GPU card owner, sold right here on OCN (see trade history).

The only way I would consider a dual AMD card is if AMD hires a new driver dev team. Otherwise, no thank you.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Sheyster*
> 
> This is coming for a previous AMD dual GPU card owner, sold right here on OCN (see trade history).
> 
> The only way I would consider a dual AMD card is if AMD hires a new driver Dev team. Otherwise, no thank you.


They added two new memory management engineers.


----------



## SpeedyVT

Quote:


> Originally Posted by *PostalTwinkie*
> 
> They added two new memory management engineers.


GREAT NEWS! They'll need it, as they need to make memory access wide, not fast. I don't even think the current driver is using the full bus width.


----------



## Ceadderman

Quote:


> Originally Posted by *Sheyster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Personally, if you've got the money to wait it out, I would wait til Fury x2 launches and then see where it's at performancewise. 390 will be available and fury X as well. So unless there is a tugging on your pockets of epic proportions that's likely the way to go.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is coming for a previous AMD dual GPU card owner, sold right here on OCN (see trade history).
> 
> The only way I would consider a dual AMD card is if AMD hires a new driver Dev team. Otherwise, no thank you.

Keyword: *wait*

But let's just go with where you went.

The Fury X as a single card is giving comparable performance against nVidia's top-tier cards, with the only minor issues being the pump whine and sometimes coil whine.

They're going to get those sorted out.

Now we *should* see a better Fury X2 as a result. It's essentially a new architecture, so I'm reasonably sure that the X2 is gonna evolve into an absolute *BEAST* of a card. And I really cannot wait to see the benches.

Being on a pair of 6870s, it's time for me to upgrade. But I am waiting to see what I can upgrade to, as well as hoarding my money for my build mod. At this point I could get an R9 390 Nitro or a Fury X. I couldn't care less about stock cooling; I'm going EK. But if the X2 turns out to be what I believe it is, then I will wait and get that when I can better afford it.









But, I did suggest to wait until we know what we have in front of us.









~Ceadder


----------



## SpeedyVT

Quote:


> Originally Posted by *Ceadderman*
> 
> Keyword: *wait*
> 
> But let's just go with where you went.
> 
> Fury X single card is giving reasonable performance comparisons against nVidia's top tier cards with only minor issues being the pump whine and sometimes coil whine.
> 
> They're going to get those sorted out.
> 
> Now we *should* see a better Fury x2 as a result. It's essentially new architecture so I'm reasonably sure that x2 is gonna evolve to be an absolute *BEAST* of a card. And I really cannot wait to see the benches.
> 
> Being on a pair of 6870s, it's time for me to upgrade. But I am waiting to see what I can upgrade to as well as hoarding my money for my build mod. At this point I could get r9 390 Nitro or Fury X. I could care less about stock cooling. I'm going EK. But if x2 turns out to be what I believe it is then I will wait and get that when I can better afford it.
> 
> But, I did suggest to wait until we know what we have in front of us.
> 
> ~Ceadder


Crossfire scalability is crazy good. Especially if you've got the processor to handle it.

I'm also getting the sense that the bus isn't being utilized properly with the current drivers. I think buying anything brand new right at launch is asking for punishment.


----------



## Ceadderman

Agreed. If one cannot wait for a couple months to see what we have going on here then 390 Nitro is my advice. It's newer and better than 290. And should Fury X2 be what I think it is and owner wishes to further upgrade, he won't take a bath in resale to offset that upgrade. But me I will wait.









~Ceadder


----------



## SpeedyVT

Quote:


> Originally Posted by *Ceadderman*
> 
> Agreed. If one cannot wait for a couple months to see what we have going on here then 390 Nitro is my advice. It's newer and better than 290. And should Fury X2 be what I think it is and owner wishes to further upgrade, he won't take a bath in resale to offset that upgrade. But me I will wait.
> 
> ~Ceadder


It's not like anything out today is exponentially better than yesterday's cards, at least not enough to warrant selling at their prices. Got my 290 last Christmas for $250, thanks Dad!


----------



## dmasteR

Quote:


> Originally Posted by *Ceadderman*
> 
> Keyword: *wait*
> 
> But let's just go with where you went.
> 
> Fury X single card is giving *reasonable performance comparisons* against nVidia's top tier cards with only minor issues being the pump whine and sometimes coil whine.
> 
> They're going to get those sorted out.
> 
> Now we *should* see a better Fury x2 as a result. It's essentially new architecture so I'm reasonably sure that x2 is gonna evolve to be an absolute *BEAST* of a card. And I really cannot wait to see the benches.
> 
> Being on a pair of 6870s, it's time for me to upgrade. But I am waiting to see what I can upgrade to as well as hoarding my money for my build mod. At this point I could get r9 390 Nitro or Fury X. I could care less about stock cooling. I'm going EK. But if x2 turns out to be what I believe it is then I will wait and get that when I can better afford it.
> 
> But, I did suggest to wait until we know what we have in front of us.
> 
> ~Ceadder


Yes, at 4K. 1080p/1440p is a different story for the vast majority of games, and those are the resolutions the vast majority also uses.


----------



## iLeakStuff

Quote:


> Originally Posted by *Sheyster*
> 
> This is coming for a previous AMD dual GPU card owner, sold right here on OCN (see trade history).
> 
> The only way I would consider a dual AMD card is if AMD hires a new driver Dev team. Otherwise, no thank you.


What's wrong with dual cards from AMD? The X2 is on my maybe-buy list.


----------



## Ceadderman

Quote:


> Originally Posted by *dmasteR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Keyword: *wait*
> 
> But let's just go with where you went.
> 
> Fury X single card is giving *reasonable performance comparisons* against nVidia's top tier cards with only minor issues being the pump whine and sometimes coil whine.
> 
> They're going to get those sorted out.
> 
> Now we *should* see a better Fury x2 as a result. It's essentially new architecture so I'm reasonably sure that x2 is gonna evolve to be an absolute *BEAST* of a card. And I really cannot wait to see the benches.
> 
> Being on a pair of 6870s, it's time for me to upgrade. But I am waiting to see what I can upgrade to as well as hoarding my money for my build mod. At this point I could get r9 390 Nitro or Fury X. I could care less about stock cooling. I'm going EK. But if x2 turns out to be what I believe it is then I will wait and get that when I can better afford it.
> 
> But, I did suggest to wait until we know what we have in front of us.
> 
> ~Ceadder
> 
> Yes at 4K. 1080p/1440p is a different story for the vast majority of games, which is what the vast majority also uses.

I *am* at 1080p atm. I've got too many irons in the fire to stress about 4K or 1440p. I plan to go 4K, but I upgrade one thing at a time.









~Ceadder


----------



## blue1512

So this is what happens when nVidia's features are turned off.

Wonder if there will be a review like this for the Fury X vs 980 Ti.


----------



## Orthello

Quote:


> Originally Posted by *blue1512*
> 
> So this is what happens when nVidia's features turned off
> Wonder if there would be a reviews like this for FuryX vs 980Ti


Although I like the eye candy, it's not known as Nvidia "Gimpworks" for nothing: excessive amounts of tessellation and the like make it hard even for me to justify having some of those features on. I turn off HairWorks in Witcher 3 for 3D Vision frame rates and hardly notice the visual loss. HairWorks seems to constantly jitter on objects, like the hair has a life of its own even after the creature is dead; this is particularly noticeable in Far Cry 4.

Always remember the Crysis 2 analysis: they had all this water super-tessellated, but the problem was you could not see the water, since roughly 80% of it was covered by the land (and there was no range culling). So yeah, sometimes you have to wonder why the load is put on the GPU when the benefits are minor.

HardOCP and a few others have pointed out that removing NV features gives a dramatic performance boost to Fury etc. HBAO+ is definitely worth it IMHO, but HairWorks is pretty meh at the moment.


----------



## curlyp

Quote:


> Originally Posted by *Ceadderman*
> 
> If you're gonna buy a GPU you *could* get 290 cheaper. But 390 while the same card on its surface is optimized to perform better.
> 
> Personally, if you've got the money to wait it out, I would wait til Fury x2 launches and then see where it's at performancewise. 390 will be available and fury X as well. So unless there is a tugging on your pockets of epic proportions that's likely the way to go.
> 
> ~Ceadder


Lol, I'm just itching to upgrade since the Fury X didn't pan out! Several people have commented on waiting, so I think I will. Is there a time frame for the Fury X2 to come out? I suspect it will be a dual-GPU card?
Quote:


> Originally Posted by *Ceadderman*
> 
> Keyword: *wait*
> 
> But let's just go with where you went.
> 
> Fury X single card is giving reasonable performance comparisons against nVidia's top tier cards with only minor issues being the pump whine and sometimes coil whine.
> 
> They're going to get those sorted out.
> 
> Now we *should* see a better Fury x2 as a result. It's essentially new architecture so I'm reasonably sure that x2 is gonna evolve to be an absolute *BEAST* of a card. And I really cannot wait to see the benches.
> 
> Being on a pair of 6870s, it's time for me to upgrade. But I am waiting to see what I can upgrade to as well as hoarding my money for my build mod. At this point I could get r9 390 Nitro or Fury X. I could care less about stock cooling. I'm going EK. But if x2 turns out to be what I believe it is then I will wait and get that when I can better afford it.
> 
> But, I did suggest to wait until we know what we have in front of us.
> 
> ~Ceadder


When I had the Fury X I did not experience any pump or coil whine. I must have been lucky!









Hopefully the Fury X2 pans out


----------



## Tojara

Quote:


> Originally Posted by *curlyp*
> 
> Lol, I'm just itching to upgrade since the Fury X didn't pan out! Several people have commented on waiting, so I think I will. Is there a time frame for the Fury X2 to come out? I suspect it will be a dual-GPU card?


That's the point.







Should be out by the end of the year, IIRC.


----------



## Ceadderman

Supposed to be out in November, in time for Black Friday, though I doubt anyone but Amazon will give a discount on it. If it's not out by then, it's likely December.









~Ceadder


----------



## alcal

I'm super late to the party on realizing the implications of this, but I think that by including a Gentle Typhoon, AMD is kind of showing that they do care about the enthusiast market. Those fans aren't cheap and I'm not sure how strong the supply chain is anymore, but the fact that they used the single most legendary radiator fan makes me think they still pay attention and are trying to cater to us.

They could have still had incredible thermals and performance with a less exotic fan, especially in benchmarks, but the GT is also a subjective silence king, since what noise it does make sits at a pitch you barely notice.


----------



## forthedisplay

They're Nidec fans; Scythe just bought and rebranded them. I'm not sure they ever really went out of production, but you couldn't just order one in a "fan, please" manner; the minimum order would be thousands at a time, Alibaba-style.

Afaik the R9 295X2 had an issue where the radiator couldn't keep the whole card cool enough over longer periods of use. Sure, that's a whole different heat load altogether, but I would think they've learned their lesson. The lower TDP helps, too.


----------



## friend'scatdied

Quote:


> Originally Posted by *alcal*
> 
> I'm super late to the party on realizing the implications of this, but I think that by including a Gentle Typhoon, AMD is kind of showing that they do care about the enthusiast market. Those fans aren't cheap and I'm not sure on how strong the supply chain is anymore, but the fact that they used the single most legendary radiator fan kind of makes me think that they still pay attention and they are trying to cater to us.


Then you open up the top plate and see the Cooler Master AIO.









Though they probably had the agreement with CM set up from the 295X2.


----------



## blue1512

Quote:


> Originally Posted by *friend'scatdied*
> 
> Then you open up the top plate and see the Cooler Master AIO.
> 
> Though they probably had the agreement with CM set up from the 295X2.


The 295X2 used an Asetek unit with a round block. The square block used in the Fury X is more space-efficient, but sadly it's CM's exclusive.


----------



## friend'scatdied

Quote:


> Originally Posted by *blue1512*
> 
> The 295X2 used an Asetek unit with a round block. The square block used in the Fury X is more space-efficient, but sadly it's CM's exclusive.


Oops. Mixed it up with the FX-9590, which I think used a CM AIO.


----------



## Sashimi

Major retailers here in Australia seem to have dropped prices on the Fury X.

Fury is also out, priced the same as the GTX 980 but beating it in most cases by 5-10%, albeit while using over 50% more power. I think that's a small issue unless you run your GPU at max load for long hours every day.

It seems AMD's latest range of GPUs is finally where it's supposed to be.
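To put that power gap in rough numbers, here's a quick back-of-the-envelope calc (the extra wattage, hours, and electricity price are all assumed figures for illustration, not measurements):

```python
# Rough yearly running-cost difference between two cards.
# All inputs below are illustrative assumptions, not measured figures.
def yearly_cost_difference(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for a card drawing `extra_watts` more."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# ~150 W extra under load, 3 h of gaming a day, $0.25/kWh:
print(round(yearly_cost_difference(150, 3, 0.25), 2))  # → 41.06
```

Call it ~$40 a year under those assumptions, which is why the power draw only really matters if the card sits at max load for long hours every day.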


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> Major retailers here in Australia seem to have dropped prices on the Fury X.
> 
> Fury is also out, priced the same as the GTX 980 but beating it in most cases by 5-10%, albeit while using over 50% more power. I think that's a small issue unless you run your GPU at max load for long hours every day.
> 
> It seems AMD's latest range of GPUs is finally where it's supposed to be.


What stores? I can't find it.

Checked PCCG/Umart and static ice search engine.

(Talking fury here, not X)


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> What stores? I can't find it.
> 
> Checked PCCG/Umart and static ice search engine.
> 
> (Talking fury here, not X)


Don't see Fury yet, but the RRP is apparently the same as the 980. From past history it usually translates pretty well into AUD at similar ratio.

The X is now at $999, as opposed to over $1050 when it was released. The 980 Ti remains at $1050-$1100.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sashimi*
> 
> Don't see Fury yet, but the RRP is apparently the same as the 980. From past history it usually translates pretty well into AUD at similar ratio.
> 
> The X is now at $999, as opposed to over $1050 when it was released. The 980 Ti remains at $1050-$1100.


Damn, so fury is here but it's not here?









Fury X was always $999 (well the ones with 2 year warranty ie sapphire/XFX)


----------



## Sashimi

Quote:


> Originally Posted by *Casey Ryback*
> 
> Damn, so fury is here but it's not here?
> 
> Fury X was always $999 (well the ones with 2 year warranty ie sapphire/XFX)


I probably wasn't quite up to date on prices then. I'm sure the Fury X first appeared at $1050 for pre-orders but dropped in price as soon as reviews came out showing it's not exactly at 980 Ti level in raw performance.

Still not seeing Fury today.


----------



## Ceadderman

Quote:


> Originally Posted by *Sashimi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Casey Ryback*
> 
> Damn, so fury is here but it's not here?
> 
> Fury X was always $999 (well the ones with 2 year warranty ie sapphire/XFX)
> 
> 
> 
> I probably wasn't quite up to date on prices then. I'm sure the Fury X first appeared at $1050 for pre-orders but dropped in price as soon as reviews came out showing it's not exactly at 980 Ti level in raw performance.
> 
> Still not seeing Fury today.

If you look for them, there are reviews showing Fury X CrossFire outperforming 980 Ti SLI.









~Ceadder


----------



## Orthello

Quote:


> Originally Posted by *Ceadderman*
> 
> If you look for them, there are reviews showing Fury X CrossFire outperforming 980 Ti SLI.
> 
> ~Ceadder


Love to see a couple of the Asus Furys under waterblocks; those custom PCBs and some unlocked voltage would make for a pretty good 4K gaming rig, I reckon.

More likely the Sapphire Fury will get waterblock support, though, since it's the reference PCB.

Geez, what is the latest on unlocked voltage?


----------



## Sashimi

Quote:


> Originally Posted by *Orthello*
> 
> Love to see a couple of the Asus Furys under waterblocks; those custom PCBs and some unlocked voltage would make for a pretty good 4K gaming rig, I reckon.
> 
> More likely the Sapphire Fury will get waterblock support, though, since it's the reference PCB.
> 
> Geez, what is the latest on unlocked voltage?


EK is making full blocks for the Fury X. If you're going multi-GPU, you might as well get the Fury X if budget allows, since they scale so well. But yeah, the current OC potential is still a massive letdown. Guess it's AMD's usual practice to fix things over time.


----------



## teambigred

Well, the Sapphire Fury Tri-X is up for order here in Australia... $950. That makes it only $50 less than the Fury X or the Palit Jetstream 980 Ti, which sort of takes the 'value' proposition out of the air-cooled version. For $50 less you wouldn't bother, I would think.


----------



## Casey Ryback

Quote:


> Originally Posted by *teambigred*
> 
> Well, the Sapphire Fury Tri-X is up for order here in Australia... $950. That makes it only $50 less than the Fury X or the Palit Jetstream 980 Ti, which sort of takes the 'value' proposition out of the air-cooled version. For $50 less you wouldn't bother, I would think.


That looks like a pre-order gouge price to me.

Should be closer to $800.

$950 makes no sense when you can get full fiji with an AIO for $999.


----------



## teambigred

Quote:


> Originally Posted by *Casey Ryback*
> 
> That looks like a pre-order gouge price to me.
> 
> Should be closer to $800.
> 
> $950 makes no sense when you can get full fiji with an AIO for $999.


Exactly. Although... the pricing on Freesync vs G-Sync monitors is way off as well, and they've remained as such (and are selling).

The MG279Q was US$599 and here $999. The Acer XB270HU was US$799 and here $999.

Have to remember that maths works differently in Australia. When the RRP is $200 less, you don't pass that on whatsoever; just price them the same. When it's $100 less, like the Fury, you pass on only half of it. Sure makes sense to me...

Wouldn't surprise me if that $950 holds for at least a while.


----------



## speedyeggtart

Quote:


> Originally Posted by *teambigred*
> 
> Exactly. Although... the pricing on Freesync vs G-Sync monitors is way off as well, and they've remained as such (and are selling).
> 
> The MG279Q was US$599 and here $999. The Acer XB270HU was US$799 and here $999.
> 
> Have to remember that maths works differently in Australia. When the RRP is $200 less, you don't pass that on whatsoever; just price them the same. When it's $100 less, like the Fury, you pass on only half of it. Sure makes sense to me...
> 
> Wouldn't surprise me if that $950 holds for at least a while.


Maybe because AU automatically charges import duty/GST, which gets applied at retail, making the price appear higher?
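For reference, here's roughly how a US RRP turns into an AU shelf price once GST is applied. GST in Australia is 10%; the exchange rate and retailer margin below are assumed numbers, purely for illustration:

```python
# Sketch of US RRP -> AUD shelf price. GST in AU is 10%;
# the exchange rate and retailer margin are illustrative assumptions.
def aud_shelf_price(usd_rrp, usd_to_aud=1.35, retail_margin=0.05, gst=0.10):
    pre_tax = usd_rrp * usd_to_aud * (1 + retail_margin)
    return pre_tax * (1 + gst)

# Fury X US RRP of $649 at an assumed 1.35 exchange rate:
print(round(aud_shelf_price(649)))  # → 1012
```

Under those assumptions a US$649 card lands right around the AU$999-$1050 we've been seeing, so the "gouge" is at least partly just tax and currency.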


----------



## Ceadderman

Quote:


> Originally Posted by *speedyeggtart*
> 
> Quote:
> 
> 
> 
> Originally Posted by *teambigred*
> 
> Exactly. Although... the pricing on Freesync vs G-Sync monitors is way off as well, and they've remained as such (and are selling).
> 
> The MG279Q was US$599 and here $999. The Acer XB270HU was US$799 and here $999.
> 
> Have to remember that maths works differently in Australia. When the RRP is $200 less, you don't pass that on whatsoever; just price them the same. When it's $100 less, like the Fury, you pass on only half of it. Sure makes sense to me...
> 
> Wouldn't surprise me if that $950 holds for at least a while.
> 
> 
> 
> Maybe because AU automatically charges import/GST which gets applied to retail making the price appear higher?

No offense Mates but am sure glad I don't live in upsidedown land.









Though it won't be too long before my Government hits us with a tax surcharge on imported purchases. Sure hope everyone gets out and votes correctly in 2016. Cause it will happen if this reign of terror continues...

... WHAT'S IT MATTER!


















~Ceadder


----------



## FreeElectron

Quote:


> Originally Posted by *Ceadderman*
> 
> Agreed. If one cannot *wait for a couple months* to see what we have going on here then 390 Nitro is my advice. It's newer and better than 290. And should Fury X2 be what I think it is and owner wishes to further upgrade, he won't take a bath in resale to offset that upgrade. But me I will wait.
> 
> ~Ceadder


I've been away for a while...
Can you explain further what we should be waiting for?


----------



## sugalumps

Nothing, that's what they have been spouting for a while now. "Don't get a 980 Ti, wait for the Fury X, it will blow the Ti away." When that didn't pan out, it became wait on this, wait on that. Goalposts forever being moved on the red side.


----------



## Sheyster

Quote:


> Originally Posted by *FreeElectron*
> 
> I've been away for a while..
> Can you further explain what should we wait for?


Wait for nothing. Life's too damn short.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Sheyster*
> 
> Wait for nothing. Life's too damn short.


This post coupled with the fact that I've been awake for 30 hours just gave me a very visceral vision of myself pulling the trigger on crossfire Fury X.

Damned sheysters trying to take my money...


----------



## Maximization

Fury CrossFire competes with Titan X SLI.
Quote:


> Originally Posted by *Serious_Don*
> 
> This post coupled with the fact that I've been awake for 30 hours just gave me a very visceral vision of myself pulling the trigger on crossfire Fury X.
> 
> Damned sheysters trying to take my money...


After seeing this review I pulled the trigger: two Sapphires, with two more weeks to go for delivery from Amazon.

http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/


----------



## Dhoulmagus

Quote:


> Originally Posted by *Maximization*
> 
> Firy crossfire competes with titan x sli
> after seeing this review i pulled trigger, 2 more weeks to go for delivery from amazon, 2 saphires i want
> 
> http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/


Blarghh. When will it be my turn for >60 FPS at 4K?


----------



## DADDYDC650

I'd go CrossFire Fury X, but I can't get over how GameWorks cripples these cards, and the fact that there's only 4GB of memory.


----------



## Dhoulmagus

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd go Crossfire Fury X but I can't get over how gameworks cripples these cards and only 4GB of memory.


Well, that's why it's called Nvidia GameWorks. Honestly, I've been keen on turning off every single Nvidia feature for many years, like PhysX. It doesn't add or subtract from the game for me. Like The Witcher: OK, there's a cool hair feature; I looked at it, the hair kind of flowed and looked kind of cool. OK, turned it off, tripled my frame rate, went back to the same game!

The innovations are cool, it's annoying that it has to be proprietary but anything they actually make that is worthwhile will eventually be remade in an open source model.

The 4GB thing I understand, though; couple progress with texture laziness and I think using over 4GB is going to become pretty much the norm at 4K in another year or two. If you're the type that upgrades constantly, I guess it's fine, but for me it's something I'll struggle to overlook next year when we have new GPUs, HBM2, and 8GB of it for the same price.
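For a rough sense of scale: the render targets themselves are only a small slice of that 4GB, and it's textures that blow the budget. A toy estimate (the buffer count and 32-bit format are assumptions for illustration):

```python
# Rough VRAM consumed by the main render targets at a given resolution.
# 32-bit color and a 3-surface chain are illustrative assumptions;
# textures, shadow maps, and driver overhead dominate real usage.
def render_target_mb(width, height, bytes_per_pixel=4, surfaces=3):
    return width * height * bytes_per_pixel * surfaces / (1024 ** 2)

print(round(render_target_mb(3840, 2160)))  # 4K → 95 (MB)
```

So under those assumptions the swap chain at 4K is well under 100 MB; the remaining ~3.9GB is what high-res textures eat through, which is why lazy texture streaming is the real threat to a 4GB card.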


----------



## Silent Scone

Precisely; even some NVIDIA users disable GW features. They're definitely not what one could deem game-critical. They might be patchwork effects pushed through abstract middleware layers, but they're one of the only things we currently have driving visual fidelity besides resolution. It's not like developers seem to care to cater to these features without a substantial push from the only two vendors we have.


----------



## iSlayer

Nvidia user here, never used GameWorks. Have used parts of it, HBAO+ (I couldn't really see a difference, just left it on because it was "max") in Blood Dragon and PhysX in a few different games, never a GameWorks game. Largely because most have been deepfried dog.

Outside of the features being added to Killing Floor 2, I've yet to see it be a big deal.


----------



## PostalTwinkie

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd go Crossfire Fury X but I can't get over how gameworks cripples these cards and only 4GB of memory.


You must be looking at something we aren't, because the 4GB of VRAM seems to be doing fine, especially in Crossfire where Fury X is keeping up and beating Titan X. If VRAM was even remotely an issue, we would be seeing it already.

Oh, and the bench I am talking about is at *5*K.
Quote:


> Originally Posted by *iSlayer*
> 
> Nvidia user here, never used GameWorks. Have used parts of it, HBAO+ (I couldn't really see a difference, just left it on because it was "max") in Blood Dragon and PhysX in a few different games, never a GameWorks game. Largely because most have been deepfried dog.
> 
> Outside of the features being added to Killing Floor 2, i've yet to see it be a big deal.


Yup.

Having three desktops in my home right now, mine with the 780 Ti and the other two with AMD cards......can't say I understand the Gameworks issue. The fix seemed pretty damn simple to me all these years;

_"Oh, look, the developer did another piss-poor job implementing a feature. I will just turn it off so it isn't an issue real fast."_

Yet people struggle with that basic concept, and then get pissed when they use their chosen piece of hardware to run software not designed for it. Just turn it off; developers do bugger all implementing Gameworks half the time anyways. Don't worry AMD, you aren't the only ones that get tanked by it!


----------



## Xuper

Any reviews with Windows 10? Can't wait to see the results.


----------



## Maximization

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'd go Crossfire Fury X but I can't get over how gameworks cripples these cards and only 4GB of memory.


Look at the 1:10 mark:

https://www.youtube.com/watch?v=XJYWXHOUoFY

It appears the memory stacks when the game is programmed for it.


----------



## DADDYDC650

Quote:


> Originally Posted by *Maximization*
> 
> look at 1:10 mark
> 
> https://www.youtube.com/watch?v=XJYWXHOUoFY
> 
> its stacking memory it appears when programmed for it.


Never heard of any memory stacking when using multiple cards..... pretty sure that's a DX12 feature and even then it has to be coded into the game.


----------



## iinversion

Quote:


> Originally Posted by *DADDYDC650*
> 
> Never heard of any memory stacking when using multiple cards..... *pretty sure that's a DX12 feature and even then it has to be coded into the game*.


That is correct.
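In other words: under today's AFR-style CrossFire/SLI each card mirrors the whole working set, while DX12 explicit multi-adapter can, if the engine is written for it, split resources across the cards. A purely illustrative toy model of the difference:

```python
# Effective VRAM: mirrored AFR vs. an idealized DX12 pool.
# Purely illustrative; real pooling depends entirely on how the
# engine partitions its resources across adapters.
def effective_vram_gb(per_card_gb, num_cards, pooled=False):
    return per_card_gb * num_cards if pooled else per_card_gb

print(effective_vram_gb(4, 2))               # classic AFR → 4
print(effective_vram_gb(4, 2, pooled=True))  # ideal DX12 pool → 8
```

Even in the best case the pooled figure is an upper bound; duplicated assets (anything both GPUs need every frame) still get stored twice.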


----------



## Themisseble

How much L2 cache does the Tonga core have? I saw only 512KB in the Carrizo slides... some people say 1MB of L2, some even more.


----------



## Thoth420

Almost ready to join... I've had an XFX Fury X since release, but my build is slow going; real life got in the way, and it's all new hardware. I should finally have it finished tonight to test out.

Sadly I don't have another PC, just my Sager for now. Someone made me a killer offer on my last build (less the drives).

Hoping for no pump noise or whine... I never bothered to open up the GPU because I don't think the sticker/etching is indicative of a different revision.


----------



## curlyp

I apologize in advance if this has already been covered (I'm lacking in my thread reading!). When does the R9 Fury release? I thought it was supposed to come out July 14? I've Googled it but cannot find anything on it. Thanks!


----------



## p4inkill3r

Quote:


> Originally Posted by *curlyp*
> 
> I apologize in advance of this has already been covered (lacking in my thread reading!). When does the R9 Fury release? I thought it was supposed to come out July 14? I've googled it but cannot find anything on it. Thanks!


It is available for purchase right now: http://www.nowinstock.net/computers/videocards/amd/r9fury/


----------



## Hattifnatten

Reviews were up on the 10th, and the card launched on the 14th.


----------



## rluker5

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You must be looking at something we aren't, because the 4GB of VRAM seems to be doing fine, especially in Crossfire where Fury X is keeping up and beating Titan X. If VRAM was even remotely an issue, we would be seeing it already.
> 
> Oh, and the bench I am talking about is at *5*K.
> Yup.
> 
> Having three desktops in my home right now, mine with the 780 Ti and the other two with AMD cards......can't say I understand the Gameworks issue. The fix seemed pretty damn simple to me all these years;
> 
> _"Oh, look, the developer did another piss-poor job implementing a feature. I will just turn it off so it isn't an issue real fast."_
> 
> Yet people struggle with that basic concept, and then get pissed when they use their chosen piece of hardware to run software not designed for it. Just turn it off; developers do bugger all implementing Gameworks half the time anyways. Don't worry AMD, you aren't the only ones that get tanked by it!


3GB is still working well for me at 4K. Tests can turn up the VRAM usage (and nearly invisible effects), but I've yet to come across a game where I can't turn settings down and still get smooth performance within the RAM limit, with everything visually relevant still intact.
If you can't see the difference when you enable something, then it doesn't matter.

Always thought GCN would come out on top once they got the consoles. Haven't seen that yet, but I also haven't seen them left behind. I think this card still has a lot of improvement left.


----------



## SpeedyVT

https://www.reddit.com/r/pcmasterrace/comments/3c2kiw/bug_or_nv_cheating_with_optimized_image_quality/

I found something interesting.


----------



## Forceman

Quote:


> Originally Posted by *SpeedyVT*
> 
> https://www.reddit.com/r/pcmasterrace/comments/3c2kiw/bug_or_nv_cheating_with_optimized_image_quality/
> 
> I found something interesting.


It was discussed before. It's nothing, at most a game bug with BF4. There's a whole thread about it.

http://www.overclock.net/t/1563386/texture-filtering-quality-thread-amd-gcn-vs-nvidia-maxwell-and-kepler/30_30#post_24127745

and here:

http://www.overclock.net/t/1563118/ocuk-possible-differences-in-rendered-iq-between-titanx-980ti-and-amd-kepler-cards/0_30

and here:

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/2130_30#post_24126398

It's not a new discovery.


----------



## SpeedyVT

Quote:


> Originally Posted by *Forceman*
> 
> It was discussed before. It's nothing, at most a game bug with BF4. There's a whole thread about it.
> 
> http://www.overclock.net/t/1563386/texture-filtering-quality-thread-amd-gcn-vs-nvidia-maxwell-and-kepler/30_30#post_24127745
> 
> and here:
> 
> http://www.overclock.net/t/1563118/ocuk-possible-differences-in-rendered-iq-between-titanx-980ti-and-amd-kepler-cards/0_30
> 
> and here:
> 
> http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/2130_30#post_24126398
> 
> It's not a new discovery.


A little slow on that discovery... well, it's not just BF4 though; some games like Sleeping Dogs show it too.


----------



## Xuper

New AMD Driver From Microsoft for Windows 10:

;15.20.1012.0000 15.20.1012 150224a2
;15.20.1012.0001 15.20.1012 150227n
;15.20.1012.0002 15.20.1012 150227n 
;15.20.1012.0003 15.20 main wNxt 150311a
;15.20.1012.0004 15.20 main wNxt 150318a
;15.20.1018.0000 15.20.1018 150326a
;15.20.1018.0001 15.20 main wNxt 150401a
;15.20.1018.0002 15.20 main wNxt 150408a
;15.20.1018.0003 15.20 main wNxt 150422a
;15.20.1018.0004 15.20 main wNxt 150422a
;15.20.1023.0000 15.20.1023 150428a
;15.20.1023.0002 15.20 main 150505a 
;15.20.1023.0003 15.20 main 150514a1 
;15.20.1023.0004 15.20 main 150514a1 
;15.20.1023.0005 15.20 main 150522a 
;15.20.1023.0006 15.20 main 150528a 
;15.20.1023.0007 15.20 150602a 
;15.20.1023.0008 15.20 150609a 
;15.20.1023.0009 15.20 150617a 
;15.20.1023.0010 15.20 150624a
;15.20.1045.0000 15.20 150701a
;15.20.1046.0001 15.20 150707a
;15.20.1046.0002 15.20 150715a
;-----------------------------------------------
;----------1507161531-15.20-150715a-184226E.12
; AMD display information file
;
; Installation INF for the AMD display driver.
; Copyright(C) AMD 2007-2015
;-----------------------------------------------
; PX - PX Proxy mode
; PR - PX Proxy Ready

[Version]
Signature="$Windows NT$"
Provider=%ATI%
ClassGUID={4D36E968-E325-11CE-BFC1-08002BE10318}
Class=Display
DriverVer=07/15/2015, 15.200.1046.0002
;;LayoutFile=layout.inf
CatalogFile=C0184226.CAT

I wonder if it works on the Fury X?


----------



## curlyp

Quote:


> Originally Posted by *p4inkill3r*
> 
> It is available for purchase right now: http://www.nowinstock.net/computers/videocards/amd/r9fury/


Thank you very much.
Quote:


> Originally Posted by *Hattifnatten*
> 
> Reviews were up on the 10th, and the card launched on the 14th.


Thanks as well. Is there a respectable site for the reviews of this card?


----------



## Casey Ryback

Quote:


> Originally Posted by *curlyp*
> 
> Thank you very much.
> Thanks as well. Is there a respectable site for the reviews of this card?


http://www.overclock.net/t/1564303/various-amd-r9-fury-reviews


----------



## iLeakStuff

Seems like AMD may have started to get a little better production of Fury X cards going.
Their deal with UMC seems to have helped.









http://www.nowinstock.net/computers/videocards/amd/r9furyx/full_history.php


----------



## PostalTwinkie

Quote:


> Originally Posted by *iLeakStuff*
> 
> Seems like AMD may have started to get a little better production of Fury X cards.
> Deal with UMC seems to have helped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.nowinstock.net/computers/videocards/amd/r9furyx/full_history.php


Or people just stopped buying their busted ass card.

"Overclockers Dream"

EDIT:

On Thursday a Sapphire R9 290X Tri-X shows up at my door; it was a far better purchase than any Fury product on the market. $250 after rebate, and it's going right into this box sitting next to me, for my brother.


----------



## iLeakStuff

Not everyone buys their cards to overclock them though, PostalTwinkie.


----------



## iSlayer

Idk why said people would be on overclock.net then


----------



## iLeakStuff

Quote:


> Originally Posted by *iSlayer*
> 
> Idk why said people would be on overclock.net then


I'm fairly new here, but to me it seems like just a regular tech forum that has both overclockers and people who just enjoy their hardware, stock or not.


----------



## iLeakStuff

AMD's stock is shooting upwards too.








http://finance.yahoo.com/echarts?s=AMD+Interactive#{%22range%22:%225d%22,%22allowChartStacking%22:true}


----------



## SpeedyVT

Quote:


> Originally Posted by *iLeakStuff*
> 
> Stocks for AMD is shooting upwards too
> 
> 
> 
> 
> 
> 
> 
> 
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{%22range%22:%225d%22,%22allowChartStacking%22:true}


China Console Market...

I should've bought.


----------



## Ceadderman

^It just went up from 1.89 to 1.90, so it could be a reasonable investment at 1000 shares. I wouldn't expect it to make anyone rich, but as a short-term placeholder AMD could give you a reasonable return to step up from.
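For scale, here is the arithmetic on that move, using only the prices quoted above (a purely illustrative sketch of a one-cent tick on a 1000-share position):

```python
# Illustrative only: value of a one-cent move on a 1,000-share position,
# using the 1.89 -> 1.90 prices mentioned above.
shares = 1000
buy, sell = 1.89, 1.90

gain = shares * (sell - buy)          # absolute profit in dollars
pct = (sell - buy) / buy * 100        # percentage return on the position

print(f"gain: ${gain:.2f} ({pct:.2f}%)")  # gain: $10.00 (0.53%)
```

So a 1000-share position at those prices costs about $1,890, and that particular tick is worth about $10; any "reasonable return" depends entirely on how far the price keeps moving.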









@iSlayer...

Pardon me for stating this, but the 290X isn't in the league of these busted ass cards. It shows well, as it should, since it was top of the pile for a while.

The "busted ass", however, can be attributed to the 3-5% product margins I have seen reported. Voltage is locked (temporarily), and the card can be software-clocked via MSI Afterburner.

I really don't see why people expect BRAND NEW tech (HBM) to be perfected by the day of launch. There will always be bugs and kinks you *may* run into with new tech. But Fury X indeed works. It may not meet your expectations, but it's certainly not "busted ass" imho.









~Ceadder


----------



## PostalTwinkie

Quote:


> Originally Posted by *iLeakStuff*
> 
> Not everyone buys their cards to overclock them though PostalTwinkie.


I didn't say everyone does, and that doesn't matter.

AMD said "It is an overclocker's dream", and that is flat out a lie. The card is a complete dog in every way when it comes to overclocking. They made a claim aimed at one specific demographic, and it was a lie.

I'd also argue that, within the already small group who buy the card at all, those who won't OC are an extreme minority.

Quote:


> Originally Posted by *Ceadderman*
> 
> Pardon me for stating this, but 290x isn't in the league of these busted ass cards. It shows well as it should since it was top of the pile for awhile.
> 
> The "busted ass" however can be attributed to percentage of 3/5% product margins from what I have seen reported. Voltage locked(temporarily) and can be software clocked via MSi Afterburner.
> 
> I really don't see why people expect BRAND NEW tech(HBM) to be perfected by the day of launch. There will always be bugs and kinks you *may* run into with new tech. But Fury X indeed works. It may not meet your expectations, but it's certainly not "busted ass" imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Have you not seen the temps on the thing? Software isn't going to fix that.


----------



## Ceadderman

54-64C while gaming and benchmarking? How's that bad for an AIO solution?

If you want cooler, get a block.









Am I missing something? What temperature issues are we talking about?









~Ceadder


----------



## Tivan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> AMD said "It is an overclocker's dream", and that is flat out a lie.


The core stays cool no matter how much you overclock it, how's that for an OC dream! c:


----------



## Themisseble

WoW
http://finance.yahoo.com/echarts?s=AMD+Interactive#{"range":"5d","allowChartStacking":true}

What happened?


----------



## SpeedyVT

Quote:


> Originally Posted by *Themisseble*
> 
> WoW
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{"range":"5d","allowChartStacking":true}
> 
> What happened?


China and the console market.


----------



## iLeakStuff

Quote:


> Originally Posted by *Themisseble*
> 
> WoW
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{"range":"5d","allowChartStacking":true}
> 
> What happened?


Windows 10 definitely helps AMD because it means new notebook sales carrying AMD's newest Carrizo APUs.

Plus, like Speedy says, China just removed its ban on consoles, and they all run AMD APUs. Rumor has it that Nintendo will use an AMD APU in their upcoming console too.


----------



## svenge

Quote:


> Originally Posted by *Themisseble*
> 
> WoW
> http://finance.yahoo.com/echarts?s=AMD+Interactive#{"range":"5d","allowChartStacking":true}
> 
> What happened?


Dead cat bounce.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ceadderman*
> 
> 54-64c gaming and benchmarking? How's that bad for an AIO solution?
> 
> If you want cooler, get a block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Am I missing something? What temperature issues are we talking about?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Sure, completely locked down and out of the box the temps are OK. Unlock the voltage and start to OC the thing, like the "Overclocker's dream" that it is. Not only does power draw go through the roof, but the individual components really start heating up.

Discussion happening in a couple of threads now on it, the voltage scaling thread is a good one.

This card is a pile.


----------



## NuclearPeace

Apparently pumps begin to wear faster and/or die prematurely if they are heated above 70C. I doubt that's going to be the pump temperature even if the core is running at 70C. The waterblock isn't a perfect heatsink, and some residual heat from the GPU core is never going to reach the actual pump.


----------



## PostalTwinkie

Quote:


> Originally Posted by *NuclearPeace*
> 
> Apparently pumps begin to wear faster and/or die prematurely if they are heated above 70c. I doubt that's going to be the pump temperature even though the core is running at 70c. The waterblock isnt a perfect heatsink and there is still going to be some latent heat energy from the GPU core that isn't reaching the actual pump.


The 95C VRMs (under a hard OC), which make some form(ish) of contact with the copper pipe, are going to contribute to the loop temperatures as well. If AMD managed to make good enough contact to cool them (as intended), they will contribute greatly to those loop temps.


----------



## xxdarkreap3rxx

Quote:


> Originally Posted by *PostalTwinkie*
> 
> AMD said "It is an overclocker's dream", and that is flat out a lie.


They said "dream" not "reality"


----------



## Randomdude

Ordered a Sapphire R9 Fury X; should have it within the next couple weeks (a safe measure of time). I always had a gut feeling about this card, and the green side isn't about to sway me now.

It will be a very weird configuration with an E5450 and that GPU, but I hope it won't be too horribly bottlenecked. I will leave my first impressions when I get them.


----------



## iinversion

Quote:


> Originally Posted by *Randomdude*
> 
> Ordered a Sapphire R9 Fury X, should have it within the next couple weeks (safe measure of time). I always had a gut feeling about this card and the green side isn't about to sway me now.
> 
> Will be a very weird configuration with an E5450 and that GPU, but I hope it won't be too horribly bottle-necked. I will leave my first impressions when I get them.


Bottlenecked is an understatement.

You could have picked up a mid-range card and still been bottlenecking it. Basically, with that CPU it wouldn't matter if you had a Fury X or an R9 270; you're going to be bottlenecking so hard they would seem to have the same performance in games.

LGA 775 also lacks the newer PCI-E generations. Depending on your board you could be two generations behind, and that will be another bottleneck.


----------



## iSlayer

Quote:


> Originally Posted by *iLeakStuff*
> 
> I`m fairly new here but to me it seems like just a regular tech forum that have both overclockers and people who just enjoy their hardware, stock or not


Being on OCN and not OCing is basically a sin. The tag line is "the pursuit of performance".
Quote:


> Originally Posted by *Ceadderman*
> 
> @iSlayer...
> 
> Pardon me for stating this, but 290x isn't in the league of these busted ass cards. It shows well as it should since it was top of the pile for awhile.
> 
> The "busted ass" however can be attributed to percentage of 3/5% product margins from what I have seen reported. Voltage locked(temporarily) and can be software clocked via MSi Afterburner.


Dunno what comment of mine that's in reference to. I can't see it.
Quote:


> I really don't see why people expect BRAND NEW tech(HBM) to be perfected by the day of launch. There will always be bugs and kinks you *may* run into with new tech. But Fury X indeed works. It may not meet your expectations, but it's certainly not "busted ass" imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


It need not be perfect, but locked memory, no VRM cooling, poor scaling with voltage, and bad overclocking at stock, combined with being called "an overclocker's dream", isn't going to win AMD points.

It's like when that Intel VP insulted everyone by saying 4790Ks do 5GHz easily on air and anyone that can't manage it needs to be educated on overclocking.
Quote:


> Originally Posted by *Ceadderman*
> 
> 54-64c gaming and benchmarking? How's that bad for an AIO solution?
> 
> If you want cooler, get a block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Am I missing something? What temperature issues are we talking about?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


VRMs do 95C stock, like the Titan X, which further inhibits OCing on the Fury X. The backplate on it does nothing.
Quote:


> Originally Posted by *Tivan*
> 
> The core stays cool no matter how much you overclock it, how's that for an OC dream! c:


Guessing sarcasm.
Quote:


> Originally Posted by *iLeakStuff*
> 
> Windows 10 def helps AMD because it means new notebook sales that carry AMD newest Carrizo APUs.
> 
> Plus like Speedy says, China just removed ban from consoles and they all run AMD APUs. Rumor has it that Nintendo will use AMD APU in their upcoming console too.


Carrizo laptops seem hard to come by. Intel also has processors in laptops, y'know.

As for the consoles, it's a short-term bump. Consoles are low-margin and not about to save AMD. 16nm GPUs will mark the death of the current consoles; they're already matched in ports by $400 custom builds.


----------



## NexusRed

Quote:


> Originally Posted by *iSlayer*
> 
> Being on OCN and not OCing is basically a sin. The tag line is "the pursuit of performance".


Lol, didn't know OCN turned into a religion. I have a 4690K at stock speeds. I'll overclock whenever it's convenient to. Just because we have the capability to OC doesn't mean we have to. It's best to get that "sin" notion out of your mind and just be happy that we are all in the same boat.


----------



## SpeedyVT

Quote:


> Originally Posted by *NexusRed*
> 
> Lol, didn't know OCN turned into a religion. I have a 4690K at stock speeds. I'll overclock whenever it's convenient to. Just because we have the capability to OC doesn't mean we have to. It's best to get that "sin" notion out of your mind and just be happy that we are all in the same boat.


I think undervolting counts. Getting as much as possible from as little as possible.


----------



## iSlayer

That wasn't meant to be taken so seriously.

As for undervolting, it definitely does. More power or power per dollar, volt, or watt is always good.


----------



## Randomdude

Quote:


> Originally Posted by *iinversion*
> 
> Bottlenecked is an understatement.
> 
> You could have picked up a mid range card and still been bottlenecking it. Basically with that CPU it wouldn't matter if you had a Fury X or a R9 270, you're going to be bottlenecking so hard they would seem to have the same performance in games.
> 
> LGA 775 also lacks newer PCI-E gen.. You could be two generations behind depending on your board and that will be another bottleneck.


Honestly?

I guess I knew all that when I bit the bullet. Seeing how much I actually spent, I could've gotten something north of a Sandy Bridge i5, upgraded motherboard and got new ram as well. But I would've had way less reason to tinker with it, I wouldn't have felt the same attachment to the system. I've never tinkered with my hardware before, wanted to, but never had the option. I'd love to overclock this Xeon (another thing I like is that I will get to play with a Xeon chip, modifying bios with micro codes for it, some minor hardware tweaks and so on - all new stuff for me, really) and see what it can do with a CLC (CM SEIDON) and how it'll compare to some newer systems, plus the SSD should really help get this old platform back to life.

Using a newer platform I wouldn't have bothered with any of that, as I wouldn't have had a reason to: it would already be running as well as it can for 1080p, desktop work and World of Warcraft (the only game I play), with nothing worthwhile left for me to change in real-life use. For example, I will need to overclock the Xeon to stream with good quality settings at a high resolution; I wouldn't have had to do that with a 2600k, it's already good enough. And at the end of the day, it's all a hobby and you should do what pleases you, right? If I tried to justify myself and find the logical path, I couldn't - and that path, to me, wouldn't have brought the same satisfaction. In hindsight I should've bought a laptop instead, it would've been much more useful. But I wanted to scratch an itch! And become more involved with this hobby.

Either way, I wouldn't feel the bottleneck even if it were there, I imagine - I never knew better. Coming from a stock E6300, no SSD and a really, really old 1950, the contrast should be pretty substantial!

It does have an older generation PCI Express, but it doesn't seem to hurt it too much.

PCI-E gen. comparisons below:






Also, in this video with a GTX 670, which is a much slower card but worth noting as an example: the CPU is never at 100% on all cores (across the whole video it fluctuated between the high 30s, low 40s, and up to the high 80s), while GPU utilization is 99% the whole time. And the CPU is at 3.6GHz, not higher.
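For context on the PCI-E generation comparisons above, per-lane bandwidth roughly doubles each generation. A quick sketch of the approximate effective x16 totals (standard published rates, not numbers from the thread):

```python
# Approximate effective per-lane PCIe bandwidth in GB/s.
# Gen 1.x/2.0 use 8b/10b encoding; gen 3.0 uses 128b/130b,
# so its usable rate is just under the raw 8 GT/s.
per_lane_gbs = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985}

for gen, bw in per_lane_gbs.items():
    print(f"PCIe {gen} x16 ~ {bw * 16:.1f} GB/s")
```

Which is why a fast card on a PCI-E 1.1 board loses some performance but isn't crippled: most games don't come close to saturating even ~4 GB/s over the bus.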


----------



## iinversion

Quote:


> Originally Posted by *Randomdude*
> 
> Honestly?
> 
> I guess I knew all that when I bit the bullet. Seeing how much I actually spent, I could've gotten something north of a Sandy Bridge i5, upgraded motherboard and got new ram as well. But I would've had way less reason to tinker with it, I wouldn't have felt the same attachment to the system. I've never tinkered with my hardware before, wanted to, but never had the option. I'd love to overclock this Xeon (another thing I like is that I will get to play with a Xeon chip, modifying bios with micro codes for it, some minor hardware tweaks and so on - all new stuff for me, really) and see what it can do with a CLC (CM SEIDON) and how it'll compare to some newer systems, plus the SSD should really help get this old platform back to life.
> 
> Using a newer platform I wouldn't have bothered with any of that as I wouldn't have had a reason to, seeing as it would already be running as good as it can on 1080p, desktop operations and World of Warcraft (only game I play) for me to be able to do anything to cause a worthwhile difference in real-life operations. For example, I will need to overclock the Xeon to stream with good quality options and at a high resolution, I won't have to do that with a 2600k, it's already good enough. And at the end of the day, it's all a hobby and you should do what pleases you, right? If I'm trying to justify myself, find the logical path, I couldn't - and that path, to me, it wouldn't have brought the same satisfaction. I should've bought a laptop instead in hindsight, it would've been much more useful. But I wanted to scratch an itch! And become more involved with this hobby.
> 
> Either way, I wouldn't feel the bottleneck even if it was there, I imagine - I never knew better. Coming from a stock E6300, no SSD and a really, really old 1950. The contrast should be pretty substantial!
> 
> It does have an older generation PCI Express, but it doesn't seem to hurt it too much.
> 
> PCI-E gen. comparisons below:
> 
> 
> 
> 
> 
> 
> Also, in this video with a GTX 670, which is a way slower card but worth noting the example - the CPU is never at 100% on all cores (in the whole video it was fluctuating between the high 30's, low 40's and up to the high 80's), and the GPU utilization is 99% during the whole video. And the CPU is on 3,6GHz, not higher.


I'm not going to comment on everything you said, but it does indeed look like there is not very much difference between PCI-E 1.1 and 3.0. HOWEVER, as the performance of the card increases, so does the difference, so I would expect bigger gaps than what you see there.

In addition, since you mentioned you only play WoW, I feel like at this time AMD is a bad choice for that game.



As you can see, even a GTX 670 is quite a bit ahead of a 290X. I imagine a Fury X would probably be on par with a 670 or 680 in WoW. This is because of AMD's DX11 driver CPU overhead, as WoW and many other MMORPGs depend heavily on the CPU. AMD is pretty far behind Nvidia in terms of CPU driver overhead at this time. Who knows what will happen in the future.

I understand the upgrade itch, but at the same time buying a Fury X to play WoW at 1080p w/ PCI-E 1.1 and an older CPU like that just isn't logical at all to me, for a few reasons:

- Fury/Fury X are notably bad at 1080p for their price compared to Nvidia equivalents.
- Fury/Fury X is already going to be a bottleneck with your CPU.
- The AMD DX11 CPU driver overhead issue mentioned before. This is relevant because you play only WoW, where it matters a lot, and on top of that you already have an older/weaker CPU which isn't helping you overcome the driver overhead.

If I were you and I only played WoW and had your CPU, I'd probably pick up a 960 or 970 and call it a day. They will still be a bottleneck, but you'll get more FPS in WoW while saving a bunch of money.

Like you said though, you could have just picked up a laptop and that would have been fine for your uses anyway.


----------



## Randomdude

Well, I'm not sure if you'd believe me, but I mostly wanted to go with AMD because they're the underdog and I believe in putting my money where my mouth is, especially since I won't notice a difference in my use. That played a huge part in my decision on vendor. I also figured that since I was going to make a useless purchase anyway (it would've been a 980 Ti otherwise - I wanted a high-end card and they were the same price - but I'd only have done that had Nvidia been the one with the 20% market share), in real use it would practically perform the same given the CPU, 1080p and 60Hz, probably in every game. I forgot to mention, I'm playing on Wrath of the Lich King, purely because of my Player-versus-Player preferences. Yeah, given DX9, ancient graphics, a much lower population and fewer CPU tasks, I'd wager an Nvidia 630 would run it at 60 FPS.

It really does make no sense what I did though, does it >_>

Thank you for your feedback on the decision. I'm unsure of it myself, but I hope what I did is at least a little bit relatable.


----------



## friend'scatdied

Correct me if I'm wrong, but isn't WoW almost completely CPU-bottlenecked in general?

For WoW, a Fury X seems like $650 wasted given the E5450.


----------



## Randomdude

Yes, anything above a GT630 is wasted money on WotLK. But I didn't want a 630.


----------



## magnek

Quote:


> Originally Posted by *svenge*
> 
> Dead cat bounce.


That was my first thought as well, and I came *this* close to shorting a few thousand shares, thinking it must've been a short squeeze since there was absolutely no positive news pertaining to AMD that day. Luckily I didn't.
Quote:


> Originally Posted by *NuclearPeace*
> 
> Apparently pumps begin to wear faster and/or die prematurely if they are heated above 70c. I doubt that's going to be the pump temperature even though the core is running at 70c. The waterblock isnt a perfect heatsink and there is still going to be some latent heat energy from the GPU core that isn't reaching the actual pump.


Coolant temp will always be lower than core temp, usually by quite a bit. For example, gaming temps on my 4930K are around 45-55C, and the 970s typically reach between the low 40s and 50C. The water temp, however, has never gone above 35C, even during the hottest load scenarios.
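That gap between core and coolant temperature follows from basic thermodynamics: at steady state the coolant only rises by dT = P / (mdot * c_p) per pass through the block. A rough sketch with assumed numbers (~300 W of card heat, ~1 L/min of AIO pump flow; neither figure is from the thread):

```python
# Steady-state coolant temperature rise per pass through the block:
# dT = P / (mdot * c_p). All input numbers are assumptions for illustration.
power_w = 300.0          # heat dumped by the card, watts (assumed)
flow_lpm = 1.0           # pump flow, litres per minute (assumed)

mdot = flow_lpm / 60.0   # mass flow in kg/s (water is ~1 kg per litre)
c_p = 4186.0             # specific heat of water, J/(kg*K)

dT = power_w / (mdot * c_p)
print(f"coolant rise per pass: {dT:.1f} C")  # coolant rise per pass: 4.3 C
```

Only a few degrees per pass, which the radiator then sheds; the big core-to-water gap comes from the thermal resistance of the die, TIM, and cold plate, not from the water itself heating up.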
Quote:


> Originally Posted by *xxdarkreap3rxx*
> 
> They said "dream" not "reality"


Ha good one.


----------



## Sheyster

Quote:


> Originally Posted by *NexusRed*
> 
> Lol didn't know OCN turned in a religion. I have a 4690K at stock speeds. I'll overclock whenever it's convient too. Just because we have the capabilities to OC doesn't mean we have too. It's best to get your " sin" notion out of your mind and just be happy that we are all in the same boat.


You have a K series CPU, you also have a custom loop, and you're NOT overclocked?

Alrighty then...


----------



## Ceadderman

Only plays WoW, gets Fury X?









~Ceadder


----------



## Farih

Quote:


> Originally Posted by *iinversion*
> 
> I'm not going to comment on everything you said but it does indeed look like there is not very much difference between PCI-E 1.1 and 3.0, HOWEVER, as the performance of the card increases so the the difference so I would expect greater differences than what you see there.
> 
> In addition since you mentioned you only play WoW, I feel like at this time AMD is a bad choice for that game.
> 
> 
> 
> As you can see, even a GTX 670 is quite a bit ahead of a 290X. I imagine a Fury X would probably be on par with a 670 or 680 in WoW. This is because of AMD's DX11 driver CPU overhead as WoW and many other MMORPG's depend heavily on the CPU. AMD is pretty far behind Nvidia in terms of CPU driver overhead at this time. Who knows what will happen in the future.
> 
> I understand the upgrade itch, but at the same time buying a Fury X to play WoW at 1080p w/ PCI-E 1.1 and an older CPU like that just isn't logical at all to me for a few reasons.
> 
> 
> Fury/Fury X are notably bad at 1080p for their price compared to Nvidia equivalents.
> Fury/Fury X is already going to be a bottleneck with your CPU.
> AMD DX 11 CPU driver overhead issue as mentioned before. This is relevant because you play only WoW where this matters a lot and on top of that you already have an older/weaker CPU which isn't helping you to overcome the driver overhead.
> If I were you and I only played WoW and had your CPU, I'd probably pick up a 960 or 970 and call it a day. They will still be a bottleneck, but you'll get more FPS in WoW while saving a bunch of money.
> 
> Like you said though, you could have just picked up a laptop and that would have been fine for your uses anyway.


That benchmark gets posted a lot, but it doesn't paint a complete picture.
The benchmark was taken on a live realm, I think. How many players were there in each test? (Actual live players on your screen are what impacts WoW performance.)

I have a 960 OC (1480MHz 24/7) that runs WoW at 1080p with shadows on Good and MSAA x4.
This keeps 60fps locked most of the time, except for big multiplayer battles. (That's a CPU problem.)
Any setting higher makes the 960 crap out.

The other PC has a 290X OC (1150MHz 24/7) that runs WoW at *1800p* with shadows at Ultra and MSAA x2.
Same FPS as the 960.
At 1440p it can run with MSAA x8 or even SMAA.
The 960 can't even do 1440p with proper settings.

Yes, for the same price an Nvidia card is better in WoW, but the differences are not nearly as large as this benchmark shows.
Quote:


> Originally Posted by *Ceadderman*
> 
> Only plays WoW, gets Fury X?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


To play WoW above 1080p at "ultra" with some AA you DO need something rather powerful.
A 970/980 would be my best bet, but a Fury would do fine too, IMO.
Trust me, your computer (in sig) can't even run it in all its glory at 1080p without AA; it's not the same game from 2005 anymore.


----------



## Randomdude

Quote:


> Originally Posted by *Ceadderman*
> 
> Only plays WoW, gets Fury X?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


If it doesn't affect your freedom and I've already explained why I did it to a person who asked - what's the point of this comment? Rather redundant, wouldn't you say? This hobby is becoming unpleasant to share here for me because of people like you.


----------



## erocker

Quote:


> Originally Posted by *Randomdude*
> 
> If it doesn't affect your freedom and I've already explained why I did it to a person who asked - what's the point of this comment? Rather redundant, wouldn't you say? This hobby is becoming unpleasant to share here for me because of people like you.


It's alright man, he apparently uses his GPU for posting snide judgmental comments in internet forums. No worries mate.


----------



## Ceadderman

Quote:


> Originally Posted by *Randomdude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Only plays WoW, gets Fury X?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If it doesn't affect your freedom and I've already explained why I did it to a person who asked - what's the point of this comment? Rather redundant, wouldn't you say? This hobby is becoming unpleasant to share here for me because of people like you.
Click to expand...

Meant no offense. I just don't understand the need for a high-powered GPU for a game like WoW. You could get a cheaper GPU if that's the only game being played at 1080p. Seems like a massive investment for a game you pay to play, imho. Or, as posted earlier, upgrading the surrounding hardware would be a better investment. But that's just me.

If it's what you want to do, it's your money and your investment. Nothing I say or do changes anything.









~Ceadder


----------



## iinversion

Quote:


> Originally Posted by *Farih*
> 
> That benchmark is posted alot but doesnt hold a good complete picture of the truth.
> The benchmark is taken on a live realm i think, how many players was there in each test ? (live actual players on your screen is what impacts WoW performance)
> 
> I have a 960 OC (1480mhz 24/7) that runs WoW at 1080P with shadows to good and MSAAx4
> This keeps 60fps locked most of the times except for big multiplayer battles. (CPU problem there)
> Any setting higher make's the 960 crap out.
> 
> Other PC got a 290x OC (1150mhz 24/7) that runs WoW at *1800P* with shadows at ultra and MSAAx2
> Same FPS as the 960.
> At 1440P it can run with MSAAx8 or even SMAA.
> The 960 cant even do 1440p with proper settings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes for the same price an Nvidia card is better in WoW but the difference are not nearly as how this benchmark shows them.
> To play WoW above 1080p at "ultra" with some AA you DO need something rather powerfull.
> A 970/980 would be my best bed, but a Fury would do fine to IMO.
> Trust me your computer (in sig) cant even run it in all its glory on 1080p without AA, its not the same game from 2005 anymore.


I have never played WoW and never plan to so I know nothing about it.

Could the large difference be down to what they were doing when they pulled those FPS numbers? When you look at the benchmark it says "Azeroth's Final Battle" or whatever. I'm guessing that is some super intense battle.

I compared WoT with someone else who had a Fury. We went into a custom battle and took note of our FPS in the same locations. Depending on the location/scene, the FPS was either very similar or heavily in my 980's favor. I'm guessing WoW would produce similar results.


----------



## STEvil

My gf runs WoW on a 7850K and 750 Ti at 1080p, buttery smooth. FSAA isn't maxed but everything else is. Even the 7850K by itself will run WoW acceptably with a bit of an overclock to the GPU and NB.

What's interesting is that my friend is using my old GTX 670 and it will hardly run WoW at all recently, but I think it might be dying, given that it's an Asus DCU-II card (they don't seem to last long; this will be the 4th I've had die on me).


----------



## Ceadderman

Afaik, there are no massively multiplayer games optimized to require a high-end card, nor do they benefit from one. Frame rate matters, but it's only critical in the most demanding kinds of gaming, imho, that being single-player/multiplayer gaming that isn't subscription based.









~Ceadder


----------



## Farih

Quote:


> Originally Posted by *iinversion*
> 
> I have never played WoW and never plan to so I know nothing about it.
> 
> Could the large difference be because of what they were doing when they pulled those FPS? When you look at the benchmark it says Azeroths final battle or whatever. I'm guessing that is some super intense battle or whatever.
> 
> I compared WoT with someone else that had a Fury. Went into a custom battle and took note of our FPS in same locations. Depending on the location/scene the FPS was either very similar or highly in my 980's favor. I'm guessing WoW will produce similar results.


Yes, it is all about what they were doing, and especially at what time.
They played "Azeroth's Final Battle", which is just the intro to the new "Outlands" through the Dark Portal (WoD).
That's also on live servers where there are other people playing; other players are what affects your performance in WoW.
WoW is very CPU-bound.

How many players were on the screen when testing?
It's almost impossible to test WoW properly.

Yes, Nvidia cards do better in WoW, but not nearly as much better as this benchmark wants to show.
Quote:


> Originally Posted by *Ceadderman*
> 
> Afaik, there are no massively mutiplayer games optimized to require a high end card nor do they benefit from them. Frame rates are beneficial but it's only necessary in the most optimal of gaming imho. That being single player/mutiplayer gaming that isn't subscription based.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Mostly they don't, no, BUT it depends how you play it.
I like to downsample and use AA; even WoW can tax my GPU heavily then, and there are much heavier/better-looking MMOs than WoW around too.
Quote:


> Originally Posted by *STEvil*
> 
> My gf runs wow on a 7850k and 750ti 1080p buttery smooth. Fsaa isnt maxed but everything else is. Even the 7850k by itself will run wow acceptably with a bit of an overclock to the gpu and nb.
> 
> Whats interesting is my friend is using my old gtx670 and it wont hardly run wow at all recently, but i think it might be dying too given that it's an asus dcu-ii card (they don't seem to last long, this will be the 4th I've had for on me).


Is your GF still playing in the old parts?
There's no way you're going to have solid FPS on a 7850K and 750 Ti at max settings (even without AA) in all the new areas.
Go try a round of Ashran.









Anyways... enough about WoW and back on topic? (It's hard for me to stop once I start talking WoW.)


----------



## STEvil

No, she's probably got a couple thousand hours in wow.. lol.


----------



## SpeedyVT

Quote:


> Originally Posted by *Farih*
> 
> Yes, it's all about what they were doing, and especially at what time.
> They played "Azeroth's Final Battle", which is just the intro to the new Outlands through the Dark Portal (WoD).
> That's also on live servers with other people around, and other players are what really affects your performance in WoW.
> WoW is very CPU-bound.
> 
> How many players were on screen when testing?
> It's almost impossible to test WoW properly.
> 
> Yes, Nvidia cards do better in WoW, but not nearly as much as this benchmark suggests.
> Mostly they don't, no, BUT it depends on how you play. I like to downsample and use AA; even WoW can tax my GPU heavily then, and there are much heavier/better-looking MMOs than WoW around too.
> Is your GF still playing in the old zones? There's no way you're getting solid FPS on a 7850K and 750 Ti at max settings (even without AA) in all the new areas.
> Go try a round of Ashran
> 
> Anyways... enough about WoW and back on topic? (It's hard for me to stop once I start talking WoW.)


The 7850K has way more IPC than the old FX processors; it's just lacking threads. Well, at least in the instruction set WoW depends on most: SSE.


----------



## iinversion

Quote:


> Originally Posted by *SpeedyVT*
> 
> 7850k has way more IPC than the old FX processors. It's just thread lacking. Well in the instruction set WoW is most dependent on. SSE.


If by "way more" you mean ~10%. That's still less than Nehalem... hardly "way more".


----------



## SpeedyVT

Quote:


> Originally Posted by *iinversion*
> 
> If by way more you mean ~10%. That is still less than Nehalem.. Hardly way more..


Nehalem is really slow now. Kaveri is, at a guess, ~10% faster than Nehalem, although it does suffer in some areas comparatively. Then again, we're comparing a 2009 chip to a 2014 one, on significantly different process nodes. The A10-7700K is weaker in multi-threading than the i7-960, but faster in single-thread.

-5% multi and +10% single for Kaveri over Nehalem.

Even Piledriver is as fast as Nehalem in single.

Piledriver
http://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8300+Eight-Core

Nehalem
http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7+960+%40+3.20GHz

Kaveri
https://cpubenchmark.net/cpu.php?cpu=AMD+A10-7700K+APU

Intel's largest IPC jump was from the i7-900 series to the i7-2600K. Every iteration since then has been about a 5% improvement, clock for clock. The newer ones pulled ahead mostly because they could hit higher overclocks.

A lot of the IPC improvements in newer chips came from new instruction sets, and games like WoW still depend on the older ones, so ideally you shouldn't see too much of a landslide in WoW. SC2 is a different story altogether; newer instruction sets are involved there.


----------



## iinversion

Quote:


> Originally Posted by *SpeedyVT*
> 
> Nehalem is really slow now. Kaveri is, at a guess, ~10% faster than Nehalem, although it does suffer in some areas comparatively. Then again, we're comparing a 2009 chip to a 2014 one, on significantly different process nodes. The A10-7700K is weaker in multi-threading than the i7-960, but faster in single-thread.
> 
> -5% multi and +10% single for Kaveri over Nehalem.
> 
> Even Piledriver is as fast as Nehalem in single-thread.
> 
> Piledriver
> http://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8300+Eight-Core
> 
> Nehalem
> http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7+960+%40+3.20GHz
> 
> Kaveri
> https://cpubenchmark.net/cpu.php?cpu=AMD+A10-7700K+APU
> 
> Intel's largest IPC jump was from the i7-900 series to the i7-2600K. Every iteration since then has been about a 5% improvement, clock for clock. The newer ones pulled ahead mostly because they could hit higher overclocks.
> 
> A lot of the IPC improvements in newer chips came from new instruction sets, and games like WoW still depend on the older ones, so ideally you shouldn't see too much of a landslide in WoW. SC2 is a different story altogether; newer instruction sets are involved there.


You are wrong and those are horrible ways to compare chips.

Bulldozer was around 5-10% slower than Deneb at the same clock
Piledriver is about 1-2% slower than Deneb at the same clock
Steamroller(Kaveri) is about 10% faster than Piledriver at the same clock
Nehalem is around 20-25% faster than Deneb at the same clock

Based on this, how do you figure Kaveri is 10% faster than Nehalem clock for clock? That would be the same as saying Nehalem is equal to Deneb clock for clock, and we all know that is not true. Unless you are only counting cases where the new instruction sets are used, what I said above holds.

http://cpu.userbenchmark.com/Compare/Intel-Core-i7-880-vs-AMD-A10-7700K-APU-R7-Graphics/m15321vsm9888

Even clocked 300 MHz lower, it is still 27% faster than Kaveri in single-thread on average.
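Treating the clock-for-clock figures above as rough multipliers (they are forum estimates, not measured data), the chain can be composed to sanity-check the claim; a minimal Python sketch:

```python
# Composing the clock-for-clock ratios quoted above. All figures are the
# poster's rough estimates (midpoints taken where a range was given), so
# this is only a sanity check, not a measurement.

# Performance relative to Deneb at the same clock:
piledriver  = 1 - 0.015          # ~1-2% slower than Deneb
steamroller = piledriver * 1.10  # Kaveri ~10% faster than Piledriver
nehalem     = 1 + 0.225          # ~20-25% faster than Deneb

# Nehalem vs Kaveri, clock for clock:
advantage = nehalem / steamroller - 1
print(f"Nehalem lead over Kaveri: {advantage:.1%}")  # roughly +13%
```

By these numbers Nehalem keeps a low-teens lead over Steamroller clock for clock, which points in the same direction as the benchmark links above.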


----------



## SpeedyVT

Quote:


> Originally Posted by *iinversion*
> 
> You are wrong and those are horrible ways to compare chips.
> 
> Bulldozer was around 5-10% slower than Deneb at the same clock
> Piledriver is about 1-2% slower than Deneb at the same clock
> Steamroller(Kaveri) is about 10% faster than Piledriver at the same clock
> Nehalem is around 20-25% faster than Deneb at the same clock
> 
> Based on this, how do you figure Kaveri is 10% faster than Nehalem clock for clock? That would be the same as saying Nehalem is equal to Deneb clock for clock and we all know that is not true. Unless you are accounting for only when the new instruction sets are used, otherwise any other time what I just said above is accurate.
> 
> http://cpu.userbenchmark.com/Compare/Intel-Core-i7-880-vs-AMD-A10-7700K-APU-R7-Graphics/m15321vsm9888
> 
> Even clocked 300MHz slower it is still 27% faster than Kaveri single thread on average.


Sorry I can't trust that site.


----------



## iinversion

Quote:


> Originally Posted by *SpeedyVT*
> 
> Sorry I can't trust that site.


More accurate than the links you posted. Regardless, I don't base my information off that site; what I stated is pretty common knowledge.

http://www.anandtech.com/bench/product/192?vs=1491

Again, Nehalem clocked 300MHz lower and scoring 23% faster in single thread. If AnandTech isn't reputable then I don't know what to tell you.

Steamroller is hardly faster than the Core2 architecture in terms of IPC.


----------



## SpeedyVT

Quote:


> Originally Posted by *iinversion*
> 
> More accurate than the links you linked. Regardless, I don't base my information off that site. What I stated is pretty common knowledge.
> 
> http://www.anandtech.com/bench/product/192?vs=1491
> 
> Again, Nehalem clocked 300MHz lower and scoring 23% faster in single thread. If AnandTech isn't reputable then I don't know what to tell you.
> 
> Steamroller is hardly faster than the Core2 architecture in terms of IPC.


The problem with CPU benchmarks is that they are not representative of actual use. While I was wrong to use PassMark to make my point, various other websites are wrong too. CPUs do not work the way benchmarks stress them: in real use a CPU is subjected to several different instruction sets simultaneously and threads them accordingly, while a benchmark focuses on the exact stress pattern of its own key instructions. This improperly depicts generalized use and real-life performance. The greatest problem in CPUs is how fast they can switch from one instruction set to the next.


----------



## Slink3Slyde

Read this article earlier today on Techspot. It's comparing ten years of Intel CPUs, but they have an 8350 and an A10-7870K thrown in for comparison. Synthetics, applications, encoding and gaming tested. Go ahead, fellas.

http://www.techspot.com/article/1039-ten-years-intel-cpu-compared/page2.html

Meanwhile, back closer to topic: I'm waiting for price drops on Fury (X)s or 980s; if it takes too long, I think my 780 can hold out until Pascal. I've had an itchy trigger finger for too long now.


----------



## Themisseble

Quote:


> Originally Posted by *iinversion*
> 
> More accurate than the links you linked. Regardless, I don't base my information off that site. What I stated is pretty common knowledge.
> 
> http://www.anandtech.com/bench/product/192?vs=1491
> 
> Again, Nehalem clocked 300MHz lower and scoring 23% faster in single thread. If AnandTech isn't reputable then I don't know what to tell you.
> 
> Steamroller is hardly faster than the Core2 architecture in terms of IPC.


Can you tell me what exactly affects the IPC of a CPU?
- Integer performance?
- Speed and size of L1/L2/L3?
- FPU performance?
- Instructions?

If you think an i5 Haswell is 2x faster than an FX 4300 in something like BF4 or LoL or any other normal game, then you are wrong.
I can tell you that an FX 4300 at 5 GHz will do extremely well against an i5 4460 at 3.2/3.4 GHz; it may be slower, match it, or even beat it.

Seriously, I can see that an FX 6300 on an AMD card does pretty well against an Intel chip on an AMD card. You might be surprised...


----------



## iSlayer

AMD anything, higher IPC than Nehalem.

Ahahhahahahahaa I wish. They might not be one foot in the grave if that were the case.

An FX 4300 matching a stock i5 with a 5 GHz overclock. Yeah, alright there, Themisseble. I'm sure you have some bogus or cherry-picked benches to back up that claim.

@2010rig if you need a laugh, the last page or three has ya covered.


----------



## SpeedyVT

Quote:


> Originally Posted by *Themisseble*
> 
> Can you tell me what exactly affects the IPC of a CPU?
> - Integer performance?
> - Speed and size of L1/L2/L3?
> - FPU performance?
> - Instructions?
> 
> If you think an i5 Haswell is 2x faster than an FX 4300 in something like BF4 or LoL or any other normal game, then you are wrong.
> I can tell you that an FX 4300 at 5 GHz will do extremely well against an i5 4460 at 3.2/3.4 GHz; it may be slower, match it, or even beat it.
> 
> Seriously, I can see that an FX 6300 on an AMD card does pretty well against an Intel chip on an AMD card. You might be surprised...


Quote:


> Originally Posted by *iSlayer*
> 
> AMD anything, higher IPC than Nehalem.
> 
> Ahahhahahahahaa I wish. They might not be one foot in the grave if that were the case.
> 
> FX 4300 matching a stock i5 with a 5GHz overclock. Yah, alright there themissable. I'm sure you have some bogus or cherrypicked benches to back up that claim.
> 
> @2010rig if you need to laugh the page or three has ya covered.


What it comes down to is that in a real-world environment the synthetic benchmarks often mean nothing. iSlayer is right, but Kaveri has better context switching than something as old as Nehalem, which gives it a snappier feel and quicker response. It can also run more background threads due to cache size. Saying Kaveri is weaker is just as wrong as saying it's stronger; I would never call them equals, however.


----------



## Themisseble

Quote:


> Originally Posted by *iSlayer*
> 
> AMD anything, higher IPC than Nehalem.
> 
> Ahahhahahahahaa I wish. They might not be one foot in the grave if that were the case.
> 
> FX 4300 matching a stock i5 with a 5GHz overclock. Yah, alright there themissable. I'm sure you have some bogus or cherrypicked benches to back up that claim.
> 
> @2010rig if you need to laugh the page or three has ya covered.


That depends on the game, but I can tell you that; I tested it myself.
I did a few benchmarks in BF4 where the i5 at 3.4 GHz did a lot better than the FX 4300 at 5.0 GHz. Not a big deal: while the FX was hitting 125-130 FPS, the i5 did 150-160 FPS. In a MOBA like LoL, the FX at 5.0 GHz was faster... it depends on the game, of course. Some games need more FPU, some don't.

iSlayer, you can laugh, but you can't prove otherwise. Whenever a game was using 4 cores the FX was losing... modules cripple AMD's performance.


----------



## Bytales

I just ordered a Fury X and a 2280 Samsung PCI Express 3.0 x4 SSD. Next weekend I'm going to install Windows 10 and do some DX12 API tests with the Futuremark benchmark. Has anyone done it yet, to see what kind of numbers the Fury X scores?


----------



## provost

Quote:


> Originally Posted by *Bytales*
> 
> I just ordered a Fury X, and a 2280 Samsung pci xpress 3.0 4x SSD, in the next weekend im going to install windows 10 and im going to do some API test in dx 12 with the futuremark benchmark, has anyone done it yet to see what kinds of number the fury x scores ?


Yeah, that would be good to find out. Hope you share your findings.

Every time I come to these news threads, all I see are advocates of one side making the same old arguments against whatever product they aren't promoting, while advocates of the other side vehemently defend whatever product they are. One gigantic promotional and counter-promotional thread after another, with little real-world user information that isn't polluted by obviously biased trigger words and terms. It's almost down to a formula at this point.

Would love to see a sanitized, informational thread anywhere on the web that could save consumers time (and money) without the sales pitch and counter-pitch over and over again.


----------



## magnek

Believe it or not OCN is actually quite tame compared to most other sites, where some of the users I truly would describe as being "rabid".


----------



## Thoth420

Quote:


> Originally Posted by *magnek*
> 
> Believe it or not OCN is actually quite tame compared to most other sites, where some of the users I truly would describe as being "rabid".


I agree 100%. I'm a novice builder and usually only OC my GPU, yet this is my primary forum because the community is vast, knowledgeable, and mostly very friendly and helpful.


----------



## Liranan

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Read this article earlier today on Techspot. Its comparing ten years of Intel CPU's but they have a 8350 and an A10 7870k thrown in there for comparison. Synthetics, applications, encoding and gaming tested. Go ahead fellas.
> 
> http://www.techspot.com/article/1039-ten-years-intel-cpu-compared/page2.html
> 
> Meanwhile, back closer to topic: I'm waiting for price drops on Fury (X)'s or 980's, if it takes too long I think my 780 can hold me out until Pascal. I've had an itchy trigger finger for to long now.


Thank you, my trusted 8320 will last me another few years at least.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Liranan*
> 
> Thank you, my trusted 8320 will last me another few years at least.


Yeah, you should be fine for a couple of years at least. I just moved off Sandy, and it wasn't for performance gains, just a part failure and wanting something to do.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Liranan*
> 
> Thank you, my trusted 8320 will last me another few years at least.


No problem, just happened to read it earlier yesterday and come across that conversation later on.

I also came across this which I hadn't seen before yesterday.

http://www.techspot.com/article/942-five-generations-amd-radeon-graphics-compared/page9.html

Interesting to see how, up until the Furys, AMD cards had actually been catching up with Nvidia in performance generation over generation, at least at stock clocks. The original 7970 came first and wasn't far off the 680 at its release, even with its crazy-low stock clocks.

A step too far for GCN, I think, unfortunately. Before anyone jumps on me: the Furys aren't bad, but it's been said here many times in different ways: when you release after the competition at the same price, you cannot underperform it and expect great success.

I still might buy a Fury, though, if I can find one eventually.


----------



## Bytales

That means my dual 12-core CPUs, which have gone up 500 euros each since I bought them, will last me quite a few more years.
I felt it in my bones that having so many cores would someday pay for itself; now we have Windows 10 with DX12, which is said to use every core available. That probably means my dual E5-2690 v3s will be useful somehow.

1) We need to test this to see if there are any differences. So we need users with Windows 10 and an R9 Fury X to check whether there is a palpable difference (FPS, or draw calls) going from 2, 4, 8, 16 and more threads. (For instance, my rig has 12x2x2 = 48 threads.)

2) The second thing that needs testing, and for that I need another R9 Fury X: I've heard that in CrossFire this card supposedly uses less video memory than a single card does. Coupled with the amazing CrossFire scaling, that could make the R9 Fury X worth purchasing in pairs. If true, it might mitigate the Fury X's low-memory issue, since 4 GB is low compared to 6 GB on the 980 Ti, 8 GB on the 390X and 12 GB on the Titan X.


----------



## Offler

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Read this article earlier today on Techspot. Its comparing ten years of Intel CPU's but they have a 8350 and an A10 7870k thrown in there for comparison. Synthetics, applications, encoding and gaming tested. Go ahead fellas.
> 
> http://www.techspot.com/article/1039-ten-years-intel-cpu-compared/page2.html
> 
> Meanwhile, back closer to topic: I'm waiting for price drops on Fury (X)'s or 980's, if it takes too long I think my 780 can hold me out until Pascal. I've had an itchy trigger finger for to long now.


Nice comparison. I would like to see the Phenoms, which were being sold at that time, and I would like to see an OLD benchmark (without AVX).

On topic:
Any scores of FuryX from normal users?


----------



## 033Y5

Quote:


> Originally Posted by *blue1512*
> 
> On the topic of HBM, the fact is AMD only locked it in CCC. It can be overclocked after all.
> 
> 
> 
> And guys at Guru3D are overvolting the core like hell.
> So far so good for "overclocker's dream"


Quote:


> Originally Posted by *blue1512*
> 
> Another HBM overclock
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


Why is there a difference in ROPs, pixel fillrate and bandwidth? Are they both Fury Xs, or is one a Fury and one a Fury X?


----------



## Liranan

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yea, you should be fine a couple of years at least. I just moved from Sandy and it wasn't for performance gains, just a part failure and wanting something to do.


Even if Zen is a massive leap from the BD uArch to SB/IB-level IPC or IPS, that jump is still not enough to warrant an upgrade. The only problem I have with my CPU is power consumption: it uses truly massive amounts of wattage at stock and even more when OC'd. There is no way I believe that a CPU that uses 160-180 W at stock uses only 220 W when OC'd by a gigahertz.

Late at night I sit in the dark rubbing my precioussss.


----------



## iinversion

Quote:


> Originally Posted by *Liranan*
> 
> Even if Zen is a massive leap from the BD uArch to SB/IB IPC or IPS that jump is still not enough to warrant an upgrade. The only problem I have with my CPU is power consumption. It uses truly massive amounts of wattage at stock and even more when OC'd. There is no way I believe that a CPU that uses 160-180W at stock uses only 220W OC'd by a gigahertz.
> 
> Late at night I sit in the dark rubbing my precioussss.


I have seen both FX 81xx and 83xx chips pull 400-500 W by themselves, after PSU efficiency calculations, when OC'd to 5 GHz or higher.

It is truly ridiculous.


----------



## Themisseble

Any CPU can use more than 400 W if you push it that far... but you will need good cooling. Basically, I run 4.5 GHz and use less power than stock; it depends on how good your chip is and where the sweet spot for your CPU/GPU lies.

Let's say:
3.5 GHz = 100 W
4.4 GHz = 125 W
4.5 GHz = 175 W
4.6 GHz = 250 W
... silicon...

So comparing OC power consumption across different silicon is just pointless.


----------



## iSlayer

Not literally, Themisseble, just close enough to hamper your point.

Pushing, say, a 4790K to 400 W would require LN2; pushing an FX 9590 to 400 W can probably be done on air, since it does 300+ at stock.

You just have to adjust for what is realistically possible on air/water/LN2.


----------



## KarathKasun

The FX-9590 can barely be cooled with ~$100 air coolers, and it's ~250 W at stock. You have to measure at the CPU socket to find its power draw alone; you are looking at total system power to get 300-400 W (actually ~330 W at the wall in reviews). The motherboard uses power, RAM uses power, the GPU uses some...

You would be hard pressed to air-cool 400 W without screaming 250 CFM fans.

Also, watts are watts; there are no Intel- or AMD-specific watts outside of TDP specs. A highly overclocked, air-cooled i7 can easily push the power envelope into the high 100s or low 200s, which happens to be a bit under the FX-9590. That also seems to be around the upper bound of what air cooling can handle (at reasonable noise levels) for a single chip.


----------



## Themisseble

Quote:


> Originally Posted by *iSlayer*
> 
> Not literally Themisseble, just close to enough to hamper your point.
> 
> Pushing say, a 4790k to use 400w of power would require LN2. Pushing an FX 9590 to 400 watts can probably be done on air since it does 300+ at stock.
> 
> You just have to adjust for what is realistically possible on air/water/LN2.


Not entirely true. An i7 4790K with an OC can push 300 W+ air-cooled.
CMT is more efficient than SMT, we all know that. But there's more to it than that, and we cannot compare 32 nm vs 22 nm; even 32 nm Sandy is better than 32 nm Piledriver.
You can see that an FX 4300 at 4.8 GHz will use just a little more power than stock.

I can push an FX 6300 to 1.55 V with a $30-40 air cooler.
Check this out. No PhysX needed:
http://www.ign.com/videos/2015/08/06/17-minutes-of-explosive-crackdown-3-gameplay-gamescom-2015


----------



## Slink3Slyde

Quote:


> Originally Posted by *Themisseble*
> 
> Not entirely true. An i7 4790K with an OC can push 300 W+ air-cooled.
> CMT is more efficient than SMT, we all know that. But there's more to it than that, and we cannot compare 32 nm vs 22 nm; even 32 nm Sandy is better than 32 nm Piledriver.
> You can see that an FX 4300 at 4.8 GHz will use just a little more power than stock.
> 
> I can push an FX 6300 to 1.55 V with a $30-40 air cooler.
> Check this out. No PhysX needed:
> http://www.ign.com/videos/2015/08/06/17-minutes-of-explosive-crackdown-3-gameplay-gamescom-2015


Power usage is related to cooling? That's a new one on me if so. I thought P = VI... I could be wrong; my physics knowledge is limited to a college course in electronics 15 years ago that I haven't really used since.









However, here's a 4790K at a very unreasonable 1.45 volts @ 4.8 GHz on Anand using 225 watts, which makes me question that statement.

http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/2


----------



## Forceman

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Power usage is related to cooling? That's a new one on me if so. I thought P=VI.. I could be wrong, my physics knowledge is limited to a college course in electronics 15 years ago I haven't really used since.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, here's a 4790k at a very unreasonable 1.45 volts @ 4.8 ghz on Anand using 225 watts that makes me question that statement.
> 
> http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/2


Higher temps mean higher resistance, which means more current draw, which means more power use.


----------



## Themisseble

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Power usage is related to cooling? That's a new one on me if so. I thought P=VI.. I could be wrong, my physics knowledge is limited to a college course in electronics 15 years ago I haven't really used since.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, here's a 4790k at a very unreasonable 1.45 volts @ 4.8 ghz on Anand using 225 watts that makes me question that statement.
> 
> http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/2


Yeah, stock = 4.6 GHz... c'mon.


----------



## Tojara

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Power usage is related to cooling? That's a new one on me if so. I thought P=VI.. I could be wrong, my physics knowledge is limited to a college course in electronics 15 years ago I haven't really used since.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, here's a 4790k at a very unreasonable 1.45 volts @ 4.8 ghz on Anand using 225 watts that makes me question that statement.
> 
> http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/2


The power usage of a processor scales reasonably linearly with frequency multiplied by the square of voltage. You can't isolate the processor, so it isn't perfectly linear, but it usually falls within a 10% margin of error from stock to max OC on 24/7 cooling solutions, which is accurate enough for most purposes. And yes, power usage is related to cooling: in practice, everything you put in as electricity comes out as heat. Most modern processors can exceed TDP even when simply coming off idle or a lower load, because the chip is not yet at its steady-state operating temperature, which takes some time to catch up. Peak power will almost certainly exceed TDP, but that only really matters for the power supply; how the processor's heat output varies over small fractions of a second doesn't matter for cooling, given the time scales involved and the heatsink's ability to store heat.
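That scaling rule can be turned into a tiny estimator. This is only a sketch under the stated proportionality (P ~ C x V^2 x f): the effective capacitance C is unknown, so it can only compute power *relative* to a known stock measurement, and the sample chip numbers below are hypothetical.

```python
# Dynamic CPU power scales roughly with frequency times voltage squared.
# The unknown constant cancels out when scaling from a measured stock figure.

def scaled_power(stock_watts: float, stock_v: float, stock_ghz: float,
                 oc_v: float, oc_ghz: float) -> float:
    """Estimate overclocked power draw from a stock measurement."""
    return stock_watts * (oc_v / stock_v) ** 2 * (oc_ghz / stock_ghz)

# Hypothetical chip: 90 W at 1.10 V / 4.0 GHz, pushed to 1.30 V / 4.6 GHz.
print(f"{scaled_power(90, 1.10, 4.0, 1.30, 4.6):.0f} W")  # ~145 W
```

Note how the voltage term dominates: the 15% frequency bump alone would add ~15%, but the voltage increase gets squared, which is why heavily overvolted chips blow so far past their TDP. Leakage (ignored here) makes the real numbers worse.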


----------



## Slink3Slyde

Quote:


> Originally Posted by *Forceman*
> 
> Higher temps means higher resistance, which means more current draw, which means more power use.


Interesting. I found a forum post on Anand where someone tested a 2600K and found a difference of about 25 watts in power consumption between ~65°C and 98°C @ 3 GHz. I guess that difference would increase at higher multipliers and higher temps, but I can't see it pushing a quad-core Intel chip anywhere near an FX 9590 at any sort of 24/7 voltage?
Quote:


> Originally Posted by *Themisseble*
> 
> Yeah stock = 4.6GHz .. cmon.


Intel doesn't have a 'stock' voltage, it has Auto. Auto voltage is way higher than any chip needs at stock: my 3570K takes 1.3 volts at 3.6 GHz across all cores at 'stock' with XMP enabled, while I run 24/7 at 1.328 volts, 4.4 GHz under load.

Devil's Canyon also turbos one core up to 4.4 GHz, or 4.2 GHz across all cores, so 4.6 isn't a huge jump. Take a look at the voltages people are using in the owners' club for reference.

http://www.overclock.net/t/1490324/the-intel-devils-canyon-owners-club
Quote:


> Originally Posted by *Tojara*
> 
> The power usage on processors scales reasonably linearly with the frequency multiplied by the square of voltage. You can't isolate the processor so it isn't linear, without other components it usually falls within 10% margin of error from stock to max OC on 24/7 cooling solutions which is accurate enough for most purposes. And yes, power usage is related to cooling, in practice everything you insert as electricity comes out as heat. Most modern processors can however exceed TDP even if simply coming from idle/lower load simply due to the fact that they are not near the operating temperature range which takes a some time to catch up. Peak power will almost certainly be over TDP, but that only really matters for the power supply. How the processor outputs a various amount heat over small fractions of a second doesn't matter for cooling due to the length of the time and possibility of storing heat.


I'm a bit dozy, but a little confused by your post. If the square of the voltage multiplied by the frequency gave you power usage directly, we'd be using megamegawatts on our CPUs; I assume you mean power is *proportional* to that?

I do understand that heat is related to electricity, and also that TDP doesn't equal actual power usage. To be fair, as I said, I'm no physicist. I don't mean to revel in my ignorance, but there's a bit of apathy on my part at the moment about putting in the extra reading to understand what you're saying further. No offence; I did try for a while.


----------



## AmericanLoco

Quote:


> Originally Posted by *Forceman*
> 
> Higher temps means higher resistance, which means more current draw, which means more power use.


Just a small correction: higher temperatures generally result in higher resistance, which would _reduce_ power consumption, since less current flows. What higher temperatures actually increase is _leakage_ in semiconductors. Leakage is current tunneling through the insulating layers inside a chip, and it gets worse the hotter the chip gets.


----------



## KarathKasun

Quote:


> Originally Posted by *AmericanLoco*
> 
> Just a small correction, higher temperatures generally result in higher resistance which would _reduce_ power consumption, since less current is flowing. Higher temperatures increase _leakage_ in semiconductors. Leakage is current tunneling through the insulating layers inside a chip, this gets worse the hotter the chip gets.


Just... no on the first part.

Higher resistance at an equal voltage requires more amperage to operate, and therefore more power, more of which gets dumped as heat. Power usage only drops if you maintain a constant amperage, in which case voltage drops off quickly. This is why you end up with thermal runaway if cooling fails and there are no safeguards.

Leakage is another factor altogether: leakage current flows through much higher-resistance material, and thus raises temperatures even more than simple resistive heating would.
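The feedback loop being described (heat raises leakage, leakage adds heat) can be sketched as a toy fixed-point iteration. All the constants below are made up for illustration; this is not a real thermal model of any chip.

```python
# Toy thermal-feedback model: leakage power grows with die temperature,
# the extra power raises temperature further, and the loop either settles
# at an equilibrium or spirals upward depending on how good the cooler is.

def settle(dynamic_w, leak_ref_w, cooler_c_per_w,
           ambient_c=25.0, leak_growth=0.02, steps=200):
    """Iterate temperature and leakage until they reach equilibrium."""
    temp = ambient_c
    for _ in range(steps):
        leakage = leak_ref_w * (1 + leak_growth * (temp - ambient_c))
        temp = ambient_c + cooler_c_per_w * (dynamic_w + leakage)
    return temp

print(f"{settle(100, 20, 0.4):.0f} C")  # decent cooler: settles around 82 C
print(f"{settle(100, 20, 0.8):.0f} C")  # weak cooler: spirals far hotter
```

With an even weaker cooler (or faster leakage growth) the iteration has no fixed point and the temperature diverges, which is exactly the thermal-runaway condition the safeguards are there to catch.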


----------



## Forceman

Quote:


> Originally Posted by *AmericanLoco*
> 
> Just a small correction, higher temperatures generally result in higher resistance which would _reduce_ power consumption, since less current is flowing.


For simplicity I left out a few steps, but it's what Karath said: the higher resistance causes the voltage to droop, and the voltage regulator then increases the current to bring the voltage back to where it's supposed to be. That's where the increased current flow comes from; otherwise you'd just crash from the voltage reduction.


----------



## iSlayer

http://www.anandtech.com/show/8426/the-intel-haswell-e-cpu-review-core-i7-5960x-i7-5930k-i7-5820k-tested/3

Haswell-E can go up to 400 watts... but you'd need to be on water and pushing more volts than is recommended for 24/7 use.


----------

